const vs define:
This guideline might better be called "prefer the compiler to the preprocessor," because #define is often treated as if it were not part of the language proper. That's one of its problems. When you write something like
#define RATIO 1.653
the symbolic name RATIO may never be seen by the compiler; it may be removed by the preprocessor before the source code ever reaches the compiler. As a result, the name RATIO may never be entered into the symbol table. This can be confusing if you get an error during compilation involving the use of the constant, because the error message may refer to 1.653, not RATIO. If RATIO were defined in a header file you didn't write, you'd have no idea where that 1.653 came from, and you'd probably waste time tracking it down. The same problem can crop up in a symbolic debugger, because, again, the name you're programming with may not be in the symbol table.
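A minimal sketch of the problem (the function name scale is hypothetical, chosen just for illustration): by the time the compiler runs, every occurrence of RATIO has already been textually replaced by 1.653.

```cpp
// Preprocessor macro: a textual substitution, not a language-level name.
#define RATIO 1.653

double scale(double x) {
    // The compiler sees "return x * 1.653;" -- the name RATIO is gone,
    // so any diagnostic on this line would mention 1.653, not RATIO.
    return x * RATIO;
}
```

Because the substitution happens before compilation, nothing named RATIO exists for the compiler or debugger to report.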
The solution to this sorry scenario is simple and succinct. Instead of using a preprocessor macro, define a constant:
const double RATIO = 1.653;
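Used in the same hypothetical scale function as before, the difference is that RATIO is now a real, typed object whose name the compiler sees and can enter into the symbol table:

```cpp
// A language-level constant: it has a type, obeys scope rules, and its
// name survives into the symbol table for error messages and debuggers.
const double RATIO = 1.653;

double scale(double x) {
    // A diagnostic on this line would now refer to RATIO, not 1.653.
    return x * RATIO;
}
```

Unlike the macro, this constant can also be scoped inside a class or namespace rather than applying to the whole translation unit.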