Is there any way to define Integer constants? Numeric constants all seem to show up as Double, and I don’t want the compiler to waste cycles performing a type conversion every time I compare them to integers or use them in integer calculations.
Seems like a very basic feature that I’m obviously overlooking…
You are making an assumption… and I’m about to make another one.
Perhaps the compiler is smarter than you think, and constants are inlined as static entities, taking on the appropriate type based on the scope and context in which they are used?
I just replaced hundreds of INTEGER variables with DOUBLES (due to other issues) and noticed no slowdown at all.
Are you saying that the compiler uses the appropriate numeric type depending on how the constant is used in code? Are you certain about this? If so, that would be impressive, and it would imply that I can force a type conversion with a constant in an algebraic expression at no speed penalty whatsoever?
Yeah, assumptions always come back to bite me. I know that CPUs are considerably faster than they used to be, but I’d like to know exactly what is happening behind the scenes, since I rely on correct type casting in many of my calculations. Hopefully someone can confirm this, one way or the other.