Integer constants?

Is there any way to define Integer constants? Numeric constants all seem to show as Double, and I don’t want the compiler to waste cycles performing a type conversion every time I compare them to integers or use them in integer calculations.

Seems like a very basic feature that I’m obviously overlooking…
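To make the worry concrete, here is a hedged analogy in Python (the thread is not about Python, and the constant name below is hypothetical): a constant spelled with a decimal point takes a floating-point type, so comparing it against an integer involves a numeric conversion somewhere.

```python
# Hypothetical constant: the ".0" spelling makes it a float, not an int.
LIMIT = 100.0
i = 42  # an integer value

print(type(LIMIT).__name__)  # float
print(i < LIMIT)             # True, via int -> float coercion
```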

You are making an assumption… and I’m about to make another one.
Perhaps the compiler is smarter than you think, and constants are inlined as static entities, so they actually take on the appropriate type based on scope and context?
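Whether the compiler under discussion does this is the open question here, but the general idea of resolving constant expressions at compile time is easy to demonstrate. A sketch using CPython, which folds constant arithmetic into a single literal when it compiles a statement (purely an analogy, not a claim about the compiler in this thread):

```python
# CPython's compiler folds 2 * 3.5 into the single constant 7.0,
# so no multiplication or conversion happens at run time.
code = compile("x = 2 * 3.5", "<demo>", "exec")
print(code.co_consts)  # the folded constant 7.0 appears here
```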

I just replaced hundreds of Integer variables with Doubles (due to other issues) and noticed no slowdown at all.

Are you saying that the compiler uses the appropriate numeric type depending on how the constant is used in code? Are you certain about this? If so, that would be impressive, and it would imply that I can force a type conversion with a constant in an algebraic expression at absolutely no speed penalty whatsoever?

Can one of the compiler engineers verify this?

Yeah, assumptions always come back to bite me. I know that CPUs are considerably faster than they used to be, but I’d like to know exactly what is happening behind the scenes, since I actually rely on the correct type casting in many of my calculations. Hopefully someone can confirm this, one way or another.

There are many, many worse inefficiencies to worry about. Integer vs. Double should be the least of your worries.

It uses the correct type based on the literal entered:
If you enter 1.0, it uses a Double.
If you enter 1, it uses an Integer.

If you need to be sure, put the value in a property or local variable of the right type, and then you can be sure.
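The same two points hold in many languages; here is a Python sketch of the advice above (again an analogy, since the thread’s language is not Python): the literal’s spelling determines its type, and an explicitly typed local removes any guessing.

```python
# The literal's spelling determines its type:
print(type(1).__name__)    # int
print(type(1.0).__name__)  # float

# Pinning the type with an explicitly annotated local variable:
count: int = 1
print(type(count).__name__)  # int
```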