Arrays.Count returns an Int32, not an Integer

I’m moving my app from Xojo 18R3 to 20R2, and after a lot of debugging I discovered that Arrays.Count actually returns an Int32, not the Integer the docs claim. Because of this, methods that Extend Integer no longer work.
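For anyone else hitting this, here’s a minimal sketch of the kind of code that breaks (Twice is just a made-up example method, not part of my app):

```xojo
// Hypothetical extension method defined in a module
Function Twice(Extends value As Integer) As Integer
  Return value * 2
End Function

// In 2018r3 this compiled fine; in 2020r2 the call no longer matches,
// because names.Count now returns Int32 while the Extends signature
// expects Integer (Int64 on a 64-bit build)
Var names() As String = Array("a", "b", "c")
Var doubled As Integer = names.Count.Twice
```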

Integers are supposed to be Int64 on 64-bit platforms, right?

And the data type returned by Arrays.Count should match the signature of Extends i As Integer, right?

Or are my Xojo skills too rusty?

Please file a feedback case, so they can change it.

I think it’s as designed, because arrays currently have a maximum of 2,147,483,646 entries, and the Int32 data type’s highest value is 2,147,483,647.

It would be even better if this were a UInteger, and the same for the index size. Currently they allow negative values, which seems pretty useless.

Var myArr(-1000) As String // this is useless; it would be better with unsigned sizing
Var count As UInteger = myArr.Count // <-- this would be great

Please note that -1 is a valid ubound for an empty array!
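Which is exactly why unsigned sizing would be awkward. For example (a quick sketch using the API 2.0 names):

```xojo
Var empty() As String // an empty array
// empty.LastIndex is -1 and empty.Count is 0.
// An unsigned index type could not represent LastIndex = -1,
// so the usual iteration idiom would break:
For i As Integer = 0 To empty.LastIndex
  // never entered for an empty array, precisely because -1 < 0
Next
```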


Ah yes, true, I forgot that one.

Have a look at <>

Wayne, thanks for that related feedback case. I’ve requested that it be reopened. Regardless of the internal implementation details, the data type of the actual returned value needs to be consistent, even if the value needs to be converted.

If the documentation says the function returns an Integer, then on a 64-bit platform that function needs to return a 64-bit value, regardless of the internal implementation. This is especially important since Xojo does strict type checking when matching overloaded function signatures.

If Xojo wants to return an Int32, then go ahead and document it that way. But note that returning Int32 breaks code from the 2018r3 era.


I’ve filed a bug report for the specific case of Arrays.Count returning Int32 instead of Integer.
Since this breaks existing code I agree with Wayne and Norman that this issue should be revisited.


Report 62876 also got closed “As Designed”.

So we’re left with the situation where the .Count function returns an INTEGER but still does NOT match the signature of an overloaded function that requires an INTEGER.

This seems like a fundamental language consistency issue to me. How can type matching be left this way? Especially since it breaks previously working code.
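In the meantime, one workaround (just a sketch, and it assumes assignment still performs implicit widening from Int32 to Int64) is to pull the count into an Integer variable before calling the extension method:

```xojo
// Hypothetical extension method
Sub LogValue(Extends n As Integer)
  System.DebugLog(Str(n))
End Sub

Var items() As String = Array("x", "y")

// items.Count.LogValue fails to match the Extends Integer signature,
// but assigning to an Integer first widens the Int32 implicitly:
Var n As Integer = items.Count
n.LogValue
```

Ugly, but it keeps the Extends methods usable until the type mismatch is resolved.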

I agree with you: “Integer” should always mean the Int64 type, not Int32, even though its value will never exceed about 2e9. It’s a matter of typing, not magnitude.



And another 32-bit issue :sleepy: