Segmentation fault with lots of memory

Hi All,

My counting skills seem to have gone awry today and I wonder if someone can point out my simple error?

The minimal application below is supposed to create an array with a total size of around 5 GB (46,000,001 × 13 × 8 bytes ≈ 4.8 GB) and set one element. I compile it for Linux x86_64 as a console app using Xojo 2016 Release 4:

	dim ps(46000000,12) as integer
	print "trying"
	ps(15129213,1)=2
	print "success"

When I try to run this on either of my CentOS 7 servers, which have roughly 120 GB and 400 GB of free memory respectively, I get segmentation faults. Am I badly underestimating the size of the array here, or is there another problem? Thanks for your thoughts!

first server:

[aplatts@ 1]$ free -g
              total        used        free      shared  buff/cache   available
Mem:            125           1         124           0           0         124
Swap:             3           0           3
[aplatts@ 1]$ ./test
trying
Segmentation fault (core dumped)
[aplatts@ 1]$

Second server:

[aplatts@ ~]$ chmod u+x test
[aplatts@ ~]$ free -g
              total        used        free      shared  buff/cache   available
Mem:           1511        1022         444           0          43         470
Swap:             3           0           3
[aplatts@ ~]$ ./test
trying
Segmentation fault (core dumped)
[aplatts@ ~]$

Does Xojo really support arrays larger than 2 GB?
I wonder whether all of the array access code has been tested and updated for 64-bit.

Ah … I’d kind of assumed the 64-bit transition would have included arrays. But maybe not. Perhaps I’d be better off with MemoryBlocks (if those have been tested)?
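For what it’s worth, a MemoryBlock version of the same test might look like the sketch below. Whether the 64-bit MemoryBlock constructor actually accepts a size above 4 GB is exactly the untested part, so treat this as a guess, not a confirmed workaround:

```xojo
' Untested sketch: one MemoryBlock instead of a 2-D array,
' addressed manually with row-major byte offsets.
Const kCols = 13
Dim mb As New MemoryBlock(46000001 * kCols * 8)   ' 4,784,000,104 bytes, if the 64-bit constructor allows it
Print "trying"
mb.Int64Value((15129213 * kCols + 1) * 8) = 2     ' row 15129213, column 1
Print "success"
```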

It doesn’t yet, but I would have expected it to raise exceptions instead of crashing outright.

I think Xojo updated the x86_64 guideline page yesterday to clarify that array sizes >2GB are not yet supported, which is helpful. In the meantime I’ll chunk the data and wait for large array support (I don’t know if there’s a planned release yet).
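In case it helps anyone else, here is the kind of chunking I have in mind, sketched with MemoryBlocks so each chunk stays well under the 2 GB limit. All the names here (kRowsPerChunk, Chunks, SetCell, GetCell) are mine, not anything from Xojo, and Chunks would live as a module-level property in a real console app:

```xojo
Const kCols = 13
Const kRowsPerChunk = 10000000   ' 10,000,000 × 13 × 8 = 1,040,000,000 bytes per chunk

Dim Chunks() As MemoryBlock      ' module-level in practice

' Allocate enough chunks to cover all 46,000,001 rows (five chunks, the last one partial).
Dim remaining As Integer = 46000001
While remaining > 0
  Dim rows As Integer = remaining
  If rows > kRowsPerChunk Then rows = kRowsPerChunk
  Chunks.Append(New MemoryBlock(rows * kCols * 8))
  remaining = remaining - rows
Wend

' Accessors map a global (row, col) onto the right chunk and local offset.
Sub SetCell(row As Integer, col As Integer, value As Integer)
  Dim offset As Integer = ((row Mod kRowsPerChunk) * kCols + col) * 8
  Chunks(row \ kRowsPerChunk).Int64Value(offset) = value
End Sub

Function GetCell(row As Integer, col As Integer) As Integer
  Dim offset As Integer = ((row Mod kRowsPerChunk) * kCols + col) * 8
  Return Chunks(row \ kRowsPerChunk).Int64Value(offset)
End Function
```

The original failing write would then become SetCell(15129213, 1, 2). It costs one integer division and a modulo per access, but nothing in it depends on the untested large-array path.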

I just wrote three bug reports:
feedback://showreport?report_id=46221: bad_alloc crashes with low memory
feedback://showreport?report_id=46222: redim with big array causes OutOfBoundsException
feedback://showreport?report_id=46223: Creating big array can lead to small array