SQLiteDatabase.dylib hard crash on SELECT *

Xojo 2018r4, MacBook Pro (2015)

I’ve got a DB with about 14,6 million records in one table (corrected from the 88,5 million I originally posted). The following line produces a hard crash:

dim rs As RecordSet = db.SQLSelect("SELECT * FROM myTableName;")

Is this to be expected? The crash report always looks the same:

[code]Crashed Thread: 0 Dispatch queue: com.apple.main-thread

Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x00000001688a9000
Exception Note: EXC_CORPSE_NOTIFY

VM Regions Near 0x1688a9000:
MALLOC_LARGE_REUSABLE  0000000145c73000-00000001688a9000 [556.2M] rw-/rwx SM=PRV
-->
STACK GUARD            0000700000083000-0000700000084000 [   4K] ---/rwx SM=NUL  stack guard for thread 1

Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0  SQLiteDatabase.dylib   0x0000000110831a35  0x110829000 + 35381
1  SQLiteDatabase.dylib   0x0000000110832899  0x110829000 + 39065
2  SQLiteDatabase.dylib   0x00000001108330e5  0x110829000 + 41189
3  SQLiteDatabase.dylib   0x000000011082db94  0x110829000 + 19348
4  com.myCompany.myApp    0x000000010bb214e3  SQLiteDatabase.SQLSelect%o%os + 83
5  com.myCompany.myApp    0x000000010c270c11  myAppDatabase.SQLSelect_do%o%os + 81
6  com.myCompany.myApp    0x000000010c454c7c  DatabaseWindow.DatabaseWindow.DB_LoadCurrentTable%%o<DatabaseWindow.DatabaseWindow> + 220[/code]

I’ve looked at the SQLite Limits documentation and I don’t see anything about a limit on the number of records that can be returned. I’ve set the cache size to 16GB with the following line when creating the DB.

me.SQLExecute "PRAGMA cache_size = 16000000"

Try it without the cache statement to start.

It happens both with and without the cache pragma. Forgot to mention that I added the cache statement thinking it might solve the problem.

So looking at the docs, cache_size with a positive number is the maximum number of “pages”, whereas it looks like a negative number is used to specify the amount of memory. If you’re trying for 16GB (make sure you have that much free) you should be using -16000000.

Also, 16000000 is 16MB, not 16GB.

[quote=427369:@Greg O’Lone]So looking at the docs, cache_size with a positive number is the maximum number of “pages”, whereas it looks like a negative number is used to specify the amount of memory. If you’re trying for 16GB (make sure you have that much free) you should be using -16000000.

Also, 16000000 is 16MB, not 16GB.[/quote]

Yikes!

Okay, thanks. …

EDIT: One of us has read the documentation wrong:

If the argument N is negative, then the number of cache pages is adjusted to use approximately abs(N*1024) bytes of memory.

… so in this case 16000000 means 1024 x 16000000 = 16GB, right?

oh and yeah, with a 2015 Macbook Pro, don’t set the cache to 16GB. That’s the amount of RAM in the whole machine, but you will always have other apps running that take RAM. For instance, on my mostly idle machine, I’ve got 8GB free. I suggest trying something smaller.

https://sqlite.org/pragma.html#pragma_cache_size
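
Just as a concrete starting point, something like this is what I’d try first — a negative value (so it’s specified as memory rather than pages), sized well below whatever RAM is actually free on that machine. Untested sketch; tune the number:

[code]' Negative N = use roughly abs(N) KiB of page cache
' -2000000 is about 2 GB, which leaves plenty of headroom on a 16 GB machine
db.SQLExecute("PRAGMA cache_size = -2000000")[/code]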

Thanks, I know that. I’ve selected 16GB with the knowledge that it could freeze the machine. We need as much speed as possible and I haven’t read anything which convinces me that changing the cache size to a smaller value will help.

I believe the corrected (see above) cache line should be:

me.SQLExecute "PRAGMA cache_size = -16000000"

But this doesn’t solve the problem.
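
Side note: one way to confirm the pragma is actually being applied on the open connection is to read it back. Quick untested sketch, with me being the SQLiteDatabase subclass as above:

[code]me.SQLExecute "PRAGMA cache_size = -16000000"

' PRAGMA cache_size with no value returns the current setting as a one-row result
Dim prs As RecordSet = me.SQLSelect("PRAGMA cache_size")
If prs <> Nil And Not prs.EOF Then
  System.DebugLog("cache_size is now " + prs.IdxField(1).StringValue)
End If[/code]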

I was misinformed about the table size and have corrected it in the OP. It’s crashing on loading 14,6 million records, not 88,5 million.

…or crash the app when it runs out of RAM. You’re telling SQLite that this much memory is available for use and it probably doesn’t check on its own.

This line in your crash log seems to indicate that it crashed while trying to allocate 556.2MB.

MALLOC_LARGE_REUSABLE  0000000145c73000-00000001688a9000 [556.2M]

Considering the cache size defaults to 2K, I still think you should try something in between.

You can easily do a binary search to find the optimum non-crashing value. At max, it’ll only take 24 attempts.

NOTE: I fixed an error above: -16000000 is 16GB.
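
Since each crash takes the whole app down, the bisection has to be driven by hand across separate runs, but the bookkeeping is trivial. Purely illustrative helper (values in KiB, i.e. the abs() of the negative pragma value):

[code]' Hypothetical helper: pass the largest cache value that worked and the
' smallest that crashed, and it hands back the next value to try.
Function NextCacheSizeToTry(lastGoodKiB As Int64, lastBadKiB As Int64) As Int64
  Return (lastGoodKiB + lastBadKiB) \ 2
End Function[/code]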

[quote=427382:@Greg O’Lone]…or crash the app when it runs out of RAM. You’re telling SQLite that this much memory is available for use and it probably doesn’t check on its own.

This line in your crash log seems to indicate that it crashed while trying to allocate 556.2MB.

MALLOC_LARGE_REUSABLE  0000000145c73000-00000001688a9000 [556.2M]

Considering the cache size defaults to 2K, I still think you should try something in between.

You can easily do a binary search to find the optimum non-crashing value. At max, it’ll only take 24 attempts.

NOTE: I fixed an error above: -16000000 is 16GB.[/quote]

Thank you. You have convinced me :slight_smile:

I think you’re both barking up the wrong tree. The Page Cache is just a performance mechanism to cache records (pages, actually) read from the db so they are available for further operations instead of having to read from disk again. I think your actual issue is the total memory available to hold all the records in the Xojo recordset. So you probably need to minimize memory used by all other processes, including the sqlite db library. Increasing the sqlite cache size exacerbates the issue by taking memory away from your Xojo app.

I would start with determining how big this entire recordset is so you can determine if it’s even possible to read the entire set into memory. I also think you need to reassess why you need to read this entire recordset at one time. If you really do, you may need to devise a paging system to read smaller chunks of records. And I assume this is a 64-bit app? A 32-bit app will have definite limitations (I think 2 GB limit on object sizes?).
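
If you do end up paging, avoid OFFSET on a table that size — walking the rowid is far cheaper. A rough, untested sketch using the classic framework (assumes a normal rowid table; myTableName as in your example):

[code]Dim lastRowID As Int64 = 0
Const kChunkSize = 100000
Dim rs As RecordSet

Do
  rs = db.SQLSelect("SELECT rowid, * FROM myTableName WHERE rowid > " _
    + Str(lastRowID) + " ORDER BY rowid LIMIT " + Str(kChunkSize))
  If rs = Nil Or db.Error Or rs.EOF Then Exit Do

  While Not rs.EOF
    lastRowID = rs.Field("rowid").Int64Value
    ' ... process one record here ...
    rs.MoveNext
  Wend
  rs.Close
Loop[/code]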

But the crash isn’t happening in the Xojo app, it’s happening in the DB object. This is why it would seem to make sense to allocate more memory to the DB object.

Yes, it’s 64-bit. I think you’re correct that I am going to have to load it in chunks, but I’m trying out a few different settings before working out a chunking method because the latter is sub-optimal.

Hmm. I don’t know how the sqlite library works (too lazy to dig through the source code), but if it is trying to load the entire recordset into memory before handing it off to Xojo, I hope it’s doing it by reference. Otherwise you would need twice the memory. Also, I think the cache is still a separate memory area from the recordset (within the db library), so the same guidelines I mentioned apply.

Have you tried this query in another query tool? If it works you can monitor its memory usage to get an idea of the memory required.
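
You can also get a lower bound from the database itself before involving any tool — the file size via the page pragmas, plus the row count the SELECT * would have to hold. Untested sketch (error checks omitted):

[code]' On-disk size of the database: page_count * page_size bytes
Dim rsPages As RecordSet = db.SQLSelect("PRAGMA page_count")
Dim rsSize As RecordSet = db.SQLSelect("PRAGMA page_size")
Dim dbBytes As Int64 = rsPages.IdxField(1).Int64Value * rsSize.IdxField(1).Int64Value

' Rows the full SELECT * would return
Dim rsCount As RecordSet = db.SQLSelect("SELECT COUNT(*) FROM myTableName")
Dim rowCount As Int64 = rsCount.IdxField(1).Int64Value

System.DebugLog("DB is " + Str(dbBytes) + " bytes, " + Str(rowCount) + " rows")[/code]

Keep in mind the in-memory recordset will typically be larger than the raw file.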

No, I wouldn’t know where to begin.

Open the SQLite database in a different program. Run the same query.

SQLiteManager from SQLabs is the most well rounded Xojo built SQLite tool in my opinion.
There are free options available, but I don’t have any recommendations.

I made a test DB file filled with dummy data for the 14,6 million records. The .rsd file is 1.24 GB. No matter what the cache setting, the table cannot be loaded. It crashes every time and the report is always the same, indicating the crash is happening in the SQLiteDatabase.dylib when the SELECT * is attempted.
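
In case anyone wants to build a similar test file, a dummy table like this can be generated reasonably quickly with an insert loop inside a single transaction — roughly like so (the column names are placeholders, not my real schema):

[code]db.SQLExecute("BEGIN TRANSACTION")
For i As Integer = 1 To 14600000
  ' someText / someNumber are stand-ins for the real columns
  db.SQLExecute("INSERT INTO myTableName (someText, someNumber) VALUES ('dummy" _
    + Str(i) + "', " + Str(i) + ")")
Next
db.SQLExecute("COMMIT")[/code]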

[quote=427419:@Tim Parnell]Open the SQLite database in a different program. Run the same query.

SQLiteManager from SQLabs is the most well rounded Xojo built SQLite tool in my opinion.
There are free options available, but I don’t have any recommendations.[/quote]

Well I downloaded the demo, but this doesn’t help. It only allows me to load 20 rows. Anyway, if the app is made with Xojo then I predict that it will also crash with the same report when it tries to load the DB.

That’s exactly what Jay and I are asking you to test. If it does, then it may be a Xojo problem. If it doesn’t, then it’s your code :slight_smile:

I’d love to do the test. But sorry, I’m not going to pay $50 to do a test. And I don’t see any way this could be my code. All I do is create a database and load a table. The pragma settings make no difference. If you have a license for SQLiteManager and are willing to help, I’m happy to send you a link to download the DB file so you can try opening it.