Xojo 2020R1.2 has a problem with memory consumption. I have a database whose transactions I need to export sequentially into individual files. There are around 11 million data rows to import and around 20,000 files to write. After 10,000 files Xojo needs 8 GB of memory, after 17,000 files 16 GB. It is not releasing the memory used for each transaction, which I cannot understand, because it is the same routine running 20,000 times.
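For reference, the workload described above can be sketched like this (a minimal sketch with a hypothetical schema, using SQLite in place of the real database; the table name `rows`, the `txn_id` column, and the file naming are all assumptions):

```python
import sqlite3, os, tempfile

# Hypothetical schema: rows grouped by a transaction id, one output file per transaction.
outdir = tempfile.mkdtemp()
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rows (txn_id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO rows VALUES (?, ?)",
                 [(t, f"row-{t}-{i}") for t in range(3) for i in range(4)])

# Stream one transaction at a time instead of holding all rows in memory.
txn_ids = [r[0] for r in conn.execute("SELECT DISTINCT txn_id FROM rows ORDER BY txn_id")]
for txn in txn_ids:
    cur = conn.execute("SELECT payload FROM rows WHERE txn_id = ?", (txn,))
    with open(os.path.join(outdir, f"txn_{txn}.txt"), "w") as f:
        for (payload,) in cur:  # iterate the cursor row by row
            f.write(payload + "\n")

print(len(os.listdir(outdir)))  # 3
```

Written this way, memory use should stay flat per transaction; the reported behavior suggests something in the Xojo database layer is retaining the rows instead.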
Do you use SQLite in WAL or non-WAL mode?
Do you have an example?
Do you use commit in-between?
I personally use huge caches for SQLite, so the database fits in memory if possible.
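The "huge cache" approach mentioned here can be illustrated with SQLite's `cache_size` pragma (a sketch in Python's stdlib `sqlite3`; the 256 MB figure is just an example value, not a recommendation):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A negative cache_size is interpreted as KiB, so -262144 asks for ~256 MB of page cache,
# letting SQLite keep a large working set of the database in memory.
conn.execute("PRAGMA cache_size = -262144")
size = conn.execute("PRAGMA cache_size").fetchone()[0]
print(size)  # -262144
conn.close()
```

Note this only sizes SQLite's own page cache; it does not change how many result rows the client-side driver holds.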
MySQL, and no commit in between; I am only reading from the database, no writing at all.
Do you read 20000 records in one record set?
Xojo caches the records in memory.
You may better read in chunks.
Or use a plugin…
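The chunked-reading advice above would look roughly like this (a sketch using Python's stdlib `sqlite3` as a stand-in for the MySQL driver; `iter_chunks` and the chunk size of 1000 are assumptions):

```python
import sqlite3

def iter_chunks(cursor, chunk_size=1000):
    """Yield rows in fixed-size chunks so only one chunk is in memory at a time."""
    while True:
        rows = cursor.fetchmany(chunk_size)
        if not rows:
            break
        yield rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(2500)])
cur = conn.execute("SELECT id FROM t ORDER BY id")

total = 0
for chunk in iter_chunks(cur, 1000):
    total += len(chunk)  # process/write the chunk, then let it go out of scope
print(total)  # 2500
```

The point is that each chunk becomes garbage as soon as the loop moves on; if memory still grows with every chunk, the caching is happening below the application code, in the driver or plugin.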
I do read it in chunks, and every chunk increases memory consumption further.
No need anymore; I will write it in Java instead, where this is child's play compared to Xojo.
Does it make a difference to use RecordSet instead of RowSet?
Maybe. I have a deprecated one and a non-deprecated one, so I can only use the non-deprecated one.
First things first: can you just try it and let me know the result?
Tried it in 2019 with the old one; it behaves the same as the new one, so it has to do with the plugin. A few million rows blow up the memory use. It's no problem, because this is a small 120-line app that I have now written with another tool, since I had to get the job done. It's really no problem; for me Xojo is only something to try from time to time and for maintaining old apps. But this experiment may be a good hint that the plugin needs reworking.
Ok, well, obviously that's not very good…
Can you post some code showing the problem?
It's not needed; it only shows up with really big databases. With 9 million rows to read it causes problems that would not appear under normal circumstances.