Normal time to send data to DB

Hi Everyone,
It takes about 45 sec to send 6000 rows (row by row, 22 fields each) to a SQLite DB.
Is that a normal time to send 132 000 values? Or is it because I read my CSV (TextInputStream.ReadLine)… then send it to the DB in the same process (read, write, read, write…)?
Is there a way to speed up the process?
Thanks

Around 300 Rows in each Record ?
The csv holds 6000 lines…
Not a hard job.

45 seconds: Unless your clients have to do that each and every day, add a ProgressBar and let the user occupy their time doing something else… Yes / No?

What OS (I guess Windows) ?

What Xojo version ?

Environment ?
(Computer: CPU Speed / SSD / RAM ?)
(Software: only Xojo ? Only the stand alone ?)

And so on…

Have you got the updates within a transaction?

The speed difference between updates within a transaction and updates with autocommit on is massive.

Yes, just put a BEGIN TRANSACTION before and COMMIT after to make it faster.

Also have a big cache.
https://www.mbsplugins.de/archive/2017-10-16/Big_SQLite_Cache_for_64-bit_in/monkeybreadsoftware_blog_newest
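The cache size is a per-connection setting in plain SQLite, controlled by the `cache_size` pragma (the MBS link above wraps the same idea). A hedged sketch in Python's stdlib `sqlite3`, since the pragma is SQLite-level rather than Xojo-specific; the 200 MB figure is just an example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# A negative cache_size is interpreted by SQLite as a size in KiB;
# -200000 asks for roughly 200 MB of page cache for this connection.
conn.execute("PRAGMA cache_size = -200000")

size = conn.execute("PRAGMA cache_size").fetchone()[0]
print(size)  # -200000
```

In Xojo you would issue the same `PRAGMA cache_size = …` via SQLExecute right after connecting.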

[quote=427191:@Christian Schmitz]Yes, just put a BEGIN TRANSACTION before and COMMIT after to make it faster.
[/quote]
Just to clarify, what Christian means is to only begin the transaction at the start of the entire import and then commit the whole import at the end, not creating a transaction for each row.
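To illustrate the point outside of Xojo (the behaviour is SQLite-level, so a Python `sqlite3` sketch shows the same shape; table and column names are made up):

```python
import sqlite3

# isolation_level=None puts the connection in autocommit mode,
# so BEGIN/COMMIT are fully under our manual control.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE client (sku TEXT, qty INTEGER)")

rows = [("SKU-%d" % i, i) for i in range(6000)]

conn.execute("BEGIN")  # one transaction around the ENTIRE import...
for sku, qty in rows:
    conn.execute("INSERT INTO client (sku, qty) VALUES (?, ?)", (sku, qty))
conn.execute("COMMIT")  # ...committed once at the end, not once per row

count = conn.execute("SELECT COUNT(*) FROM client").fetchone()[0]
print(count)  # 6000
```

Without the BEGIN/COMMIT pair, SQLite wraps every single INSERT in its own implicit transaction and has to sync to disk each time, which is where the 45 seconds go.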

Other than that, we’d need to see your code to determine if there’s anything else that could be slowing it down.

I wrote up two blog posts recently about timing of Inserts and Updates with and without transactions.

https://www.bkeeneybriefs.com/2019/02/database-transactions-in-xojo/
https://www.bkeeneybriefs.com/2019/02/updating-records-with-transactions/

In short, if you’re not doing multiple inserts/update inside a transaction you’re doing it slow.

Yes, I agree with Bob: insert 1000 rows at a time using SQLExecute; otherwise it is slower to much slower.

The same applies to local and remote databases.
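Batching is the same idea whether you build one multi-row INSERT or execute a prepared statement over a chunk at a time. A sketch in Python's `sqlite3` (the 1000-row chunk size is just the example figure from above):

```python
import sqlite3

conn = sqlite3.connect(":memory:", isolation_level=None)  # manual transactions
conn.execute("CREATE TABLE client (sku TEXT, qty INTEGER)")

rows = [("SKU-%d" % i, i) for i in range(6000)]
chunk_size = 1000  # example batch size, tune to taste

conn.execute("BEGIN")
for start in range(0, len(rows), chunk_size):
    chunk = rows[start:start + chunk_size]
    # One prepared statement executed for the whole chunk.
    conn.executemany("INSERT INTO client (sku, qty) VALUES (?, ?)", chunk)
conn.execute("COMMIT")

count = conn.execute("SELECT COUNT(*) FROM client").fetchone()[0]
print(count)  # 6000
```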

[quote=427184:@Denis Despres]Hi Everyone,
It takes about 45 sec to send 6000 rows (row by row, 22 fields each) to a SQLite DB.
Is that a normal time to send 132 000 values? Or is it because I read my CSV (TextInputStream.ReadLine)… then send it to the DB in the same process (read, write, read, write…)?
Is there a way to speed up the process?
Thanks[/quote]

There are so many unknowns here it is hard to comment…

First of all, when you say "send rows to a database", what do you actually mean? Just inserting rows into a single table or into multiple tables? Or are you inserting, updating and deleting at the same time, on a single table or on multiple tables?

Secondly, there is the structure of your database: is there an auto-generated primary key? Are there unique field constraints, foreign key constraints, or indexes and composite indexes to be maintained? And of course the size of the database and of the table can also have an impact.

And then there is the actual Xojo code being used to perform the operation: are you building SQL commands as strings, using DatabaseRecord, or using prepared statements?

The more specific the information you give, the better the suggestions we can give.

OK, this is my code as-is (before applying any of your advice).
This method takes as a parameter an array of 22 items pulled from a TextInputStream.ReadLine loop
over a CSV file of 6000 rows:

[code]s = tis.ReadLine
fields = Split(s, Chr(124)) ' split by pipe[/code]

Parameter: PushClientCSV() from fields()

[code]Dim dbFile As FolderItem
Dim db As New SQLiteDatabase
dbFile = GetFolderItem(BDRename+".sqlite")
db.DatabaseFile = dbFile

If db.Connect Then
Dim ps As SQLitePreparedStatement = _
db.Prepare( "INSERT INTO CLIENT (Sku1,Sku2,Sku3,Sku4,idProduct,Reg,special,Start,EndD,Ucode,Dep,Loc,Bin,Store,Event,Saved,Label,Qty,Size,UM,Page,ExtractDate) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)" )

for i as integer = 0 to PushClientCSV.Ubound ' Ubound is the last index, so loop to Ubound to bind all 22 values
ps.Bind(i, PushClientCSV(i), SQLitePreparedStatement.SQLITE_TEXT)
next

ps.SQLExecute

If db.Error then
MsgBox("Database error: " + db.ErrorMessage)
db.Rollback
Else
db.Commit
End If
Else
MsgBox("The database could not be opened. Error: " + db.ErrorMessage)
End If[/code]

Insert into a fresh, empty single table with an auto-generated primary key.
Xojo 18.4 on a MacBook Air, Mojave, 32 GB RAM, SSD, for a local standalone app.

Thanks

But where exactly is the overall loop that reads the rows from the text file?

@James Dooley

[code]While Not tis.EOF
  s = tis.ReadLine
  fields = Split(s, Chr(124)) ' split by pipe

  SQLInsertToClient(fields) ' send array fields() to method SQLInsertToClient
Wend[/code]

Ok, so you’re opening the db then doing the insert for each row you’re inserting. This will never be very quick for bulk inserting a lot of rows. For the best performance, you need to roll the entire read-from-csv and write-to-db operation into a single loop, and before the loop open the db and begin a transaction, then after the loop commit the transaction.

For any application, you should usually open the DB only once, assign the connection to a global variable, and use it throughout the application, instead of opening the DB every place you need it.

As Jay says…

  • Open the database before you enter the loop and use the same connection in the loop
  • Prepare the statement before entering the loop and define the binding
  • In the loop just bind the data and execute the statement

Once you are there have a look at the transaction stuff mentioned earlier.
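Putting the checklist together — connect once, prepare once, bind and execute inside the loop, one transaction around the whole import — looks roughly like this. A Python `sqlite3` sketch of the same structure (the inline sample data, table and columns are made up stand-ins for the real 6000-line pipe-delimited file):

```python
import io
import sqlite3

# Made-up pipe-delimited data standing in for the real CSV file.
csv_text = "A1|10\nA2|20\nA3|30\n"

conn = sqlite3.connect(":memory:", isolation_level=None)  # connect ONCE
conn.execute("CREATE TABLE client (sku TEXT, qty INTEGER)")

sql = "INSERT INTO client (sku, qty) VALUES (?, ?)"  # statement defined once

conn.execute("BEGIN")  # one transaction around the whole import
for line in io.StringIO(csv_text):  # stands in for the ReadLine loop
    fields = line.rstrip("\n").split("|")  # split by pipe
    conn.execute(sql, (fields[0], int(fields[1])))  # bind + execute per row
conn.execute("COMMIT")

count = conn.execute("SELECT COUNT(*) FROM client").fetchone()[0]
print(count)  # 3
```

The key structural change from the original code is that nothing expensive (connect, prepare, commit) happens inside the per-row loop.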

Yes, it works! I've applied every piece of advice (open the DB once / set a big cache / BEGIN TRANSACTION / COMMIT).
The writing time is down from 45 sec to 3 sec.
Thank you, all of you.

And don’t read line by line - that is slow.

Use ReadAll and split it into lines, then split each line into your fields.
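The ReadAll-then-split approach sketched in Python (the inline sample data is made up; it stands in for the result of ReadAll on the real file):

```python
# Read the whole file in one go, then split in memory --
# one big read instead of thousands of small ones.
data = "A1|10\nA2|20\nA3|30"  # stands in for tis.ReadAll / open(path).read()

lines = data.split("\n")                       # split into lines
records = [line.split("|") for line in lines]  # split each line by pipe

print(len(records))  # 3
print(records[0])    # ['A1', '10']
```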

On Windows (and/or Linux)? Probably.

On macOS? No. I loaded a 60,000-line .csv file (rows with at least 2 of 16 columns filled) in less than 2 seconds, reading line after line (and checking each atomic part: are there quotes? the delimiter, etc.) into a Listbox. The test file size is around 900 KB.
Saving into a new SQLite file took 2 to 3 seconds (while reading the data from the Listbox).

No ProgressBar. No Listbox refresh while reading the csv.

@Emile: SSD or hard disk?

With an SSD it will not make much difference, but with a hard disk