Bulk Insert with Prepared Statements

I have a lot of rows to insert into a MySQL database. I would like to use a prepared statement to do the insert. Is there a way to do a multi-row insert using a prepared statement, rather than iterating over all the values and doing 200K individual inserts? I am currently doing the latter and it is taking WAY too long. I know in PHP, for instance, you can pass arrays instead of single values to a prepared statement. Is anything like that possible with Xojo? I have not tried it yet; I thought I would ask first in case it is documented somewhere, but I am not seeing it. I am pulling records from one database query and then dropping them into another database on another machine.

I am doing something like this now (slooooooow):

      //Already have details from db1 in a db1RecordSet - 200K rows
      //Prepare statement for inserting into db2
      Dim psM as PreparedSQLStatement = db2.Prepare("Insert into ProductDetails (SequenceNumber,ProfitCenter,AvailableQty) Values (?,?,?)")
      psM.BindType(0,MySQLPreparedStatement.MYSQL_TYPE_STRING)
      psM.BindType(1,MySQLPreparedStatement.MYSQL_TYPE_STRING)
      psM.BindType(2,MySQLPreparedStatement.MYSQL_TYPE_DOUBLE)
      
      // Row by row
      While Not db1RecordSet.EOF
        
        //Insert results into db2
        psM.Bind(0,db1RecordSet.Field("SEQUENC_NO").StringValue.Trim)
        psM.Bind(1, db1RecordSet.Field("PROFIT_CENTER").StringValue.Trim)
        psM.Bind(2, db1RecordSet.Field("AVAILABLE_QTY").DoubleValue)
        
        psM.SQLExecute
        db1RecordSet.MoveNext
      Wend
      
      db1RecordSet.Close
      db2.SQLExecute("COMMIT")

What I want to do is move the execute step outside of the loop and just execute it at the end after all the values are loaded into the query. Is this possible?

Don't think so
This did just come up on another thread, but using SQLite.
Big inserts often suffer if you have

  1. lots of triggers
  2. lots of indexes
  3. too many rows wrapped in a single transaction

Usual means to get a ton of speed on a big import

  1. drop triggers
  2. drop indexes
  3. import
  4. recreate triggers & indexes

EDIT : Oh and “No I don’t think so” :stuck_out_tongue:
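For concreteness, that drop/import/recreate cycle might look roughly like this against the ProductDetails table from the first post. This is only a sketch - the index and trigger names are made up, and you would recreate the trigger with its original body:

// Hypothetical names - substitute the table's real indexes and triggers
db2.SQLExecute("DROP TRIGGER IF EXISTS trg_productdetails_ai")
db2.SQLExecute("DROP INDEX idx_profitcenter ON ProductDetails")

// ... run the bulk import here ...

// Rebuild afterwards, once, over the full data set
db2.SQLExecute("CREATE INDEX idx_profitcenter ON ProductDetails (ProfitCenter)")
// recreate the dropped trigger here with its original body (omitted)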

Maybe I should just use a regular string-based SQL statement and pass it a huge concatenated string. Or maybe a temporary text file with a “LOAD DATA INFILE” command?
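For what it's worth, that second option would look something like this - just a sketch, the file path is made up, and the server (and the MySQL plugin) would have to permit LOCAL INFILE:

// Hypothetical path; the CSV would be written out first, one row per line
db2.SQLExecute("LOAD DATA LOCAL INFILE '/tmp/productdetails.csv' " _
  + "INTO TABLE ProductDetails " _
  + "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' " _
  + "(SequenceNumber, ProfitCenter, AvailableQty)")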

If you have lots of triggers & indexes they’ll still suffer
Neither of those methods avoids that

You could create a temporary table with no indexes & no triggers
Insert into that
Then select from it into your real table with all the indexes & triggers in place

But NONE of these will avoid speed hits that indexes & triggers can impose
Hence why with big imports dropping triggers & indexes is often a way to get a ton of speed

Been there done this with some imports that took days to complete (some well over a billion rows)
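A minimal sketch of that staging-table route, reusing the ProductDetails table from the first post (untested; CREATE TABLE ... SELECT is used deliberately because it copies the columns but not the indexes or triggers):

// Bare staging table: same columns, no indexes, no triggers
db2.SQLExecute("CREATE TABLE ProductDetails_staging SELECT * FROM ProductDetails WHERE 1 = 0")

// ... run the prepared-statement insert loop against ProductDetails_staging ...

// Move everything into the real table in one statement, then clean up
db2.SQLExecute("INSERT INTO ProductDetails (SequenceNumber, ProfitCenter, AvailableQty) " _
  + "SELECT SequenceNumber, ProfitCenter, AvailableQty FROM ProductDetails_staging")
db2.SQLExecute("DROP TABLE ProductDetails_staging")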

The other thing you can think about is putting the inserts inside of a database transaction. Then, it’s all in memory until you commit. That usually can give you a HUGE speed gain with lots of inserts.
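With MySQL that can be as simple as bracketing the existing loop; a sketch, assuming autocommit is on (which is MySQL's default):

db2.SQLExecute("START TRANSACTION")

// ... the Bind / SQLExecute loop from the first post goes here, unchanged ...

If db2.Error Then
  db2.SQLExecute("ROLLBACK")   // nothing is kept if any insert failed
Else
  db2.SQLExecute("COMMIT")     // all the inserted rows become visible at once
End If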

There are no triggers or unusual indexes on the table so that is not an issue (although caching is on). I am trying @Bob Keeney's transaction suggestion now - it does seem to be making a difference, although it is still quite slow. I will let you know what I find out tomorrow. I will test the giant SQL string and LOAD DATA INFILE options and post the results.

Thanks for the suggestions!

I’ve got a problem like this. Any update on this?

What type of update did you expect? Norman very clearly laid out the trials and tribulations, and how best to work around them.

Perhaps if you provided details about your particular situation, a targeted solution might be found

This is the type of update I would expect.

Since that was TWO years ago, he may not even remember the project, let alone what he did :slight_smile:

Maybe, maybe not. But I saw him online this morning, so I thought I would ask. OK?

[quote=351904:@Duane Mitchell]This is the type of update I would expect.
[/quote]

Ya know - I really do try to put the answers I stumble upon back up here for reference but you caught me on this one - sorry :slight_smile:

In the end, what I found to be the best route was based on what @Norman Palardy suggested: create a new table with the same structure as the target table (no indexes), insert the data into that table, then rename the tables so that the new table replaces the previous one. That was the fastest and ensures there is never any table with partial data in it. (My table is live the entire time during the inserting.) I also wrapped the whole thing in a transaction.
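Roughly, that looks like the following. It is only a sketch - the _new/_old names are made up, and note that CREATE TABLE ... LIKE copies the full structure including any keys, whereas I built mine without indexes:

// Shadow table with the same structure as the live table
db2.SQLExecute("CREATE TABLE ProductDetails_new LIKE ProductDetails")

db2.SQLExecute("START TRANSACTION")
// ... prepared-statement inserts go into ProductDetails_new here ...
db2.SQLExecute("COMMIT")

// Atomic swap: readers see either the old data or the complete new data, never partial data
db2.SQLExecute("RENAME TABLE ProductDetails TO ProductDetails_old, ProductDetails_new TO ProductDetails")
db2.SQLExecute("DROP TABLE ProductDetails_old")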

Something else you can try is to create a prepared statement which is a multi-row insert, like this:

Dim psM as PreparedSQLStatement = db.Prepare("Insert into myTable (rowA, rowB,rowC) Values (?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?),(?,?,?)")

//whatever works for you
psM.BindType(0,MySQLPreparedStatement.MYSQL_TYPE_STRING) 
psM.BindType(1,MySQLPreparedStatement.MYSQL_TYPE_STRING)
psM.BindType(2,MySQLPreparedStatement.MYSQL_TYPE_STRING)

psM.BindType(3,MySQLPreparedStatement.MYSQL_TYPE_STRING) 
psM.BindType(4,MySQLPreparedStatement.MYSQL_TYPE_STRING)
psM.BindType(5,MySQLPreparedStatement.MYSQL_TYPE_STRING)

...

psM.Bind(0,myValueA1)
psM.Bind(1,myValueB1)
psM.Bind(2,myValueC1)

psM.Bind(3,myValueA2)
psM.Bind(4,myValueB2)
psM.Bind(5,myValueC2)

...

psM.SQLExecute

Then with every SQLExecute you are inserting 10 rows (or however many). You can make this a loop and iterate through it, inserting at the appropriate interval if you want. This will speed up the inserts, but you have to be careful about iterating through your data and the number of rows in it, and about what to do with the final rows that do not fill a complete batch. You might just end up with some blank rows in your database, which you can delete easily, but you also might get an error, so you need to work that out.
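A sketch of what that batching loop could look like, reusing the ProductDetails columns from the first post. The batch size, the psTen/psOne names, and the fallback single-row statement for the leftover rows are all just illustrative:

Const kRowsPerBatch = 10

// Build the ten-row version of the insert: "(?,?,?)" repeated ten times
Dim groups() As String
For i As Integer = 1 To kRowsPerBatch
  groups.Append("(?,?,?)")
Next
Dim psTen As PreparedSQLStatement = db2.Prepare("Insert into ProductDetails (SequenceNumber,ProfitCenter,AvailableQty) Values " + Join(groups, ","))
For i As Integer = 0 To kRowsPerBatch - 1
  psTen.BindType(i * 3 + 0, MySQLPreparedStatement.MYSQL_TYPE_STRING)
  psTen.BindType(i * 3 + 1, MySQLPreparedStatement.MYSQL_TYPE_STRING)
  psTen.BindType(i * 3 + 2, MySQLPreparedStatement.MYSQL_TYPE_DOUBLE)
Next

// Single-row statement for whatever is left at the end
Dim psOne As PreparedSQLStatement = db2.Prepare("Insert into ProductDetails (SequenceNumber,ProfitCenter,AvailableQty) Values (?,?,?)")
psOne.BindType(0, MySQLPreparedStatement.MYSQL_TYPE_STRING)
psOne.BindType(1, MySQLPreparedStatement.MYSQL_TYPE_STRING)
psOne.BindType(2, MySQLPreparedStatement.MYSQL_TYPE_DOUBLE)

// Buffer rows, flushing ten at a time
Dim seq() As String
Dim pc() As String
Dim qty() As Double

While Not db1RecordSet.EOF
  seq.Append(db1RecordSet.Field("SEQUENC_NO").StringValue.Trim)
  pc.Append(db1RecordSet.Field("PROFIT_CENTER").StringValue.Trim)
  qty.Append(db1RecordSet.Field("AVAILABLE_QTY").DoubleValue)

  If seq.Ubound + 1 = kRowsPerBatch Then
    For i As Integer = 0 To kRowsPerBatch - 1
      psTen.Bind(i * 3 + 0, seq(i))
      psTen.Bind(i * 3 + 1, pc(i))
      psTen.Bind(i * 3 + 2, qty(i))
    Next
    psTen.SQLExecute   // one round trip inserts ten rows
    ReDim seq(-1)
    ReDim pc(-1)
    ReDim qty(-1)
  End If

  db1RecordSet.MoveNext
Wend

// Leftover rows (fewer than a full batch) go in one at a time - no blank rows, no errors
For i As Integer = 0 To seq.Ubound
  psOne.Bind(0, seq(i))
  psOne.Bind(1, pc(i))
  psOne.Bind(2, qty(i))
  psOne.SQLExecute
Next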

This gave me satisfactory results so I did not ever experiment with writing a text file and doing a Load Data Infile approach.

It is always OK to ask - I did it to someone else on a two year old thread earlier this week so no worries.

I have THE WORST memory - seriously, my wife jokes that I have Alzheimer's sometimes, but I have always been like this. My mental hard drive frees up disk space with reckless abandon. Somehow, though, I remember this project and problem quite clearly. Whether the information will be useful is another matter entirely.

I’m sorry I forgot what the discussion was [smile]

[quote]Ya know - I really do try to put the answers I stumble upon back up here for reference but you caught me on this one - sorry :)[/quote]

I think most have the intention of circling back to update a thread and I must say I don’t always do so myself. So I thought I’d take a shot at this since I saw the little green ball next to your name this morning indicating you were online.

Glad this worked for you. I am doing an UPDATE on rows that already exist with some new data. Not sure if this will work but I’ll play with this.

Thanks.

They say your memory is the second thing to go. I don’t remember what the first is!

I made this one change and had over a 60% performance improvement.

I am updating existing records, not inserting new ones. There are about 9,200 records to update in a table of 21,000. Without the transaction it took about 21 minutes. With the transaction it's about 8 minutes.

That might be good for now.

I should add that to get the 21,000 records into the table in the first place, I inserted them in batches of 1,000, and the total time for that was about 2.5 minutes. Quite a difference. I'm using a prepared statement in the update, not the insert.