I'm trying to understand how to improve the speed with which a few thousand records can be inserted into a MySQL table using a single transaction. The circumstances are that there are about 10 users; they receive an Excel file with an arbitrary number of columns, do a little preparation, and save it to a tab-delimited file. The first row of the file contains the column names and the remaining rows are data. As it is now, the data is read one row at a time and inserted with an individual INSERT statement. Prepared statements can't be used because the columns vary from file to file.
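For illustration, one thing I've been looking at is batching many rows into a single multi-row INSERT instead of one statement per row, since that cuts the number of round trips even inside one transaction. A rough Python sketch of building such a statement (table and column names are just placeholders, and the quoting here is simplistic; real code should escape through the database driver):

```python
def build_multi_row_insert(table, columns, rows):
    """Build one INSERT ... VALUES (...), (...) statement for many rows."""
    col_list = ", ".join("`%s`" % c for c in columns)
    value_groups = []
    for row in rows:
        # Naive quoting for the sketch only; a driver-level escape is safer.
        escaped = ", ".join("'%s'" % v.replace("'", "''") for v in row)
        value_groups.append("(%s)" % escaped)
    return "INSERT INTO `%s` (%s) VALUES %s" % (
        table, col_list, ", ".join(value_groups))

sql = build_multi_row_insert("tmp_import", ["name", "qty"],
                             [["widget", "3"], ["gadget", "7"]])
# One statement now carries both rows.
```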
The above is done using a temporary table; there is never any reason to make it permanent. Two more operations are then performed on the temporary table, and when they are done the table goes away. First some columns are added to the table, and then a join is performed to bring data into the new columns. A list box is populated and the user downloads the data to a tab-delimited file. There is no need to keep any of this data in the database, so temporary tables work well: they just go away.
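To make the temp-table lifecycle concrete, here are the three SQL steps as Python strings. The table, column, and join-target names are entirely made up for the sketch; the real names come from the uploaded file and the lookup table:

```python
# 1. Create the throwaway table (MySQL drops it when the connection closes).
create = ("CREATE TEMPORARY TABLE tmp_import "
          "(name VARCHAR(255), qty VARCHAR(255))")

# 2. Add the extra columns that the join will fill.
add_cols = ("ALTER TABLE tmp_import "
            "ADD COLUMN price DECIMAL(10,2), "
            "ADD COLUMN category VARCHAR(64)")

# 3. Populate the new columns from a permanent table via UPDATE ... JOIN.
fill = ("UPDATE tmp_import t JOIN products p ON p.name = t.name "
        "SET t.price = p.price, t.category = p.category")
```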
I believe this comes down to how I structure the code to make this work correctly. Here is my pseudo code describing where I'm at:
    db.SQLExecute("START TRANSACTION")
    While Not input.EOF 'read the data file
      If vLineNumber = 1 Then 'this is the column names, read them and create the table
        // CREATE THE TEMPORARY TABLE
      Else
        // ADD THE INDIVIDUAL RECORDS
        // REPEAT THROUGH EACH ROW, READING AND INSERTING
        line = input.ReadLine 'should be line 2 the first time
      End If
    Wend
    db.Commit
In the current structure the transaction begins before the table is created, which I don't think is right. And I'm reading one line at a time. Maybe I need to restructure the file reading to read the file twice: once for the header row, then close it, reopen it, and run through the inserts beginning at row 2.
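To show the kind of restructuring I mean, here is a Python sketch of a single-pass version: the header is consumed first to create the table, the transaction starts only after that, and the remaining rows are inserted in batches. The `execute` parameter stands in for the database call (`db.SQLExecute` in my pseudo code), and all names are placeholders; the quoting is again simplified:

```python
def import_rows(execute, lines, table, batch_size=500):
    """Create a temp table from the header line, then insert the data
    rows in batched multi-row INSERTs inside one transaction."""
    it = iter(lines)
    header = next(it).rstrip("\n").split("\t")  # first line = column names
    cols_sql = ", ".join("`%s` TEXT" % c for c in header)
    execute("CREATE TEMPORARY TABLE `%s` (%s)" % (table, cols_sql))
    execute("START TRANSACTION")  # only after the table exists
    batch = []
    for line in it:  # every later line = one data row
        row = line.rstrip("\n").split("\t")
        batch.append("(%s)" % ", ".join(
            "'%s'" % v.replace("'", "''") for v in row))
        if len(batch) >= batch_size:
            execute("INSERT INTO `%s` VALUES %s" % (table, ", ".join(batch)))
            del batch[:]
    if batch:  # flush the final partial batch
        execute("INSERT INTO `%s` VALUES %s" % (table, ", ".join(batch)))
    execute("COMMIT")
```

With this shape the file is only opened once, so reopening it for row 2 onward would be unnecessary.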
Thanks for any suggestions on how to insert a table of data using a transaction.