SQL Insert vs Insertrecord

No, but that may be my next step. Thanks for the idea.

Right now I’m running some speed tests to eliminate a couple of “commits” that are occurring in the import. I just ran a test eliminating one of them just to make sure that everything would get updated properly.

I have to use the LASTROWID in order to get the foreign key that needs to be saved to one of the tables. The data being imported is the actual code, but my table saves it to a lookup table and then needs the foreign key to save to the accounts table. Not sure why I left the commit in there but I normally do that during testing so I can trap for any DB errors.
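A minimal sketch of that lookup-then-foreign-key pattern in classic Xojo. Everything here is assumed for illustration: `db` is a SQLiteDatabase, and the `lookup`/`accounts` table and `code` column names are placeholders, not the poster's actual schema.

```xojo
// Insert the imported code into the lookup table, then capture the new
// primary key via LastRowID to use as the foreign key on the account row.
Dim lookupRec As New DatabaseRecord
lookupRec.Column("code") = cData
db.InsertRecord("lookup", lookupRec)  // "lookup" is a placeholder table name

If db.Error Then
  // Trapping DB errors here is useful during testing
  MsgBox(db.ErrorMessage)
End If

// LastRowID returns the ROWID of the row just inserted (SQLite's LASTROWID),
// which then gets saved to the accounts table as the foreign key.
Dim fk As Integer = db.LastRowID
```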

The impact of this is that during a 5,000-record import, there are 9,000+ commits to the lookup table. There are as many as 11 values that might need to be saved to the lookup table for each account.
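One common way to collapse thousands of per-row commits is to wrap the whole import in a single transaction. This is a sketch, not the poster's actual code; it assumes a SQLiteDatabase named `db` and a placeholder `importRows` collection.

```xojo
// Sketch: one transaction around the whole import instead of a commit per row.
// In SQLite, each commit forces a flush to disk, so batching them is the big win.
db.SQLExecute("BEGIN TRANSACTION")

For Each row As String In importRows  // importRows is a placeholder
  // ... insert the lookup values and the account record for this row ...
Next

db.Commit  // a single commit for all 5,000 records
```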

I’m now in the process of eliminating another commit that I didn’t realize was there to see if that will speed it up further.

After that, it might be time to take a look at the profiler and see where the time is being spent to see if it can be further optimized.

The profiler is useful but keep in mind that it will only time your methods, not framework calls.

I’ve run a number of timing tests and the profiler seems to be fairly accurate. I can’t really separate out the framework calls, but I’m going to assume they are minor compared to the rest of the code being executed.

Here is the most significant item that I discovered. Approximately 40% of the total time for the import is taking place in ONE line of code, and that line is this:

cData = Excel.Range(cell).value

As you can see, all this is doing is reading the cell contents from the Excel spreadsheet. Removing 3 commits dropped the total time by about 10%. Not as big a savings as I would have thought.

Now if I could find a way to speed up this one line of code, maybe I could make up some real time savings.

This is as I suspected, and I renew my recommendation that you grab a row or even a sheet at a time, then parse the cells from that. It will speed things up dramatically.

Kem

I like your idea and would be willing to try it, but I’m afraid I don’t know how to do that in Xojo. If you have some code snippets that I can use, I’d be more than willing to try it.

One of the features we are going to add is an option to convert the Excel file to a CSV file and then import from it. When we do that, we read the entire file in, create an array containing each row, and then use the NthField() function to read each cell value.
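That CSV pass might look roughly like this. This is a sketch under assumptions: `csvFile` is a FolderItem pointing at the exported CSV, and note that NthField is 1-based and does not honor commas inside quoted fields, so it only works for simple CSV data.

```xojo
// Read the whole CSV at once, split into rows, then pull each cell with NthField.
Dim t As TextInputStream = TextInputStream.Open(csvFile)  // csvFile is assumed
Dim rows() As String = Split(t.ReadAll, EndOfLine)
t.Close

For Each line As String In rows
  For i As Integer = 1 To CountFields(line, ",")
    Dim cellValue As String = NthField(line, ",", i)  // NthField is 1-based
    // ... use cellValue ...
  Next
Next
```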

I do have code to open the Excel file and then do a Save As to save it as a CSV file.

Unfortunately, I’m on a Mac (as mentioned), so I only know the AppleScript way, but I do know that it can be done. It may be as simple as defining your range as a row instead of a single cell, e.g., “A1:G1”, and you should get the result back as an array.

Once you do that, there will be little benefit in adding CSV as an intermediate step.

Wow, if it came back as an array, that would be excellent. I’m going to give it a try and see if I can get it to work.

John

See if there is something like:

cData = Excel.Range("A1:G1").ValueArray

If I’m right, that’s how you access it, and it will be an array of Variant.

Tried it and Xojo generates an OLEException: “Unknown name, (failed on ValueArray)”. I thought you were on to something that would work.

Well, it was just a shot. Usually I’d try these things myself, but alas… Anyway, see if Value will return the array.

Kem, it was a great idea and you only missed one thing. You have to declare the array as a variant.

Dim cArrayValues As Variant

So you’ve got that working now?

Dang, I hate when that happens…

dim cArrayValues As Variant
cArrayValues = Excel.Range(StartCell + ":" + EndCell).Value

This works and I do end up with an array with the proper number of elements, but I have to write some code that reads the array and checks each element against the type of value it should contain, to make sure it worked. Right now, if I set a breakpoint, I can’t see the values of the array in the debugger.

I think it might work, but I’ll let you know after a little more work to verify that the values being added to the array are correct.

Excellent. The next step, potentially, would be to read the entire sheet instead of each line, but this might boost processing speed sufficiently.

Kem, Not so fast. I get an array with the correct number of elements, but nothing in them. Tried a bunch of code to get the data in the array, but no go. Any ideas?

So picky. Data is overrated. :)

This is where I hope someone who has actually done this through the Xojo classes will chime in. In the meantime, are you sure you’re getting the right range?

Yes, I’m reading one row with data in every column. Yes, we need an office automation expert and I think I know one.

Just sent a private message to Eugene Dakin who appears to know more about Office Automation than anyone else I’ve talked to. I’ll let you know what his response is.

Recently I wrote a small app to import Excel data into an SQLite database in Xojo. I tried the Xojo Excel class and was having speed issues, so I tried the MBS XL plugin that Christian mentioned. To me, it was much easier to code, works perfectly, is very fast and does not require Excel to be installed. I just had the app import a mailing list spreadsheet and timed it (not very accurately, but good enough). The spreadsheet is in the new Excel “.xlsx” format and takes right at 5 seconds to import 27,618 rows with 9 columns. My code is using DatabaseRecord to do the inserts. It is not free but it solved my problem and I really like using the plugin versus having the Excel dependency. If you would like to send me a sample spreadsheet, I would be glad to see how long it takes to import.
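For reference, the DatabaseRecord insert pattern Brandon describes looks roughly like this in classic Xojo. The table name, column names, and value variables are placeholders, and `db` is assumed to be an already-connected SQLiteDatabase; this is not his actual import code.

```xojo
// Sketch: one DatabaseRecord insert per imported spreadsheet row.
Dim rec As New DatabaseRecord
rec.Column("name") = nameValue    // nameValue/emailValue are placeholders
rec.Column("email") = emailValue
db.InsertRecord("mailing_list", rec)  // "mailing_list" is a placeholder table

If db.Error Then
  MsgBox(db.ErrorMessage)  // surface any insert error during testing
End If
```

As discussed earlier in the thread, wrapping all of these inserts in a single transaction is what keeps a 27,000-row import in the seconds range.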

Brandon, thanks for that information. I already use a number of MBS’s plugins and can’t say enough about them. Perhaps that would be the best option and as you say, your timings are excellent compared to what I have experienced using the office automation plugin.

Looks like MBS may be the answer I’m looking for.