DatabaseRow Loops Slower Than RowSet Loops


I have a MySQL query with more than 4000 rows that I need to loop through. I just realized that if I use RowSet.Column(), it’s much faster than if I use DatabaseRow.Column().

var rs as RowSet = mysqldatabase.SelectSQL("SELECT something FROM table")
while rs <> nil and rs.AfterLastRow = false
  // do something here
  rs.MoveToNextRow
wend
is faster than

var rs as RowSet = mysqldatabase.SelectSQL("SELECT something FROM table")
var row as DatabaseRow
for each row in rs
  // do something here
next

For 4000+ rows, the difference is 0.15 vs 2.50 seconds (94% reduction). Anybody else see something similar?

You can make it even faster by removing one of the tests for each iteration:

var rs as RowSet = mysqldatabase.SelectSQL("SELECT something FROM table")
if rs <> Nil Then
  While rs.AfterLastRow = False
    // do something here
    rs.MoveToNextRow
  Wend
End If

In your case this checks rs <> Nil only once, saving 3,999 unnecessary checks and a little time.


Well then, I guess that’s another API2 feature I should stop using. Shame, it’s much nicer.

Compiled app or through the ide?

Compiled app for Mac

In my tests, WHILE was faster than FOR EACH as well. I use a Postgres database; I tried it out of curiosity.



I would say this time difference is not surprising; if you use DatabaseRow, Xojo creates a new DatabaseRow object for every record in the RowSet. So it's copying the data 4000 times.


It might be worthwhile creating a feedback case so that an engineer can look at it and hopefully improve performance in a future version.


Improve what, a creation time of 0.0005875 seconds per DatabaseRow?
IMHO you don't know what you're doing if you create thousands of DatabaseRows in a single loop.

Edit: Sorry, I wrote RowSet instead of DatabaseRow.

In this case, I'm copying data from a MySQL database to an in-memory SQLite database. This initial data transfer makes the rest of the calculation much faster.

You might want to try using a transaction: begin it before your loop, and commit it after your loop.
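A minimal sketch of that idea, assuming Xojo API 2.0's Database.BeginTransaction/CommitTransaction methods and the `sqlitedatabase` connection name used elsewhere in this thread:

```xojo
// Commit once for the whole batch instead of once per statement.
// SQLite otherwise wraps every INSERT in its own implicit transaction.
sqlitedatabase.BeginTransaction
Try
  sqlitedatabase.ExecuteSQL(itext) // your accumulated INSERT statement(s)
  sqlitedatabase.CommitTransaction
Catch e As DatabaseException
  sqlitedatabase.RollbackTransaction
End Try
```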


Seriously? Is it re-running the query each iteration?

Hi, my code is basically like this:

var rs as RowSet = mysqldatabase.SelectSQL("SELECT something FROM table")
var itext as String
var i as Integer
if rs <> nil then
  itext = "INSERT INTO table (something) VALUES "
  while rs.AfterLastRow = false
    if i > 0 then itext = itext + ","
    itext = itext + "('" + rs.Column("something").StringValue + "')"
    i = i + 1
    rs.MoveToNextRow
  wend
  if i > 0 then sqlitedatabase.ExecuteSQL(itext)
end if

It runs the SQLite INSERT only once. I've also written versions where the INSERT is split into blocks of 300-500 rows.
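That chunking could look roughly like this (a sketch, assuming an open RowSet named `rs` and an invented batch-size constant `kBatchSize`):

```xojo
// Flush the accumulated VALUES every kBatchSize rows so each
// statement stays a manageable size.
Const kBatchSize = 400
Var values() As String
While rs.AfterLastRow = False
  values.Add("('" + rs.Column("something").StringValue + "')")
  If values.Count >= kBatchSize Then
    sqlitedatabase.ExecuteSQL("INSERT INTO table (something) VALUES " + String.FromArray(values, ","))
    values.RemoveAll
  End If
  rs.MoveToNextRow
Wend
If values.Count > 0 Then // flush the remainder
  sqlitedatabase.ExecuteSQL("INSERT INTO table (something) VALUES " + String.FromArray(values, ","))
End If
```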

Around many inserts, please always use a transaction.


I’d also consider the “do something here”. Using the first “faster” method probably means doing more work (if for instance you were reformatting the tables) than if you had the DatabaseRow objects created for you. Perhaps Xojo spends more time generating objects up front to make future operations faster on the back end? So it could depend on how much “do something” you actually do.

Hi, the internal part is documented here

I understand what you want to do.
This is not the best way to do it; although it is possible, it is slow.

I recommend that you use delimited text files and import them directly using SQLite.

If what you want is data migration from legacy systems to modern databases, there are specific tools for that, and they are great.

You can do it quickly, safely, and automated.

If doing this is what you want, always use a transaction, as Christian Schmitz recommends.

And if you want to speed this up further, use an array instead of string concatenation, add the INSERTs to it, and join it once at the end.
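For example, a sketch of the array approach, using the same `rs` and `sqlitedatabase` objects as in the code earlier in the thread:

```xojo
// Collect each row's VALUES fragment, then join once with
// String.FromArray instead of growing one big string repeatedly.
Var parts() As String
While rs.AfterLastRow = False
  parts.Add("('" + rs.Column("something").StringValue + "')")
  rs.MoveToNextRow
Wend
If parts.Count > 0 Then
  sqlitedatabase.ExecuteSQL("INSERT INTO table (something) VALUES " + String.FromArray(parts, ","))
End If
```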