Speeding up the population of Oracle data


Below is my logic to populate Oracle data.

  1. Select from Oracle: 5,000 rows (10 columns), on a remote server
  2. Insert into PostgreSQL in a loop (PostgreSQLPreparedStatement / internal repository)
  3. Update PostgreSQL with other local data
  4. Finally, populate it to a Listbox

It takes about 15 seconds, and most of the time is spent in step #2.

Do you have any ideas to speed up these steps?


Without knowing just how much data is involved, have you considered:

1. select from Oracle
2. Insert into in memory SQLite DB
3. Update SQLite DB with other data
4. Populate listbox

And potentially if required
5. Insert into PostgreSQL database in a thread

In-memory SQLite DBs are incredibly fast but non-persistent (obviously). I don’t know whether you need step 5 or not, but the user will get a very snappy response.
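The in-memory staging idea can be sketched in Xojo like this (a minimal sketch: to my knowledge an SQLiteDatabase with no DatabaseFile assigned connects to a memory-only database; the `staging` table and its columns are purely illustrative):

```xojo
' In-memory SQLite staging database (no DatabaseFile assigned -> memory only).
Dim memDB As New SQLiteDatabase
If memDB.Connect Then
  ' Staging table for the rows coming from Oracle (illustrative columns).
  memDB.SQLExecute("CREATE TABLE staging (code TEXT, qty INTEGER)")
  memDB.SQLExecute("BEGIN TRANSACTION")
  ' ... insert the 5,000 Oracle rows here, then query them for the Listbox ...
  memDB.SQLExecute("COMMIT")
Else
  MsgBox("SQLite connect failed: " + memDB.ErrorMessage)
End If
```

Everything lives in the app's memory, so there is no network round trip at all while the user is waiting.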

Good idea.
However, the ‘other data’ is saved in the internal repository (PostgreSQL) and the two need to be joined together, so using SQLite could be a burden.

Just in case, here is my code for step #2.
With this loop, 5,000 rows are inserted into INTERNAL_REP and it takes 13 seconds. That is the problem.

        logging("TimeCheck : Before PostgreSQLPreparedStatement")
        Dim ps As PostgreSQLPreparedStatement
        ps = mPostgreSQLDB.Prepare("INSERT INTO INTERNAL_REP VALUES($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12)")
        While Not rs.EOF
          ps.Bind(0, rs.IdxField(1).StringValue)
          ps.Bind(1, rs.IdxField(2).StringValue)
          ps.Bind(2, rs.IdxField(3).StringValue)
          ps.Bind(3, rs.IdxField(4).StringValue)
          ps.Bind(4, rs.IdxField(5).StringValue)
          ps.Bind(5, rs.IdxField(6).StringValue)
          ps.Bind(6, rs.IdxField(7).StringValue)
          ps.Bind(7, rs.IdxField(8).IntegerValue)
          ps.Bind(8, rs.IdxField(9).IntegerValue)
          ps.Bind(9, rs.IdxField(10).IntegerValue)
          ps.Bind(10, rs.IdxField(11).IntegerValue)
          ps.Bind(11, rs.IdxField(12).IntegerValue)
          ps.SQLExecute ' one round trip per row -- this is the slow part
          rs.MoveNext
        Wend
        logging("TimeCheck : After PostgreSQLPreparedStatement")

5000 individual inserts and commits WILL take a while…

  • Try to NOT commit until you are done
  • Use a TRANSACTION to encompass the entire thing

But as it is, you are executing 10,000 database operations (an insert plus a commit for each of the 5,000 rows).
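The TRANSACTION suggestion above can be sketched like this against the code you posted (a minimal sketch; `BEGIN TRANSACTION` via SQLExecute plus the Database `Commit`/`Rollback` methods are the usual Xojo pattern, but check the error handling against your Xojo version):

```xojo
' One transaction around all 5,000 inserts: PostgreSQL commits once
' instead of once per row.
mPostgreSQLDB.SQLExecute("BEGIN TRANSACTION")

Dim ps As PostgreSQLPreparedStatement
ps = mPostgreSQLDB.Prepare("INSERT INTO INTERNAL_REP VALUES($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12)")
While Not rs.EOF
  ' ... the ps.Bind(...) calls exactly as in your code above ...
  ps.SQLExecute
  rs.MoveNext
Wend

If mPostgreSQLDB.Error Then
  mPostgreSQLDB.Rollback ' undo the whole batch on failure
Else
  mPostgreSQLDB.Commit   ' a single commit for all rows
End If
```

The inserts still travel to the server one by one, but the per-row commit overhead disappears.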

By deferring the commit until the end, I reduced the time by 2 seconds. That is good.

Could you point me to any material or comments about the TRANSACTION you mentioned?

The long time is caused by calling SQLExecute many times; you must reduce that.
I had quite good success concatenating the many strings that go into SQLExecute into one single string passed to a single SQLExecute call.
That way you reduce the time needed to send the data to the server.

instead of

sqlexecute "insert into table 1 values ()"
sqlexecute "insert into table 2 values ()"
sqlexecute "insert into table n values ()"

change it to :

string(1) =  "insert into table 1 values ()"
string(2) =  "insert into table 2 values ()"
string(n) =  "insert into table n values ()"
string = string(1)+string(2)+...+string(n)
sqlexecute (string)

I used this to import 30,000 rows of data into a remote PostgreSQL server in 5 minutes instead of 3 hours.
I use n = 1000 in the example above; beyond that, Xojo gets very slow at concatenating strings.
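That batching idea could look like this in Xojo (a sketch only: the column choices are illustrative and show just two of the twelve columns; note that statements combined into one SQLExecute call must be separated by semicolons, and in real code the string values would need proper SQL escaping):

```xojo
' Build one big SQL string of up to 1000 INSERT statements,
' then send it to the server in a single round trip.
Const kBatchSize = 1000
Dim sql As String
Dim count As Integer = 0

While Not rs.EOF
  ' Statements in one SQLExecute call are separated by semicolons.
  sql = sql + "INSERT INTO INTERNAL_REP VALUES ('" _
      + rs.IdxField(1).StringValue + "', " _
      + Str(rs.IdxField(8).IntegerValue) + ");"
  count = count + 1
  If count = kBatchSize Then
    mPostgreSQLDB.SQLExecute(sql) ' one round trip per 1000 rows
    sql = ""
    count = 0
  End If
  rs.MoveNext
Wend

If sql <> "" Then mPostgreSQLDB.SQLExecute(sql) ' flush the remainder
```

Capping the batch at 1000 keeps the string concatenation cheap while still cutting the network round trips by three orders of magnitude.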

BTW, you can connect to Oracle from PG directly: http://blog.dbi-services.com/connecting-your-postgresql-instance-to-an-oracle-database/
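That link describes oracle_fdw, PostgreSQL's foreign data wrapper for Oracle. Roughly, the setup looks like this once the extension is installed (host, service name, credentials, and table names below are all placeholders, not values from this thread):

```sql
-- Enable the foreign data wrapper and point it at the Oracle server.
CREATE EXTENSION oracle_fdw;

CREATE SERVER oracle_srv FOREIGN DATA WRAPPER oracle_fdw
  OPTIONS (dbserver '//oracle-host:1521/ORCL');

CREATE USER MAPPING FOR CURRENT_USER SERVER oracle_srv
  OPTIONS (user 'scott', password 'tiger');

-- Expose the Oracle table locally, then copy it in one statement.
CREATE FOREIGN TABLE oracle_rep (code text, qty integer)
  SERVER oracle_srv OPTIONS (schema 'SCOTT', table 'REP');

INSERT INTO INTERNAL_REP SELECT * FROM oracle_rep;
```

With that in place the whole transfer happens server-side in a single INSERT ... SELECT, with no per-row traffic through the client at all.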

Thank you so much for the comments.
With your advice, I am coming up with another idea to reduce the time.

Your comments are really helpful.

Thanks again.