My application receives a query result from Oracle and uses a PreparedStatement to issue INSERT statements into PostgreSQL.
It works, but once the result exceeds 3,000 rows it slows down.
So I am considering saving the result to a local file and using PostgreSQL's direct-load feature to speed it up.
Database ResultSet: rs.
I know how to create a file and iterate the result set until EOF.
But I want to check whether there is a more efficient way to save the ResultSet to a file quickly.
Do you have any ideas?
Looping over rs, writing each record to a file, and then running a direct load into PostgreSQL looks better.
With 4,000 rows, PreparedStatement takes 8 seconds, but writing a file takes just 2 seconds.
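A minimal sketch of that loop, assuming a JDBC ResultSet on the Java side (the thread does not show the asker's actual code, so the class and method names here are illustrative). The quoting helper follows PostgreSQL's CSV rules: a field is wrapped in double quotes when it contains the delimiter, a quote character, or a newline, and embedded quotes are doubled.

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class ResultSetToCsv {

    // Same delimiter as the COPY example later in the thread.
    static final char DELIMITER = ';';

    // Quote one field for CSV output if it needs quoting.
    static String quote(String field) {
        if (field == null) return "";               // render SQL NULL as an empty field
        if (field.indexOf(DELIMITER) < 0 && field.indexOf('"') < 0
                && field.indexOf('\n') < 0) {
            return field;                           // safe to emit as-is
        }
        return "\"" + field.replace("\"", "\"\"") + "\"";
    }

    // Join one row's columns into a single CSV line.
    static String formatLine(String[] columns) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < columns.length; i++) {
            if (i > 0) sb.append(DELIMITER);
            sb.append(quote(columns[i]));
        }
        return sb.toString();
    }

    /*
    // With a live JDBC ResultSet rs, the dump loop itself would look like this
    // (commented out here because it needs a database connection):
    static void dump(java.sql.ResultSet rs, String path)
            throws java.sql.SQLException, IOException {
        int n = rs.getMetaData().getColumnCount();
        try (PrintWriter out = new PrintWriter(new FileWriter(path))) {
            while (rs.next()) {
                String[] row = new String[n];
                for (int i = 0; i < n; i++) row[i] = rs.getString(i + 1);
                out.println(formatLine(row));
            }
        }
    }
    */

    public static void main(String[] args) {
        // A field containing the delimiter gets quoted, plain fields do not.
        System.out.println(formatLine(new String[]{"1", "a;b", "plain"}));
    }
}
```

Fetching with getString and writing plain lines keeps the loop cheap; the expensive per-row network round trip moves into the single bulk load afterwards.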
Yes, the key is to reduce the traffic between client and server as much as possible,
so it is better to send one file containing 4,000 INSERTs than 4,000 requests with one INSERT each.
This is the code that I'm using with PostgreSQL.
Use TRUNCATE if you want to clear the table before the COPY.
[code]
Dim sql As String
sql = "TRUNCATE TABLE ""tblCommesse"""
rdb.SQLExecute(sql)

rdb.SQLExecute("COPY ""tblCommesse"" FROM '/Users/lto_slave/Library/Application Support/Postgres/var-9.6/file.csv' DELIMITER ';' ;")

If rdb.Error Then
  App.ElapsedTime = rdb.ErrorMessage
End If
[/code]
The file.csv must be in var-9.6 in order to be visible to PostgreSQL.
Can file.csv be on a remote server, or only in the local file system?
Remote or local doesn't matter, but it must be in the PostgreSQL directory.
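Worth noting: a plain `COPY ... FROM '<path>'` is executed by the server process, which is why the file has to sit where the server can read it. If the file lives on the client machine instead, psql's `\copy` meta-command streams it over the existing connection, so no server-side path is needed. A hedged sketch, reusing the table and delimiter from the example above (host and database names are placeholders):

```sql
-- Server-side load: the path is read by the PostgreSQL server process.
COPY "tblCommesse" FROM '/path/on/the/server/file.csv' WITH (DELIMITER ';');

-- Client-side load from psql: file.csv is read on the client machine
-- and streamed to the server over the connection.
-- psql -h dbhost -d mydb -c "\copy \"tblCommesse\" FROM 'file.csv' WITH (DELIMITER ';')"
```

Both routes go through the same bulk-load path on the server, so the speedup over row-by-row INSERTs is similar either way.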
With a COPY I can insert 66,000 records in 2-3 seconds.
Wonderful. I would like to test it soon.