Really? In my experience, updating a progress bar will dramatically slow your code.
Maybe I should send you my test project so you can adapt it and we can compare apples to apples.
I would be interested in seeing your test. I only added the progress bar after the app seemed to freeze for an extended period. In other tests the progress bar added about one second for 2,095 records.
@Neil B How difficult is it to convert from SQLite to PostgreSQL?
I don't use any foreign keys, joins or unions. I do use complex select statements and a lot of prepared statements. The latter would need changing I assume.
No, in that case it should just be a matter of choosing the correct data types to match the ones you are using. If you are on a Mac you can even use Postgres.app (https://postgresapp.com) rather than installing it.
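For reference, here is a rough sketch of how common SQLite column types might map to PostgreSQL equivalents. The table and column names are made up for illustration, and these are common choices rather than the only valid ones:

```sql
-- Hypothetical SQLite table:
-- CREATE TABLE items (id INTEGER PRIMARY KEY AUTOINCREMENT,
--                     name TEXT, price REAL, data BLOB);

-- One reasonable PostgreSQL equivalent:
CREATE TABLE items (
    id    bigserial PRIMARY KEY,  -- replaces INTEGER PRIMARY KEY AUTOINCREMENT
    name  text,                   -- TEXT maps directly to text
    price double precision,       -- SQLite REAL is an 8-byte float
    data  bytea                   -- BLOB becomes bytea
);
```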
I've posted my project here:
You just have to fill in the appropriate credentials for your DB servers in the appropriate places.
Kem Tekinay: 100 ms for Valentina SQLite Server. Very impressive, but expected.
So is 100 ms better than Postgres?
I see that nobody has tried VDB yet.
I also see that so far only inserts have been tested. Why?
Inserts should be similar for most databases, because the HDD plays the main role when saving N MB.
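Since bulk inserts are largely disk-bound, a common way to keep an insert benchmark fair across engines is to wrap all the inserts in a single transaction; otherwise most databases sync to disk once per statement. A minimal sketch (the table name `test` is made up):

```sql
-- One transaction around the whole batch avoids one disk sync per row.
-- This pattern works in SQLite, PostgreSQL, and most other engines.
BEGIN;
INSERT INTO test (name) VALUES ('row 1');
INSERT INTO test (name) VALUES ('row 2');
-- ... remaining rows ...
COMMIT;
```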
On my laptop postgres has shown the worst performance, but I also suspect there is something about my installation that's making it slower generally.
I tried Valentina yesterday but couldn't get it going, then ran out of time.
Here is a summary of the conversion steps required. Not simple for a project of my size: 45 tables, about 8 hours start to finish.
Be very careful with the data types: SQLite is very lax about them, and Postgres is very strict.
A query can run (and give good results) in SQLite, and completely fail in Postgres.
Also, write a method to convert one DB to the other; don't do it by hand.
You will likely do it more than once (and on more than one database).
I did create a conversion program first thing. As you said, data types were the biggest hurdle. SQLite would even let you enter 1.5 into an integer field.
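SQLite's type affinity makes this easy to demonstrate (the table name here is made up):

```sql
-- SQLite: column types are affinities, not constraints.
CREATE TABLE t (n INTEGER);
INSERT INTO t VALUES (1.5);      -- accepted; stored as REAL, since 1.5
                                 -- cannot be losslessly converted to int
INSERT INTO t VALUES ('hello');  -- even this is accepted by SQLite
-- SELECT typeof(n) FROM t;      -- returns 'real' and 'text'

-- PostgreSQL, with the same column declared as "n integer", checks
-- strictly: INSERT INTO t VALUES ('hello'); fails with
-- "invalid input syntax for type integer".
```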
The obstacle I'm hung up on at the moment is AUTOINCREMENT. I converted using serial as a replacement. The problem is that if an ID is specified when inserting, as in "INSERT INTO table(ID, Test) VALUES(MAX(ID)+1, 'test')", the serial's sequence isn't incremented. So the next time an insert is done, the generated ID is less than the max ID. I tried using "ID INTEGER DEFAULT MAX(ID + 1)" but got an error saying an aggregate can't be used for a default value.
I thought about adding a trigger to increase the serial's next value whenever a record is inserted, but I'm not sure how.
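If you do have to insert explicit IDs (during the conversion, for instance), the usual PostgreSQL fix is to resynchronize the sequence afterwards with setval() rather than writing a trigger. A sketch, assuming a hypothetical table named `mytable` with a serial column `id`:

```sql
-- After bulk-inserting rows with explicit IDs, bump the sequence that
-- backs the serial column so the next generated ID doesn't collide
-- with an existing row. COALESCE handles the empty-table case.
SELECT setval(pg_get_serial_sequence('mytable', 'id'),
              COALESCE(MAX(id), 1))
FROM mytable;
```

After this, leaving the ID out of an INSERT gives you max + 1 automatically, which is what AUTOINCREMENT was doing for you in SQLite.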
Why do you want to write a value into 'ID' (probably the primary key)? Don't write anything to it and everything will be OK. (That's if your data type is bigserial; serial should also work, but I have never used it for that purpose.)
Or do I misunderstand the intention?