Ping database on regular interval

I am finding that when users leave the application idle for a few minutes, the connection to the (Postgres) database is lost.

I do test the database connection, but on occasion some connection errors do slip by.

I would like to do a simple query every few minutes to keep the connection alive (if there is no activity).

What would be the best way to try to implement this strategy?

I put this in the app.Open event:

  Dim theTimer as New myTimer
  theTimer.Enabled = true
  theTimer.Mode = Timer.ModeMultiple

And this in the myTimer.action event:

  Dim r As RecordSet
  r = app.myDbase.SQLSelect("SELECT COUNT(*) FROM test_db")

Hoping it does the trick.

The timer needs to have a more global scope. If you dim it local to a method/event, it will be destroyed as soon as the method exits and will never get a chance to do anything.

I think you mean I should make the timer a global property of the application, right?

Tried making a property of the application called pingTimer.

Put this code in the app.Open event:

  app.pingTimer.Enabled = true
  app.pingTimer.Mode = Timer.ModeMultiple

but I am getting a Nil Object Exception on the first line. The debugger says that app.pingTimer is Nil.

Can you give me a little insight into how to initialize the timer and keep it persistent? Thanks.

Where did you instantiate the PingTimer? (i.e. somewhere you need PingTimer = New Timer)
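A minimal sketch of that, assuming App has a PingTimer As Timer property and a PingDatabase method (both names are placeholders), with AddHandler wiring up the Action event so no Timer subclass is needed:

```xojo
// In App.Open, after adding a "PingTimer As Timer" property to App:
PingTimer = New Timer
PingTimer.Period = 120000            // fire every 2 minutes
PingTimer.Mode = Timer.ModeMultiple  // keep firing, not just once
AddHandler PingTimer.Action, AddressOf PingDatabase

// PingDatabase is a method on App with this signature:
// Sub PingDatabase(sender As Timer)
//   ' the keep-alive query goes here
// End Sub
```

Because PingTimer is a property of App rather than a local variable, it survives after App.Open returns.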

BTW: Better use

  RecordSetObject = DatabaseObject.SQLSelect("SELECT 1")
  If RecordSetObject = Nil Then BooleanObject = ConnectToDatabase

in your timer, because this reduces the load you put on the database. (Postgres has no DUAL table, so a bare SELECT 1 does the job.)

ConnectToDatabase would be a method which reconnects to the server :wink:
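ConnectToDatabase might look something like this (a sketch; the host, port, database name, and credentials are placeholders for your own setup):

```xojo
// Reconnects App.myDbase; returns True on success.
Function ConnectToDatabase() As Boolean
  app.myDbase = New PostgreSQLDatabase
  app.myDbase.Host = "127.0.0.1"      // local end of the ssh tunnel
  app.myDbase.Port = 5432
  app.myDbase.DatabaseName = "mydb"
  app.myDbase.UserName = "me"
  app.myDbase.Password = "secret"
  Return app.myDbase.Connect
End Function
```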

  Dim r As RecordSet
  r = app.myDbase.SQLSelect("SELECT COUNT(*) FROM test_db")

test_db is an empty table. I thought that would be a minimal load on the database.

I can also put a method to reconnect to the database, but for some reason if the ssh tunnel closes, the connect script has difficulty reconnecting even if I re-establish the ssh connection. This timer is to hopefully keep the ssh tunnel alive.
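If the real problem is the ssh tunnel timing out, ssh itself can send keep-alives, which avoids touching the database at all. For example (client-side; the interval, port, and host are illustrative):

```shell
# Send a keep-alive probe through the tunnel every 60 s,
# and give up after 3 unanswered probes.
ssh -o ServerAliveInterval=60 -o ServerAliveCountMax=3 \
    -N -L 5432:localhost:5432 user@dbhost
```

ServerAliveInterval/ServerAliveCountMax can also go in ~/.ssh/config so every tunnel gets them automatically.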

[quote]I am finding that when users leave the application idle for a few minutes, the connection to the (Postgres) database is lost.[/quote]

I would actually bet that your problem is in some router between the client and
the server. In particular, routers that do NAT address mapping
typically have a timeout after which they will forget the mapping for an
idle connection. If you’ve got one of those, see if it’ll let you
change the timeout.

Just out of curiosity.

Why are you leaving database connections open rather than closing them when you are finished?

To quote from this forum post: link text

[quote]Connecting & disconnecting would work but is a major slow down. The solution that is easiest to implement would be a “heart beat”, I think. Perhaps when there is no application activity for a while, the application would just send a generic query to the server once every couple of minutes. The other idea would be to subclass the database as suggested, but I am not sure how much of a pain that would be to implement.[/quote]

That’s bad practice really; you’re limiting the scalability of your application through these needless calls, for various reasons.

Plus, to a database admin this style of “heart beat” programming is a nightmare. I’ve slapped enough devs who’ve tried this on my databases in the past. I’ve lost count of the number of times I’ve had to waste a lot of time tracking down these pointless connections and killing them, and if you’ve got God knows how many connections to trawl through… (sod’s law it’s an emergency, 2am, and the applications have just been left open overnight).

If you want to reuse existing connections you’d be better off looking at the database to manage this through “Connection Pooling”. Though I’ve not tried to implement this with Xojo connections.

I would suggest you open a new connection as late as possible and close it again as soon as you’ve finished your processing/querying.

The hit in Connecting/disconnecting is small really, “major slow down” is just not right. If your application is taking a long time to establish a connection then I’d suggest there’s another problem somewhere.

This is just not true, in my experience: talking to the server, detecting it’s not reachable, and re-establishing the connection just to query it again, instead of making sure the one (!) connection is always alive and simply talking to the server, makes a MAJOR difference when it comes to speed.

Point I’m trying to make is that the connection should not be left open via a heartbeat in the first place. Open it/Use it/Close It.
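The Open it/Use it/Close It pattern, sketched in Xojo (host, credentials, table, and column names are placeholders):

```xojo
// Open late, use, close early: one short-lived connection per unit of work.
Dim db As New PostgreSQLDatabase
db.Host = "127.0.0.1"
db.DatabaseName = "mydb"
db.UserName = "me"
db.Password = "secret"
If db.Connect Then
  Dim rs As RecordSet = db.SQLSelect("SELECT id, name FROM customers")
  While rs <> Nil And Not rs.EOF
    // consume the recordset completely before closing
    rs.MoveNext
  Wend
  db.Close  // give the server-side connection back immediately
End If
```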

In a single user environment (SQLite?) - fine - who cares. In a multi-user environment like PostgreSQL and the others this is not good practice. The more users your application handles the more problematic this approach becomes.

[quote=86972:@Patrick Delaney]Point I’m trying to make is that the connection should not be left open via a heartbeat in the first place. Open it/Use it/Close It.[/quote]
The DB admin who gets a call at 2am because some dev does this and the database backups are not running because some application is holding the DB hostage WILL be the next day’s sacrifice.

Been there - have the shrunken skulls to prove it :slight_smile:

As a test, I wrote a method to test the connection to the database (and reconnect, if needed) prior to each database query. I set the database to close when I closed a specific window (just to test it). Surprisingly, the time to reconnect was minimal. So, this strategy actually works quite well.

I could find each “END TRANSACTION” or db.commit statement and place a subsequent db.close statement following each one.

However, this does not address the problem when a window loads (let’s say when a listbox is populated). Unfortunately, the recordset needs to be processed before closing the database again. Is there some sort of generic strategy for this? I can only see going through each window, finding each recordset, and placing a db.close at the end of the method (tedious).

Would it make sense to place a timer in each window to close the db after a set period of time after the window is opened (not sure how long - say 1-2 minutes???). This would give the window plenty of time to load and then close the connection. If the window is not idle, it looks like this delays the timer - which wouldn’t be a problem, and may even make sense.

What would be perfect is if there was a way to tell when the window has finished loading and is idle. That would surely indicate a safe time to execute db.close.

What strategies have you used that worked best?

This is an unlikely scenario with Postgres and pgDump. You probably remember your DB2 days, Norman, right?

Persistent db connections aren’t all that bad.
For the record, one of my apps has been used with 300 concurrent persistent connections (might have to use connection pooling soon) for 13 years, and I have yet to run into an insurmountable problem. The basic approach to dealing with dropped connections is this: in a wrapper around SQLSelect/SQLExecute, check for a “connection dropped” error. Reconnect if it occurs. The user won’t even notice. But in the OP’s situation, where connections are dropped regularly, sending “keep alive” queries might be necessary. I remember testing pgpool with RB (at the time) and it worked just fine, so that’s what I would do if I had to do it all again.

[quote=87099:@Maximilian Tyrtania]This is an unlikely scenario with Postgres and pgDump. You probably remember your DB2 days, Norman, right?[/quote]
Sybase & Oracle :stuck_out_tongue:

If you were ever on a dev team with me I’d be looking for my Wet Fish! :slight_smile:

I can see the attraction: the connection is always there, just hit the database. No opening/closing the connection every time has to be quicker, right? Yep, you save yourself a very small hit, but there’s a trade-off at the database end. Every DBA I’ve ever met winces at the idea; it’s ugly and brings its own problems. These are your apps, do it if you like. I’m not here to argue.

Is it not possible for you to use tcp_keepalives on the PostgreSQL server?
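On the server side those are postgresql.conf settings; the values below are illustrative (0 means “use the OS default”):

```
# postgresql.conf
tcp_keepalives_idle = 60       # idle seconds before the first keep-alive probe
tcp_keepalives_interval = 10   # seconds between unanswered probes
tcp_keepalives_count = 5       # unanswered probes before the connection is considered dead
```

These keep idle TCP connections alive through NAT routers without the application sending any queries at all.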

Or are you using Windows? This might help you if you are using Windows.