I’m looking for some advice on best practices for a multi-user database setup.
Picture, if you will, a small server running (probably) MySQL, with a modest-sized database (10-15 tables, a few hundred rows in each).
Connecting to that database will be a number of client machines. The maximum would be 10, but typical usage would see 6 or 7 connected at any given time.
Each client will run the same software, which will let them view and change the database contents, typically via a series of listboxes, text fields and Save/Cancel buttons. Nothing out of the ordinary so far, then.
Now, when any given user saves changes to the database on the server, I need the most efficient way to push those changes out to all the other clients. The whole idea hinges on how quickly this can happen: as soon as a change is saved, the other clients should update to reflect it, effectively instantly.
My first thought is to set up a thread and timer on each client that simply refreshes the listboxes every second or two (a rough sketch of what I mean is below).
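To be concrete, here’s a minimal sketch of that polling idea in Python using mysql-connector-python. The table name (items), the connection details and the refresh callback are all placeholders for whatever the real client would do:

```python
# Minimal sketch of the client-side polling idea.
# "items" and the connection details are placeholders.
import threading
import time

import mysql.connector

POLL_INTERVAL = 2  # seconds between refreshes

def fetch_items(conn):
    """Re-run the query that backs the listbox."""
    cur = conn.cursor()
    cur.execute("SELECT id, name FROM items ORDER BY name")
    rows = cur.fetchall()
    cur.close()
    return rows

def poll_loop(refresh_listbox):
    conn = mysql.connector.connect(
        host="dbserver", user="app", password="secret",
        database="appdb", autocommit=True,  # so each poll sees fresh data
    )
    while True:
        refresh_listbox(fetch_items(conn))  # repopulate the UI list
        time.sleep(POLL_INTERVAL)

# Run the poller on a background thread so the UI stays responsive;
# print stands in for the real listbox-refresh callback.
threading.Thread(target=poll_loop, args=(print,), daemon=True).start()
```

(I realise I could cut the traffic down by first polling something cheap, like MAX(updated_at) on each table, and only reloading the lists when that value changes.)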
But I suspect the constant re-querying and network activity would create needless load, and could also hurt the responsiveness of the clients themselves.
Is there a way to set something running on the server that pushes changes to the clients only when a change is actually made? Or some other method of updating the clients in (essentially) real time? Something along the lines of the sketch below is what I’m imagining.
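Roughly what I have in mind, sketched in Python: a tiny notification service running alongside MySQL. After a client saves, it sends one line to this service, which relays it to every other connected client; on receipt, each client re-queries the affected table. The port and message format here are made up purely for illustration:

```python
# Sketch of a "push" notifier: relays one-line change messages
# (e.g. "changed items\n") from the saving client to all others.
import socket
import threading

clients = []
lock = threading.Lock()

def handle(conn):
    with lock:
        clients.append(conn)
    try:
        for line in conn.makefile():        # one message per line
            with lock:
                for other in clients:
                    if other is not conn:   # don't echo to the sender
                        other.sendall(line.encode())
    finally:
        with lock:
            clients.remove(conn)

server = socket.create_server(("0.0.0.0", 9000))
while True:
    conn, _ = server.accept()
    threading.Thread(target=handle, args=(conn,), daemon=True).start()
```

Each client would keep one socket open to this service and re-query the database whenever a message arrives, so there would be no polling at all.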
The clients would all be connected over standard Ethernet.
I’m open to the possibility of using a more fully featured database, and I’m assuming any solution would scale up to more tables as required. The number of concurrent users is never going to exceed 10.