in-memory database shared between processes

Normally I’d use a single application to do this, and that might still be my best choice; however, I thought I’d ask.

A helper app does some analysis of some files; once it’s complete, this analysis is to be displayed in the GUI. I want the GUI to be buttery smooth and allow the user to do some things while the analysis is ongoing in the background.

I want to use an in-memory database because the analysis results are session specific, and I don’t want to hit the disk any more than I have to with the analysis, but if I separate this app into two…

I have shared memory available, so perhaps there’s something I can do with that and the in-memory database. Ideally I’d like to avoid serialization, as it could come with a serious performance penalty. The number of files it analyzes is typically < 100, but in some cases it could be > 1000, creating multiple records across tables for each file.

Any ideas? Is there something I’m missing?

Could you use an SQLite database in multi-user mode?

Thanks for the suggestion; I don’t know. Looking at the docs, I don’t think so. It appears that in-memory databases are process specific.

I believe that because of this, there is no way for APP#2 to even get a handle on the in-memory database instantiated by APP#1.
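For anyone finding this later, the SQLite docs bear this out: the shared-cache URI lets two connections share one in-memory database, but only within a single process, because the pages live in that process’s heap; there is no URI a second process could use to reach them. A minimal sketch with the SQLite C API (the database name “analysis” is arbitrary):

[code]
/* Two connections in the SAME process sharing one in-memory database
   via a shared-cache URI. A second process has no way to open it. */
#include <sqlite3.h>
#include <stdio.h>

int main(void) {
    sqlite3 *a = NULL, *b = NULL;
    int flags = SQLITE_OPEN_READWRITE | SQLITE_OPEN_CREATE | SQLITE_OPEN_URI;

    /* Both connections name the same in-memory database. */
    sqlite3_open_v2("file:analysis?mode=memory&cache=shared", &a, flags, NULL);
    sqlite3_open_v2("file:analysis?mode=memory&cache=shared", &b, flags, NULL);

    sqlite3_exec(a, "CREATE TABLE results(id INTEGER, msg TEXT);"
                    "INSERT INTO results VALUES(1, 'ok');", NULL, NULL, NULL);

    /* The second connection sees the first connection's data. */
    sqlite3_stmt *st = NULL;
    sqlite3_prepare_v2(b, "SELECT msg FROM results", -1, &st, NULL);
    if (sqlite3_step(st) == SQLITE_ROW)
        printf("%s\n", (const char *)sqlite3_column_text(st, 0)); /* "ok" */
    sqlite3_finalize(st);
    sqlite3_close(b);
    sqlite3_close(a);
    return 0;
}
[/code]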

Does the data ever end up in a physical on-disk database?
If so, perhaps APP#1 and APP#2 could do their thing, and then sync back to a central version?

It needs to be a database on disk to share it.
You can still have a cache big enough to hold all data in memory.

[quote=443536:@Christian Schmitz]It needs to be a database on disk to share it.
You can still have a cache big enough to hold all data in memory.[/quote]
Thanks, but I’m trying to avoid any disk transactions if I can.

Impossible, I assume.

Well, an in-memory database is a database with only cache and no file. But without the file, there is no way the data leaves the process memory.

Okay, you can create an IPC Socket and build your own interface.

An IPC socket is also interaction via the disk, isn’t it?

@Joost Rongen: yep, you are correct. I think that a database in multi-user mode will avoid all the pickling and unpickling.

You can’t share one between processes without it being on disk.

Now, processes CAN share memory.
I know MBS has plugins to enable the use of the various shared memory options on macOS.

See shm_open, which is available on macOS.
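For concreteness, here is a minimal sketch of the writer side in C (the region name and size are made up; a reader process would call shm_open with the same name and mmap the same region):

[code]
/* POSIX shared memory on macOS: create a named region, map it, write. */
#include <fcntl.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void) {
    /* "/analysis" is an arbitrary name; the reader uses the same one. */
    int fd = shm_open("/analysis", O_CREAT | O_RDWR, 0600);
    if (fd < 0) return 1;
    if (ftruncate(fd, 4096) != 0) return 1;    /* size the region */

    char *p = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (p == MAP_FAILED) return 1;

    strcpy(p, "hello from the helper");        /* lives in RAM, not on disk */

    munmap(p, 4096);
    close(fd);
    /* shm_unlink("/analysis") once the session is over. */
    return 0;
}
[/code]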

[quote=443548:@Norman Palardy]You can’t share one between processes without it being on disk.

Now, processes CAN share memory.
I know MBS has plugins to enable the use of the various shared memory options on macOS.[/quote]

A cross-platform API within Xojo to enable the use of shared memory would be REALLY useful for helper app architecture… I bet there is already a feedback request…

But I suspect it’s something they will never do, both because of a lack of resources and a worry that we will shoot ourselves in the foot with it…

  • Karen

You could use a RAM disk: https://www.techjunkie.com/how-to-create-a-4gbs-ram-disk-in-mac-os-x/

Yes, MBS Plugins come with shared memory functions in FileMappingMBS class.

https://www.monkeybreadsoftware.net/topic-filemappingandsharedmemory.shtml

Obviously I don’t know anything about your data or your processes, but I can’t think of many situations where I would invoke an entire database just to send data back and forth between two processes. If you’re doing the work in a helper process, why not just send the results back through stdio to the shell you use to start the helper process? Wrapping the data in some kind of packet that is easy to process is simple enough, or go with JSON. If the returned data set is too large, then make a two-way protocol so that you can ask the helper for just what you need to display at this moment, and not have it dump everything to you at once.

No matter how much work you do to encapsulate the data over that pipe, it’s going to be less work than trying to set up shared memory to pass stuff back and forth between separate database instances.
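To illustrate the packet idea (a sketch of the framing, not anyone’s actual code; the record content is invented): a 4-byte length prefix in front of each JSON record means the reader always knows where one record ends and the next begins, even when the pipe delivers partial buffers.

[code]
/* Length-prefixed packets over stdio: 4-byte big-endian length,
   then that many bytes of JSON payload. */
#include <arpa/inet.h>   /* htonl / ntohl */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Helper side: emit one framed record on stdout. */
void send_packet(const char *json) {
    uint32_t len = htonl((uint32_t)strlen(json));
    fwrite(&len, sizeof len, 1, stdout);
    fwrite(json, 1, strlen(json), stdout);
    fflush(stdout);   /* don't leave the record stuck in the buffer */
}

/* GUI side: read one framed record from the helper's stdout pipe.
   Returns the payload length, or -1 on EOF / short read / oversize. */
long read_packet(FILE *from_helper, char *buf, size_t cap) {
    uint32_t len;
    if (fread(&len, sizeof len, 1, from_helper) != 1) return -1;
    len = ntohl(len);
    if (len >= cap) return -1;
    if (fread(buf, 1, len, from_helper) != len) return -1;
    buf[len] = '\0';
    return (long)len;
}

int main(void) {
    send_packet("{\"file\":\"a.txt\",\"critical\":true}");
    return 0;
}
[/code]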

James:
YES mate, that’s it! Once the helper has done the hard work, it signals to the GUI “It’s ready” and in turn acts like a SQL server via stdio. The GUI then treats it as a SQLDatabase.

Just to clarify, my intention wasn’t to use SQLite for IPC, and I didn’t want separate database instances. I was hoping that if I couldn’t share an in-memory DB between processes, I could at the very least get the Ptr & size, then copy the data into shared memory and have the GUI create a new DB instance on that shared data (or copy it into its own memory and do it from there). I couldn’t find a way to do either (doesn’t mean it can’t be done).
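Worth noting for later readers: newer SQLite versions do expose exactly this “Ptr & size” operation at the C level, via sqlite3_serialize (3.23+) and sqlite3_deserialize (requires a build with the deserialize interface enabled), though it makes one contiguous copy of the data rather than sharing pages in place. A sketch, with the buffer handling around it assumed:

[code]
/* Snapshot an in-memory database to bytes and rebuild it elsewhere. */
#include <sqlite3.h>
#include <string.h>

/* Sender: copy the database image into a caller-supplied buffer
   (dst could be an mmap'd shared-memory region).
   Returns the image size, or -1 if the buffer is too small. */
long long snapshot_db(sqlite3 *db, unsigned char *dst, long long cap) {
    sqlite3_int64 size = 0;
    unsigned char *img = sqlite3_serialize(db, "main", &size, 0);
    if (img == NULL || size > cap) { sqlite3_free(img); return -1; }
    memcpy(dst, img, (size_t)size);
    sqlite3_free(img);
    return size;
}

/* Receiver: load the bytes into a fresh connection of its own. */
int restore_db(sqlite3 *db, const unsigned char *src, long long size) {
    unsigned char *copy = sqlite3_malloc64((sqlite3_uint64)size);
    if (copy == NULL) return SQLITE_NOMEM;
    memcpy(copy, src, (size_t)size);
    /* FREEONCLOSE: SQLite takes ownership of the sqlite3_malloc'd buffer. */
    return sqlite3_deserialize(db, "main", copy, size, size,
                               SQLITE_DESERIALIZE_FREEONCLOSE);
}
[/code]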

The GUI doesn’t need all the data at once, so that’s good. To start with, it will display a report giving only critical errors, but the intention is to allow the user to explore that data too, and hopefully to offer advice and such in the future.

This also pans out, as depending on the user’s choices, the helper might then use the data it’s collected to process these files. So by keeping it all there in the helper’s memory space, it has the fastest access to the data.

Now I gotta figure out how to create an interactive console app that can just sit there using zero CPU while it awaits instructions, and how to handle the GUI suddenly disappearing.
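One common answer to both halves of that problem, sketched in C (the command names are hypothetical): a blocking read on stdin parks the process in the kernel, so it uses no CPU while idle, and EOF on stdin doubles as the “GUI disappeared” signal, since the pipe closes when the parent process dies.

[code]
/* An idle console helper: block on stdin, dispatch commands, exit on EOF. */
#include <stdio.h>
#include <string.h>

int main(void) {
    char cmd[128];
    /* fgets blocks in the kernel until a line arrives: 0% CPU while idle. */
    while (fgets(cmd, sizeof cmd, stdin) != NULL) {
        cmd[strcspn(cmd, "\r\n")] = '\0';
        if (strcmp(cmd, "QUIT") == 0)
            break;                     /* GUI shut us down deliberately */
        /* ... dispatch other instructions here ... */
    }
    /* NULL from fgets = stdin closed = the GUI crashed or quit: clean up. */
    return 0;
}
[/code]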

Thank you all for your help and input, I really do appreciate this forum and the variety of ideas that are so freely shared.

You just described using shared memory.
http://pubs.opengroup.org/onlinepubs/009695399/functions/shmget.html
And MBS has it already wrapped up into a plugin
https://www.monkeybreadsoftware.net/topic-filemappingandsharedmemory.shtml

If I follow your intent right, what I did in that situation was use Aloe Express in the console app to sit and await instructions via a prescribed port. The GUI would first test if that port was already active (the console listens for a “ping” instruction), and if not, it would start the helper console app. It uses near 0% CPU while waiting. But when the console app gets an instruction for a long-running operation, it does its work and updates shared memory using the MBS plugins (as opposed to returning a payload via Aloe Express).

[quote=443614:@Norman Palardy]You just described using shared memory.
http://pubs.opengroup.org/onlinepubs/009695399/functions/shmget.html
And MBS has it already wrapped up into a plugin
Monkeybread Xojo plugin - Filemapping and Shared Memory[/quote]
Thanks Norman, yes, I was hoping that I might be able to utilize shared memory with an in-memory database. Initially I had hoped to create the in-memory database in shared memory, but after I couldn’t figure that out, I tried to research a simple copy of the bytes from process memory to shared memory and back again. Alas, I couldn’t figure that out either.

That’s good to know, thanks; at first I’ll experiment with using stdio, as in theory it should have less overhead and hopefully be simple enough… Famous last words!

How do you keep a console application alive without using any CPU?

I don’t have one of the processes running now to check its CPU; I’m sure it isn’t 0%, but I think it was low. I think Aloe is essentially sitting on server socket requests. I preferred the Aloe Express paradigm to IPC.
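That matches how blocking server sockets behave at the OS level: an idle accept() sleeps in the kernel until a connection arrives, so no sleep/poll loop is needed. A bare-bones C illustration (the port number is arbitrary and error handling is trimmed):

[code]
/* A listener that consumes essentially no CPU while idle:
   accept() blocks in the kernel until the GUI connects. */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void) {
    int srv = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr;
    memset(&addr, 0, sizeof addr);
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);  /* local GUI only */
    addr.sin_port = htons(8123);                    /* arbitrary port */
    bind(srv, (struct sockaddr *)&addr, sizeof addr);
    listen(srv, 4);

    for (;;) {
        int cli = accept(srv, NULL, NULL);  /* sleeps here; 0% CPU */
        if (cli < 0) break;
        char buf[256];
        ssize_t n = read(cli, buf, sizeof buf);     /* e.g. the "ping" */
        if (n > 0) write(cli, "pong\n", 5);
        close(cli);
    }
    close(srv);
    return 0;
}
[/code]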

Sorry for not being clear: do you use a loop which sleeps the process for a bit and then checks to see if an instruction has come in, or is it entirely event driven?

I’ve only ever created helpers that are fired to carry out a specific task and then exit.