Plugin, library or helper app?

Hope you all had a good Christmas (if you celebrate).

I have written a chunk of Swift code that takes an audio file input, processes the audio and then saves out a new file. It uses AVAudioEngine with a few effect nodes. Obviously not something I can do within Xojo, so I’ve had to resort to Swift.
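
For context, the engine graph is along these lines (a stripped-down sketch; the single reverb and the file path are placeholders for my actual effect chain and input):

    import AVFoundation

    // Stripped-down graph: player -> reverb -> main mixer.
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let reverb = AVAudioUnitReverb()

    let inputFile = try AVAudioFile(forReading: URL(fileURLWithPath: "/path/to/input.caf"))
    let format = inputFile.processingFormat

    engine.attach(player)
    engine.attach(reverb)
    engine.connect(player, to: reverb, format: format)
    engine.connect(reverb, to: engine.mainMixerNode, format: format)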

I need to call this Swift code from within a Xojo app, but am unsure of the best way to do it. My thinking is:

  1. Create a Xojo plugin. Can Swift be used for Xojo plugins, or does it have to be Objective-C?

  2. Compile it as a library of some sort. No idea how to do this or how to call it from a Xojo app!

  3. Create a windowless helper app that I can run from my Xojo app. If I do this, should I compile it as a command-line tool, and can it be a Universal Binary so I can include both x86_64 and Apple Silicon (M1) in the same app? That would be handy. The helper wouldn’t need to communicate back to the main app; it just carries out its work and quits, so no complex communication is required, just run and forget. (A sketch of the helper’s entry point follows this list.)
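
To make option 3 concrete, the helper’s entry point would be something trivial like the following (hypothetical argument handling; the real work is the offline render):

    import Foundation

    // Hypothetical argument handling: argv[1] = input path, argv[2] = output path.
    let args = CommandLine.arguments
    guard args.count == 3 else {
        FileHandle.standardError.write(Data("usage: AudioHelper <input> <output>\n".utf8))
        exit(EXIT_FAILURE)
    }

    // ... perform the offline AVAudioEngine render from args[1] to args[2] ...
    exit(EXIT_SUCCESS)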

If I use a helper app, what mountain of issues can I expect if my Xojo app is sandboxed or hardened / notarised? I assume I can’t just use FolderItem.Open to execute the helper if the app is sandboxed, and will instead have to use some magic to pass on permissions?

I’ve called a Unix CLI from a sandboxed Xojo app before and remember it being a nightmare (and it seems not to work on M1 Macs). Will my helper app inherit permissions from the main Xojo app in terms of file access, container folders etc.? I’d need it to have the same working folder. I use Sam’s brilliant AppWrapper, so hopefully any helper app I include in the bundle would be signed and sandboxed properly.

If the worst comes to the worst, I can always copy the file that needs processing into the app’s data folder, so provided the helper app inherits the permissions for my app’s container folders, all should be well.

Any info or pointers gratefully received before I go down the wrong path. Unless it’s super easy to create a plugin, I’m leaning towards option 3 - that’s if I can work out how to run apps from within a sandboxed Xojo app.

Have you seen https://www.monkeybreadsoftware.de/xojo/plugin-avfoundation.shtml from MBS?


I do have the full MBS plugins and did look at this, although I’m not sure it’ll do what I need it to do.

I’m running the AVAudioEngine in offline mode and using the manual rendering mode to render to a buffer that I can output as a file. The MBS plugins may be able to do it, but I haven’t been able to work out how.

Now that I’ve worked out how to do it in Swift, I may look again at the MBS plugins; at least I know what I’m looking for now. Most of the bits I need seem to be there.

The classes should mostly map 1:1 to what you see in Objective-C or Swift.
For exception handling, memory management and threading, we do extra steps.

If something is missing, you can email us (next week).

Hi Christian,

Thanks for that. I don’t think MBS will currently do what I need.

I’m setting up an AVAudioPCMBuffer, calling enableManualRenderingMode on the AVAudioEngine, and writing out the buffer in a loop, so it exports faster than real time.

I notice in the MBS docs it says “Manual rendering is currently not supported in MBS Plugins”, so I don’t think it will work unfortunately.

What functions do you call?

Even if the documentation says it doesn’t work, that note may be outdated; we may have added the functionality later without removing the sentence.

Is that using manualRenderingBlock?
Such a rendering block is very difficult to implement in Xojo, as no memory allocation, mutex locking or dispatching is allowed there.

Hi Christian,

Here’s some (very incomplete) Swift code. I can put the whole script up if you like, although I don’t know what Xojo’s take on putting a whole Swift script on the forum would be (even if it is to try to convert it to Xojo code).

I’m enabling manual rendering using:

    // The engine must be stopped before manual rendering is enabled.
    try engine.enableManualRenderingMode(.offline, format: format,
                                         maximumFrameCount: maxFrames)
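
(After enabling manual rendering, the script also schedules the source file and starts the engine; roughly, where player is the AVAudioPlayerNode and inputFile the source AVAudioFile:)

    player.scheduleFile(inputFile, at: nil)
    try engine.start()
    player.play()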

Setting up the buffer:

    let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                  frameCapacity: engine.manualRenderingMaximumFrameCount)!

Then render out in a loop:

    while engine.manualRenderingSampleTime < Int64(newLength) {
        
        do {
            let frameCount = Int64(newLength) - engine.manualRenderingSampleTime
            let framesToRender = min(AVAudioFrameCount(frameCount), buffer.frameCapacity)
            
            let status = try engine.renderOffline(framesToRender, to: buffer)
            
            switch status {
            
                case .success:
                    // The data rendered successfully. Write it to the output file.
                    try outputFile.write(from: buffer)

                .......
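
The remaining cases of that switch just handle the other render statuses; generically (not my exact code) they look like:

                case .insufficientDataFromInputNode:
                    // Only applies when rendering from an input node; not used here.
                    break

                case .cannotDoInCurrentContext:
                    // The engine could not render right now; try again on the next pass.
                    break

                case .error:
                    fatalError("Manual rendering failed.")

                @unknown default:
                    break
            }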

No rush to answer while you’re on holiday. You deserve some time off!

It really does depend on the specific API that you’re using. I was able to write a video editing framework in Xojo with declares a couple of years ago. There are a couple of things that don’t quite work right or at all (mainly from preemptive thread re-entry).

For things like this, I use a .plugin (not a Xojo plugin) which contains a class. Once the plugin is loaded, I interact with the class exactly as I would with a system object or a dynamically created custom object.
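
In Swift terms, the underlying Cocoa calls look like this (in Xojo I do the equivalent with declares; the bundle path, class and selector below are placeholders):

    import Foundation

    // Load a .plugin bundle and instantiate its principal class
    // (path, class and selector names are placeholders).
    if let bundle = Bundle(path: "/path/to/AudioProcessor.plugin"),
       bundle.load(),
       let principal = bundle.principalClass as? NSObject.Type {
        let worker = principal.init()
        // From here, message the instance dynamically, just like a system object.
        _ = worker.perform(Selector(("processFile:")), with: "/path/to/input.caf")
    }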

First up, you’d want to use NSTask for launching a contained helper; it passes as much “permission” to the helper as possible, which reduces complications.
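
In Swift, NSTask is called Process; a minimal launch of a bundled helper looks like this (the helper name and paths are placeholders, and from Xojo you’d reach the same API via declares or a plugin class):

    import Foundation

    // Launch a helper placed in Contents/MacOS of the app bundle
    // (the helper name and file paths are placeholders).
    let task = Process()
    task.executableURL = Bundle.main.url(forAuxiliaryExecutable: "AudioHelper")
    task.arguments = ["/path/to/input.caf", "/path/to/output.caf"]

    do {
        try task.run()
        task.waitUntilExit()
    } catch {
        print("Failed to launch helper: \(error)")
    }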

You used to be able to move and hard-link the file in, but this hasn’t worked consistently since 10.14.
The temporary folder is another shared location, but on some machines it can become “locked” due to a bug in Apple’s atomic saving code in 10.15.

Another way to do it is to use an App Group container, whereby both applications have access to a specific folder.
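
Resolving the shared container is a single call (the group identifier below is a placeholder; both apps must declare it in their entitlements):

    import Foundation

    // Both apps must declare this group in their entitlements
    // (the identifier is a placeholder).
    let groupID = "TEAMID.com.example.shared"
    if let container = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: groupID) {
        // The main app copies the audio here; the helper reads it and writes back.
        let workFile = container.appendingPathComponent("pending.caf")
        print(workFile.path)
    }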

Hi Sam,

Thanks for this, great help as always!

If I can’t get anything sorted with the MBS plugins, I’ll have a play around with NSTask. I’d forgotten about App Group containers. That may be the answer to common folder access. If I can copy the audio file to be processed into a common app group folder, then both apps should be able to work on it.

Obviously an MBS solution would be easier, but at least it gives me a direction to go in if that’s not possible.


I added the required functions and translated your example code to Xojo. Please give it a try soon.
We’ll include it in the next pre-release.

Hi Christian,

Thanks for sending this over; in initial testing it seems to work perfectly. It makes my life so much easier, so thanks for that.

I’ve also downloaded your plugin preview release to test AVFoundation on iOS, and that seems to work well too. I’m about to convert my app to iOS and am debating whether to use Xojo or Swift. Swift will take me some time to get used to as I’ll be starting from scratch, so Xojo will be easier. Your plugins are helping greatly with that decision.