Using NSTask with piped shared buffers

I was checking out Sam Rowland’s post on sandboxing and noticed he’d settled on NSTask for his needs.

One NSTask-related question I’ve been putting off for a while is handling piped shared buffers. Have you implemented pipes like these for communication (in my case, for receiving video from a shell binary)?

I want to integrate mplayer2, which allows for piped shared buffering to Cocoa apps, but while I can launch the player, I can’t work out how the pipes are supposed to communicate or how to display the incoming video afterwards.

This is currently done in Cocoa like this:
http://jongampark.wordpress.com/2010/03/05/how-to-control-external-console-program/
(the post is about controlling the Unix binary, which is useful as well, but it also explains how the piping is handled)

(a pre-built mplayer2 standalone app can be found here, for testing:
http://code.google.com/p/mplayerosx-builds/downloads/detail?name=mplayer2-standalone-2012-07-22.app.zip&can=2&q=
)

Normally mplayer2 would be launched with the parameter -vo corevideo for normal playback, or -vo shared_buffer:buffer_name=mybuff* for playback through a shared buffer suitable for piping.

http://www.mplayer2.org/docs/mplayer/
*(mybuff: the name of the shared buffer created with shm_open(), as well as the name of the NSConnection mplayer2 will try to open, per the docs)

I would like to know this too.

I’m afraid it’s beyond me; I only know that using NSTask is the solution for calling console apps from a sandboxed app, sorry.

Yeah, I didn’t reply in your threads because I was aware it was really out of scope there. I’m using them here as a segue into this specific bit, which would open up much better options for me than my current solutions.

Well, we have NSTaskMBS and NSPipeMBS in our plugins. You can use them if you like.

Thank you, Christian. I know you do. But I know this can be done with Declares, and I can’t learn how it’s done unless I see it. Plugins are OK, but they’re not useful for learning. This is a hobby project, so I can afford to take longer and learn how to do it myself.

Sure, you can look in macoslib to see if there is anything there. Just note that if you do it with declares, you have to do all the homework which we normally handle for you.

Christian,

Since I’m not having any luck with this and I’m not knowledgeable enough to do it on my own I’ll bite. How would I do this with your plug-in?

I think I understand how to set up the shared memory buffer, but how would the video then be piped onto a window? Theoretically it could be displayed as an OpenGL texture or with Core Video, but without knowing how to set up the shared memory section I can’t even begin to see what the output looks like :frowning:

Bumping this up since you may not have seen it, and also hoping all the new blood can come up with ideas :smiley:

By the way, in the meantime Don Melton (of Safari team fame) came up with a shell for MPlayer that, according to what I discussed with him, addresses in its code the shortcomings of accessing the mplayer binary programmatically (mplayer doesn’t provide a library):

https://github.com/donmelton/MPlayerShell

If I had the chops I’d turn this into a helper and/or plug-in so I could use mplayer within a window in my apps instead of having to launch the binary :frowning:

Well, I have the classes in the plugins, but I’m not sure how to do this correctly.
You can use our plugin to create a shared memory object with a size and name and pass that to mplayer.
Then you would have to know the format it uses to store samples.
You would also have to know when a new frame is ready. Then you can make a copy and decode it for display.

[quote=29974:@Christian Schmitz]Well, I have the classes in the plugins, but I’m not sure how to do this correctly.
You can use our plugin to create a shared memory object with a size and name and pass that to mplayer.
Then you would have to know the format it uses to store samples.
You would also have to know when a new frame is ready. Then you can make a copy and decode it for display.[/quote]

I knew it was too nice to be true :slight_smile:

The problem here, to be honest, is Mplayer’s idiosyncrasies more than NSTask. I discussed this with Don Melton and he urged me to avoid the pain of trying to use mplayer directly and to just use his wrapper until mplayer’s devs decide to do a proper library :slight_smile: