Thread Safe Picture Cross-Fade?

I’m having trouble on Windows with threading a crossfade between two pictures. Naturally, Mac OS has no issues — but on Windows it seems that both Einhugur PictureEffects and BlendPicturesMBS are thread blocking: the next instruction in the method does not execute until the frame generation thread completes.

Adding App.YieldToNextThread at the boundary of the loop has no effect because the frame generation thread is the next item with context. If I force a context switch with Thread.Sleep I can get an instant response, but the fade animation is then choppy, as the frame generation calls are still blocking and now carry the 1 ms wait time on top.

HD video sized (1920 x 1080) frames generate just about as fast as they should display (around 40 ms per frame), so I can get away with this if I can get the thread to generate frames concurrently with the canvas that’s displaying them.

Does anyone have a thread safe way to generate cross fade frames? (one image on top of another at a certain opacity)

How are you using the crossfade? If it’s in a canvas, could you just use g.Transparency when drawing the second frame, or are you computing an image which is then to be inserted into video?

Edit: In fact, if you’re generating an image to be stuffed into something else, you can still use Graphics.Transparency. Reuse the image to avoid memory allocations.
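For what it’s worth, a rough sketch of that approach (the property names mFrame, mBackPic, mFrontPic and mFadeAmount are mine, not from any plugin — this just uses the built-in Graphics.Transparency, where 0 is opaque and 100 is fully transparent):

```xojo
' Crossfade into an offscreen Picture using Graphics.Transparency.
' Reuses mFrame between calls to avoid reallocating a full-HD picture.
Function RenderFadeFrame() As Picture
  If mFrame = Nil Then
    mFrame = New Picture(1920, 1080)  ' allocate once, reuse every frame
  End If
  Dim g As Graphics = mFrame.Graphics
  g.DrawPicture(mBackPic, 0, 0)
  g.Transparency = 100 - mFadeAmount  ' mFadeAmount: 0 to 100
  g.DrawPicture(mFrontPic, 0, 0)
  g.Transparency = 0                  ' reset for later drawing
  Return mFrame
End Function
```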

BlendPicturesMBS does not yield.
But isn’t it fast enough that you can do the yield yourself with App.YieldToNextThread before/after calling it?

I’ve created a canvas subclass that manages displaying frames at a desired framerate. Provided a collection of images at full HD video size, I can get it to display fast enough for a smooth animation. I will have to give g.Transparency a shot, as I jumped straight to plugins assuming performance would be an issue with Xojo. (You know what they say about assuming…)

[quote=422081:@Christian Schmitz]BlendPicturesMBS does not yield.
But isn’t it fast enough that you can do the yield yourself with App.YieldToNextThread before/after calling it?[/quote]
I had tried this before switching plugin sets; since the frame generation thread was the next thread to have context (not the main thread), it had no effect. I could force a context switch with Thread.Sleep, but since the function was blocking, the displaying animation ended up choppy.

@Tim Parnell, maybe this (rough) example can help a bit: with a lower Timer.Period setting you can vary the speed of the fade, and with the UpDown property in the Action event of the timer you can set the step size.
You can continuously resize the picture/window while fading in/out.
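Something along these lines (a rough sketch of that timer-driven approach; mOpacity, mStep, UpDown and Canvas1 are assumed properties/controls of the window, not part of the original project):

```xojo
' Timer Action event: step the opacity up or down each tick, then repaint.
Sub Action()
  If UpDown Then
    mOpacity = Min(mOpacity + mStep, 100.0)
  Else
    mOpacity = Max(mOpacity - mStep, 0.0)
  End If
  ' The Canvas Paint event draws the first picture, then the second
  ' with g.Transparency = 100 - mOpacity.
  Canvas1.Invalidate(False)
End Sub
```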

[quote=422061:@Tim Parnell]I’m having trouble with Windows and threading a crossfade between two pictures. Naturally, of course, Mac OS has no issues - but on Windows, it seems that both Einhugur PictureEffects and BlendPicturesMBS are thread blocking. The next instruction in the method is not executing until the frame generation thread completes.

Adding App.YieldToNextThread to the boundary of the loop has no effect because the frame generation thread is the next item with context. If I force a context switch with Thread.Sleep I can get an instant response, but the fade animation is then choppy as the frame generation calls are still blocking and now have the 1ms wait time.

For HD video sized (1920 x 1080) frames, they generate just about as fast as they should display (just around 40ms a frame) so I can get away with this if I can get the thread to generate frames concurrently with the canvas that’s displaying them.

Does anyone have a thread safe way to generate cross fade frames? (one image on top of another at a certain opacity)[/quote]

You can use PictureEffectsRaw in Async mode and not have it block the thread at all. It’s not supported for all the effects yet, but it happens to be supported for the BlendEffect.

Note that if you do go Async with PictureEffectsRaw, you will have two ways to go:

So you may want to study both ways, which you can find in the example projects; if not for this effect, then another effect will show the other way.

Thanks for all the help, Björn. I will explore these options when I get a chance.

I can only really help with macOS. For the best performance I would suggest using Metal, AVFoundation and Core Image. Using these technologies together I am able to read a 4k 24fps movie, process each frame and display it on screen at ~20 fps on a 2012 MacBook Pro running El Capitan. On a newer machine running High Sierra (with Metal 2) I would imagine smoother framerates.

It’s a lot of work, and macOS only (and it probably won’t work in two years, as Apple will break/remove something in the process).

I only found out last week that Apple is changing the Core Image kernel language, which means ten years’ worth of custom Core Image filters will have to be rewritten in the future to adopt the new API. Oh, and they’ll HAVE to be written in Xcode — I’ll no longer be able to create custom Core Image filters in Xojo :frowning:

Your assumption is not a bad one, but there have been changes in Xojo (and little tricks) that mean you can do image processing a LOT faster than before. I have a function in Xojo that reads the maximum and minimum values from an HDR image (32-bpc floating point) in something like 0.2 seconds — it used to take seconds. It’s still way slower than doing it on the graphics card, but since Apple broke the functions for doing this on the graphics card, I accepted it.

The GPU is the fastest way to process images (at least in my experience), and Core Image made that easy(ish)… But those days are falling behind rather quickly.