FFmpeg output to canvas

Is there anybody with enough FFmpeg experience to know whether I can get FFmpeg's output to display in a Canvas?
(or, preferably, in a Picture object, so I can manipulate it)

I know FFplay exists, but as far as I can tell FFplay has its own UI. And I have no clue how to translate a "pipe" into a Picture object.
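For reference, piping raw frames out of FFmpeg is possible in principle: ask it for `rawvideo` output on stdout and read one frame's worth of bytes at a time. A minimal sketch in Python, assuming `ffmpeg` is on the PATH and that you already know the clip's dimensions (the file path is hypothetical):

```python
import subprocess

def frame_size(width: int, height: int) -> int:
    # rgb24 stores exactly 3 bytes per pixel
    return width * height * 3

def read_frames(path: str, width: int, height: int):
    """Yield raw rgb24 frames piped from ffmpeg's stdout."""
    cmd = [
        "ffmpeg",
        "-i", path,            # input file (e.g. an MXF clip)
        "-f", "rawvideo",      # no container, just raw pixel data
        "-pix_fmt", "rgb24",   # fixed, predictable pixel layout
        "-",                   # write to stdout
    ]
    size = frame_size(width, height)
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.DEVNULL)
    try:
        while True:
            chunk = proc.stdout.read(size)
            if len(chunk) < size:
                break          # end of stream
            yield chunk        # one complete frame of pixel bytes
    finally:
        proc.terminate()
```

Each yielded chunk is one uncompressed frame; turning it into a drawable picture is then a job for whatever image class the GUI toolkit provides.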

What about MoviePlayer?

MoviePlayer will not work. I need a player that handles almost anything, the way VLC does. I already use a number of FFmpeg functions, such as the EBU R128 loudness analyzer. Most importantly, I need to play back MXF video files with some specific broadcast codecs, and the MoviePlayer control can't do that. Both VLC and FFmpeg support the standards most professional broadcasters work with.
I could use the VLC libraries, but I'd prefer to stick with a single external app for handling audio and video.

FFplay, which is part of FFmpeg, does a lot of things, including playback. I can extract video data, but I have not yet managed to get at the real-time video image data.

In short, you cannot get real-time output from FFmpeg into a Canvas.
The only way to do this is with VLC. The MBS plugins have some good classes for using VLC in your project.

"In short…" as in not possible at all?
I even thought of just streaming it and having my own app pick the stream up to show in a Canvas. But I suspect I'd have to deal with delay and so on…
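For what it's worth, that streaming idea boils down to two ffmpeg invocations: one pushing the file as an MPEG-TS stream, one pulling it with buffering and probing trimmed to reduce startup delay. A hypothetical sketch building those command lines (the address, port, and file name are all assumptions, not anything from this thread):

```python
def sender_cmd(src: str, dest: str = "udp://127.0.0.1:1234") -> list:
    # Push the file as an MPEG-TS stream, paced at its native
    # frame rate (-re) so the receiver gets it in real time.
    return ["ffmpeg", "-re", "-i", src, "-f", "mpegts", dest]

def receiver_cmd(src: str = "udp://127.0.0.1:1234") -> list:
    # Pull the stream with input buffering and stream probing cut
    # down, which is the usual way to trim the delay complained
    # about above; output raw frames on stdout as before.
    return [
        "ffmpeg",
        "-fflags", "nobuffer",     # don't buffer input packets
        "-probesize", "32",        # probe as little as possible
        "-analyzeduration", "0",   # skip the stream-analysis delay
        "-i", src,
        "-f", "rawvideo", "-pix_fmt", "rgb24", "-",
    ]
```

Even with these flags there is always some latency from encoding and network transport, so the "delay and what not" concern is justified; this only shrinks it.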

I was reluctant to hitch my wagon to yet another external app. I am familiar with the MBS VLC plugin; I'd just rather stick with a single app the user has to install. But as far as I remember, I can embed the VLC libraries into my final build, and that helps a little.