
How To Use VLCKit for iOS to render video output into GLES texture

Posted: 14 Apr 2016 14:24
by Singletons
Hi everyone, I can't find a solution for rendering the video output from MobileVLCKit (iOS) into my own OpenGL ES texture. Can you please help me with this problem or give me some direction on how to solve it? Thanks for any help.

Re: How To Use VLCKit for iOS to render video output into GLES texture

Posted: 16 Apr 2016 08:59
by fkuehne
MobileVLCKit renders into an OpenGL context by itself, and the latest unstable code also supports multi-threaded OpenGL rendering correctly. However, there is no direct way to render to an OpenGL texture.

That said, libvlc, the low-level library behind VLCKit, supports rendering decoded frames to a memory buffer (which we call "vmem"); this mechanism is already used by the VLCMediaThumbnailer class. Maybe it could be adapted to hand you the raw buffers so you can render them yourself in your OpenGL context? Obviously, this involves a speed penalty and does not allow zero-copy rendering for hardware-decoded media.
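For illustration, here is a minimal sketch of wiring up the vmem callbacks through libvlc's plain C API, which MobileVLCKit wraps. The chroma, dimensions, file path and the texture-upload note are example assumptions for the sketch, not something VLCKit sets up for you:

#include <stdlib.h>
#include <unistd.h>
#include <vlc/vlc.h>

#define WIDTH  1280
#define HEIGHT 720

typedef struct {
    unsigned char *buffer;  /* CPU-side pixel buffer that libvlc fills */
} vmem_ctx;

/* lock: called before decoding a frame; hand libvlc a buffer to write into */
static void *vmem_lock(void *opaque, void **planes)
{
    vmem_ctx *ctx = opaque;
    planes[0] = ctx->buffer;
    return NULL; /* picture identifier, unused in this sketch */
}

/* unlock: called once the frame has been written to the buffer */
static void vmem_unlock(void *opaque, void *picture, void *const *planes)
{
    (void)opaque; (void)picture; (void)planes;
}

/* display: called when the frame should be shown; this is where you would
 * upload ctx->buffer into your own GL texture (e.g. via glTexSubImage2D) */
static void vmem_display(void *opaque, void *picture)
{
    (void)opaque; (void)picture;
}

int main(void)
{
    vmem_ctx ctx = { malloc(WIDTH * HEIGHT * 4) };

    libvlc_instance_t *vlc = libvlc_new(0, NULL);
    libvlc_media_t *media = libvlc_media_new_path(vlc, "video.mp4");
    libvlc_media_player_t *mp = libvlc_media_player_new_from_media(media);
    libvlc_media_release(media);

    /* request 32-bit RGB frames with a pitch of WIDTH * 4 bytes */
    libvlc_video_set_format(mp, "RV32", WIDTH, HEIGHT, WIDTH * 4);
    libvlc_video_set_callbacks(mp, vmem_lock, vmem_unlock, vmem_display, &ctx);

    libvlc_media_player_play(mp);
    sleep(10); /* let it play for a while; a real app runs its own loop */

    libvlc_media_player_release(mp);
    libvlc_release(vlc);
    free(ctx.buffer);
    return 0;
}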

Re: How To Use VLCKit for iOS to render video output into GLES texture

Posted: 07 Oct 2016 23:49
by dgoodine
Felix,

I just tried that approach and the framerate is really bad (3 fps with 2440x1400 frames), most likely due to shuttling the ~16 MB frames between CPU memory and the GPU. So the vmem-callback approach is not feasible, at least for large videos.
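For context, every decoded frame ends up being re-uploaded to the GPU, roughly like this sketch (GL ES 2.0; the function names and sizes are just illustrative):

#include <OpenGLES/ES2/gl.h>

/* one-time setup: allocate an RGBA texture the size of the video */
static GLuint create_video_texture(GLsizei width, GLsizei height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* non-power-of-two textures on ES 2.0 require clamp-to-edge */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    return tex;
}

/* per-frame: copy the entire CPU-side vmem buffer across to the GPU */
static void upload_frame(GLuint tex, GLsizei width, GLsizei height,
                         const void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

That per-frame copy is what eats the frame budget at this resolution.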

Do you have any plans to support rendering directly to an OpenGL GL_TEXTURE_2D anytime in the future?

-d