sdl_opengl_player on Windows: runtime error and no decoding acceleration
Posted: 10 May 2021 15:06
I've managed to compile sdl_opengl_player.cpp for Windows with Visual Studio 2019 and VLC 4.0. To use VLC's runtime infrastructure, I copied the resulting executable into the folder of the vlc-4.0.0-dev nightly snapshot (up to the one of May 10) and ran it from there.
With the original code's libvlc_video_engine_opengl passed to libvlc_video_set_output_callbacks, I get a runtime error:
Assertion failed: render_cfg.opengl_format == GL_RGBA, file /builds/videolan/vlc/extras/package/win32/../../../modules/video_output/vgl.c, line 96
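If I read vgl.c right, that assertion checks the pixel format that the host application's update_output callback reports back through libvlc_video_output_cfg_t. For reference, here is a minimal sketch of such a callback; the field and enum names are what I see in the nightly's libvlc_media_player.h, so take it as an approximation rather than the exact code of sdl_opengl_player.cpp:

```cpp
#include <vlc/vlc.h>
// GL_RGBA comes from the OpenGL headers (GL/gl.h, glew.h, or SDL_opengl.h).
#include <GL/gl.h>

// Called by libvlc when the video size/format changes: the host (re)allocates
// its render target at cfg->width x cfg->height and reports the GL format and
// colorspace it will render into. vgl.c apparently insists on GL_RGBA here,
// so reporting anything else (e.g. GL_RGB) would trip the assertion.
static bool update_output(void *opaque, const libvlc_video_render_cfg_t *cfg,
                          libvlc_video_output_cfg_t *render_cfg)
{
    (void)opaque;
    // ... resize the FBO/texture to cfg->width x cfg->height here ...
    render_cfg->opengl_format = GL_RGBA;
    render_cfg->full_range    = true;
    render_cfg->colorspace    = libvlc_video_colorspace_BT709;
    render_cfg->primaries     = libvlc_video_primaries_BT709;
    render_cfg->transfer      = libvlc_video_transfer_func_SRGB;
    return true;
}
```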
Instead, by changing libvlc_video_engine_opengl to libvlc_video_engine_disable (which, I guess, tells libvlc to choose the video output itself), an OpenGL rendering window opens and the video is rendered (audio is audible too). However, playback suffers from dropped video frames, and looking at the GPU decode graph in the system performance monitor I can see that no hardware decoder is being used. Finally, in the debug output I noticed this: gl gl error: Could not create interop (on both Intel UHD 620 and Nvidia 1070 graphics hardware).
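For clarity, the only thing I changed between the two runs is the engine argument of the libvlc_video_set_output_callbacks call, roughly as below; the callback names are placeholders for the ones defined in the example, and the argument order follows the 4.0-dev header I built against, so treat it as a sketch:

```cpp
// First run : libvlc_video_engine_opengl  -> assertion failure in vgl.c
// Second run: libvlc_video_engine_disable -> VLC picks its own output, video
//             plays but frames are dropped and no GPU decoding shows up.
libvlc_video_set_output_callbacks(mp,
    libvlc_video_engine_disable,            // was libvlc_video_engine_opengl
    setup, cleanup,
    nullptr,                                // set-window/resize callback (unused here)
    update_output, swap, make_current, get_proc_address,
    nullptr, nullptr,                       // frame metadata / plane selection
    app);                                   // opaque pointer handed back to the callbacks
```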
Thus, my questions are:
- is there a way to work around the runtime error above, so that the requested OpenGL video output can be used directly?
- how do I enable hardware (GPU) video decoding?
- creating the OpenGL interop would hopefully also improve the frame-to-texture transfer performance; how can I do that?
The reason for my interest in an OpenGL-based libvlc example is to take the frames of 360° videos and reproject them onto the walls of a CAVE. I have already written OpenGL reprojection code, which currently works in an ffmpeg-based prototype, but that prototype relies on pipe communication and suffers performance problems with videos above 4K, so I am looking for a solution that integrates decoding and rendering.