Hardware decoding increases CPU usage

This forum is about all development around libVLC.
chrismin13
New Cone
Posts: 2
Joined: 06 Mar 2020 18:27

Hardware decoding increases CPU usage

Postby chrismin13 » 06 Mar 2020 18:54

Hello,

I'm trying to implement hardware decoding in OBS, which has a VLC video source plugin. To do so, I have added the following line of code:

Code: Select all

libvlc_media_add_option_(new_media, ":avcodec-hw=any"); /* trailing underscore: OBS loads libVLC at runtime and binds symbols to _-suffixed pointers */
This does end up decoding the video on the GPU: Task Manager shows Video Decode usage going up (I'm on an i5-6300U with Intel HD 520 graphics), whereas in stock OBS the Video Decode engine isn't used at all. However, the CPU usage is more than double that of stock OBS. Furthermore, I repeatedly get the following error from VLC on every frame:

Code: Select all

main blend error: blending YUVA to DX11 failed
main blend debug: looking for video blending module matching "any": 1 candidates
blend blend error: no matching alpha blending routine (chroma: YUVA -> DX11)
main blend debug: no video blending modules matched
Here's a link to the full output of the log: https://hastebin.com/ukonapevof.txt
Furthermore, here's the code for the OBS VLC Plugin: https://github.com/chrismin13/obs-studio/tree/master/plugins/vlc-video

Thank you in advance for any help; I'm completely lost at this point. Everything I've tried gives one of the same two results (either software decoding, or that error message). Any ideas would be very much appreciated!
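Edit: for reference, here are the other values I believe :avcodec-hw accepts on Windows in VLC 3.x (based on my reading of the command-line help; I haven't tried them all):

```
:avcodec-hw=any      let libavcodec pick any available hwaccel
:avcodec-hw=d3d11va  Direct3D11 Video Acceleration
:avcodec-hw=dxva2    DirectX Video Acceleration 2.0
:avcodec-hw=none     disable hardware decoding (software only)
```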

Rémi Denis-Courmont
Developer
Posts: 15267
Joined: 07 Jun 2004 16:01
VLC version: master
Operating System: Linux
Contact:

Re: Hardware decoding increases CPU usage

Postby Rémi Denis-Courmont » 07 Mar 2020 01:38

If you need to get the decoded video back to CPU for processing, then hardware decoding is typically slower than software decoding.

It is highly recommended NOT to force parameters in LibVLC.
Rémi Denis-Courmont
https://www.remlab.net/
Private messages soliciting support will be systematically discarded

chrismin13
New Cone
Posts: 2
Joined: 06 Mar 2020 18:27

Re: Hardware decoding increases CPU usage

Postby chrismin13 » 10 Mar 2020 13:15

Hmm, alright, I'll check that the CPU isn't involved in any processing. However, OBS tries to do as much as possible on the GPU, so I don't think that's the issue. Looking at Visual Studio's performance profiler, the CPU usage is concentrated in the libVLC DLL.

Furthermore, the conversion taking place here is to DirectX 11, which involves the GPU more than the CPU. It seems to me that the conversion is not happening correctly in VLC, so it falls back to software rendering, although I may be wrong.

Thanks anyway for your help, I really appreciate it. Looking forward to your reply.

