Using GPU on decoding

imanafa
New Cone
Posts: 5
Joined: 07 May 2014 18:13

Using GPU on decoding

Post by imanafa » 12 Jun 2014 21:12

Hi guys.

I am developing an application using libVLC and C#. The application opens multiple instances of the VLC player on multiple screens and plays media in them. Everything works well and the application performs great.

There is one problem, though, which I was expecting from the start. As the number of player instances grows, so does the CPU usage. I was hoping to enable hardware acceleration in VLC so the CPU wouldn't have to work so hard. From my understanding this option is available in VLC 2.1.3, which is the version I'm using. I tried --avcodec-hw=any and it didn't make any difference; the GPU capacity used by VLC stays at around 12.5% at most. (I have an AMD graphics card and monitored the GPU usage with AMD's monitoring application.)
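
For reference, the switch is passed when the libVLC instance is created. A minimal C# sketch (using the LibVLCSharp bindings, which are only one of several .NET wrappers, with a placeholder file name) looks roughly like this:

using LibVLCSharp.Shared;

// Minimal sketch: ask libVLC to use whatever hardware decoder is available
// (DXVA2 on Windows) by passing the switch when the instance is created.
Core.Initialize();
using var libvlc = new LibVLC("--avcodec-hw=any");

using var player = new MediaPlayer(libvlc);
using var media = new Media(libvlc, "video.mp4", FromType.FromPath);  // placeholder path
player.Play(media);

System.Console.ReadLine();  // keep the process alive while the video plays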

I was hoping to be able to use the maximum GPU capacity for video decoding.

Is there any way to achieve this? Is it possible to actually manage GPU usage through libVLC?

Rémi Denis-Courmont
Developer
Posts: 15272
Joined: 07 Jun 2004 16:01
VLC version: master
Operating System: Linux

Re: Using GPU on decoding

Post by Rémi Denis-Courmont » 12 Jun 2014 21:40

Most GPUs can only accelerate decoding of one video at a time (or two if capable of stereoscopic decoding). You definitely do not want to rely on the GPU to scale up to many simultaneous videos.
Rémi Denis-Courmont
https://www.remlab.net/
Private messages soliciting support will be systematically discarded

imanafa
New Cone
Posts: 5
Joined: 07 May 2014 18:13

Re: Using GPU on decoding

Post by imanafa » 16 Jun 2014 21:03

Rémi, thank you for your response.

When I use avcodec-hw=any, after opening the second or third player (depending on the video quality) the videos get laggy and pixelated, and if I open enough player instances in my application, usually four or more, my application crashes. (The CPU usage is much lower, as expected, but it seems DXVA is not able to handle multiple instances.)

The same thing happens with VLC itself: if I open multiple instances of the VLC player, it crashes. I tried using ffmpeg-hw; there were no lags, but still too much CPU usage, which leads me to believe it is not effective in my situation.

Is there a way I can use less of the CPU and more of the GPU?

Rémi Denis-Courmont
Developer
Posts: 15272
Joined: 07 Jun 2004 16:01
VLC version: master
Operating System: Linux

Re: Using GPU on decoding

Post by Rémi Denis-Courmont » 16 Jun 2014 22:02

You can use the GPU to offload decoding, but only for one or maybe two videos. If you want to decode many videos simultaneously, you need a powerful (in practice, multi-core) CPU. Period.
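
For illustration, one way to follow that advice (a sketch with the LibVLCSharp bindings, which postdate this thread, and assuming the avcodec-hw switch is honored as a per-media option) is to request hardware decoding for only the first item and force the rest to software:

using System.Collections.Generic;
using LibVLCSharp.Shared;

// Sketch: hardware-decode only the first item; software-decode the rest.
// File names are placeholders; ":avcodec-hw" is passed as a per-media option.
const int hwDecodedItems = 1;
var files = new[] { "a.mp4", "b.mp4", "c.mp4", "d.mp4" };

Core.Initialize();
using var libvlc = new LibVLC();
var players = new List<MediaPlayer>();

for (int i = 0; i < files.Length; i++)
{
    var media = new Media(libvlc, files[i], FromType.FromPath);
    // GPU (DXVA2) decoding for the first item only; everything else stays on the CPU.
    media.AddOption(i < hwDecodedItems ? ":avcodec-hw=any" : ":avcodec-hw=none");

    var player = new MediaPlayer(libvlc);
    player.Play(media);
    players.Add(player);
}

System.Console.ReadLine();  // keep the players alive while the videos play
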
Rémi Denis-Courmont
https://www.remlab.net/
Private messages soliciting support will be systematically discarded

