Using the GPU for decoding
Posted: 12 Jun 2014 21:12
Hi guys.
I am developing an application using libvlc and C#. The application opens multiple instances of the VLC player on multiple screens and plays media in them. Everything works well and the application performs great.
There is one problem, though, which I was expecting from the start: as the number of player instances grows, so does the CPU usage. I was hoping to enable hardware acceleration in VLC so the CPU wouldn't have to work so hard. From my understanding this option is available in VLC 2.1.3, which is the version I'm using. I tried --avcodec-hw=any (see the snippet below for how I pass it) and it didn't make any difference; the GPU usage from VLC stays at around 12.5% at most. (I have an AMD graphics card and monitored the GPU with AMD's own monitoring tool.)
I was hoping to use as much of the GPU as possible for video decoding.
Is there any way to achieve this? Is it possible to actually manage the GPU usage through libvlc?
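For reference, here is a simplified sketch of how I pass the option when creating the libvlc instance from C# (my real code goes through a wrapper; the P/Invoke declarations and the file path below are only illustrative placeholders):

```csharp
using System;
using System.Runtime.InteropServices;

class HwDecodeSketch
{
    // Minimal P/Invoke bindings to libvlc (simplified; real code should add error checks).
    [DllImport("libvlc")]
    static extern IntPtr libvlc_new(int argc, string[] argv);

    [DllImport("libvlc")]
    static extern IntPtr libvlc_media_new_path(IntPtr instance, string path);

    [DllImport("libvlc")]
    static extern IntPtr libvlc_media_player_new_from_media(IntPtr media);

    [DllImport("libvlc")]
    static extern int libvlc_media_player_play(IntPtr player);

    static void Main()
    {
        // Pass the hardware-decoding switch when the libvlc instance is created.
        string[] args = { "--avcodec-hw=any" };
        IntPtr vlc = libvlc_new(args.Length, args);

        // "C:\videos\sample.mp4" is only a placeholder path for this sketch.
        IntPtr media = libvlc_media_new_path(vlc, @"C:\videos\sample.mp4");
        IntPtr player = libvlc_media_player_new_from_media(media);
        libvlc_media_player_play(player);

        Console.ReadLine(); // keep the process alive while the media plays
    }
}
```

If it matters, I believe the same option can also be set per media item via libvlc_media_add_option with ":avcodec-hw=any", but I see no difference either way.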