
GPU decoding with LibVLC on Windows

Posted: 08 Nov 2014 15:43
by gcdu45
Hi,

I'm trying to enable GPU decoding in LibVLC (from vlc-2.2.0-rc1-20141107-0204) for use in my program. After some research, I've seen on many topics that "avcodec-hw" is needed to enable GPU decoding, so I tried several things, such as:

- Pass the avcodec-hw parameter to LibVLC:

Code:

const char *const vlc_args[] = { "--avcodec-hw=any" };
inst = libvlc_new(sizeof(vlc_args) / sizeof(vlc_args[0]), vlc_args);
- Add the "avcodec-hw" option to the media:

Code:

libvlc_media_add_option(m, "avcodec-hw=any");
and even the old "ffmpeg-hw" parameter, but none of these options works. When I play a 1080p movie in my program, CPU usage stays around 20%, while GPU decoding works very well in VLC itself (~5-6% CPU usage with GPU decoding enabled; otherwise ~20%, like in my program).

I'm using VLC 2.2.0-rc1-20141107-0204 and the same LibVLC version on Windows 8.1. I have a Pentium B980 with Intel HD 2000 graphics (which supports GPU decoding), and my program uses Qt 5.2.1.

Could you help me solve this?

Re: GPU decoding with LibVLC on Windows

Posted: 10 Dec 2014 14:53
by Jean-Baptiste Kempf
"--avcodec-hw=dxva2" ?

Re: GPU decoding with LibVLC on Windows

Posted: 03 Jan 2015 19:04
by gcdu45
Thank you, it works perfectly ;)

Re: GPU decoding with LibVLC on Windows

Posted: 03 Jan 2015 20:33
by Jean-Baptiste Kempf
You're welcome :)

Re: GPU decoding with LibVLC on Windows

Posted: 14 Apr 2015 03:35
by jerrylipeng
Hi, where should I put "--avcodec-hw=dxva2": in libvlc_new or in libvlc_media_add_option? I tried both, but neither works.
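
For reference, a hedged sketch showing both placements being asked about. The thread does not confirm which one works for a given build; the helper name and the leading ':' on the media option are assumptions, not something stated above:

Code:

#include <vlc/vlc.h>

/* Hypothetical sketch showing both placements asked about above. */
void play_with_dxva2(const char *path)
{
    /* Placement 1 (assumption): as a LibVLC argument at instance creation. */
    const char *const vlc_args[] = { "--avcodec-hw=dxva2" };
    libvlc_instance_t *inst =
        libvlc_new(sizeof(vlc_args) / sizeof(vlc_args[0]), vlc_args);

    libvlc_media_t *m = libvlc_media_new_path(inst, path);

    /* Placement 2 (assumption): as a media option; media options are
     * conventionally written with a leading ':' instead of "--". */
    libvlc_media_add_option(m, ":avcodec-hw=dxva2");

    libvlc_media_player_t *mp = libvlc_media_player_new_from_media(m);
    libvlc_media_release(m);
    libvlc_media_player_play(mp);

    /* ... run your event loop, then stop and clean up ... */

    libvlc_media_player_stop(mp);
    libvlc_media_player_release(mp);
    libvlc_release(inst);
}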