GPU decoding with LibVLC on Windows
Posted: 08 Nov 2014 15:43
Hi,
I'm trying to activate GPU decoding in LibVLC (from vlc-2.2.0-rc1-20141107-0204) to use it in my program. After some research, I've seen on many topics that we need to use "avcodec-hw" to activate GPU decoding, so I tried several things:

- Pass the avcodec-hw parameter to LibVLC:

Code:
const char * const vlc_args[] = {
    "--avcodec-hw=any"};
inst = libvlc_new(sizeof(vlc_args) / sizeof(vlc_args[0]), vlc_args);

- Add the "avcodec-hw" option to the media:

Code:
libvlc_media_add_option(m, "avcodec-hw=any");

I even tried the old "ffmpeg-hw" parameter, but none of these options works. When I play a 1080p movie in my program, my CPU usage stays around 20%, while GPU decoding works very well in VLC itself (~5-6% CPU usage with GPU decoding enabled, otherwise ~20%, like in my program).

I'm using VLC 2.2.0-rc1-20141107-0204 and the same LibVLC version on Windows 8.1. I have a Pentium B980 with Intel HD 2000 graphics (which supports GPU decoding), and my program uses Qt 5.2.1.

Could you help me solve this?
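For reference, here is a stripped-down, stand-alone version of what I'm doing (a minimal console sketch outside of my Qt program; the media path is just a placeholder):

Code:
#include <vlc/vlc.h>
#include <stdio.h>

int main(void)
{
    /* Same argument as above: ask libavcodec to pick any available HW decoder. */
    const char *const vlc_args[] = { "--avcodec-hw=any" };

    libvlc_instance_t *inst =
        libvlc_new(sizeof(vlc_args) / sizeof(vlc_args[0]), vlc_args);
    if (inst == NULL)
        return 1;

    /* Placeholder path; in my real program the media comes from the Qt side. */
    libvlc_media_t *m = libvlc_media_new_path(inst, "C:\\video\\sample-1080p.mkv");
    if (m == NULL) {
        libvlc_release(inst);
        return 1;
    }

    /* The per-media variant I also tried. */
    libvlc_media_add_option(m, "avcodec-hw=any");

    libvlc_media_player_t *mp = libvlc_media_player_new_from_media(m);
    libvlc_media_release(m);           /* the player keeps its own reference */

    libvlc_media_player_play(mp);
    getchar();                         /* keep the process alive while it plays */

    libvlc_media_player_stop(mp);
    libvlc_media_player_release(mp);
    libvlc_release(inst);
    return 0;
}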