GPU decoding with LibVLC on Windows

This forum is about all development around libVLC.
gcdu45
New Cone
Posts: 2
Joined: 08 Nov 2014 13:27
VLC version: vlc 2.2.0-git
Operating System: Windows, Linux
Location: France

GPU decoding with LibVLC on Windows

Post by gcdu45 » 08 Nov 2014 15:43

Hi,

I'm trying to activate GPU decoding in LibVLC (from vlc-2.2.0-rc1-20141107-0204) to use it in my program. After some research, I've seen in many topics that we need to use "avcodec-hw" to activate GPU decoding, so I tried several things, such as:

- Pass the avcodec-hw parameter to LibVLC:

Code: Select all

const char * const vlc_args[] = { "--avcodec-hw=any" };
inst = libvlc_new(sizeof(vlc_args) / sizeof(vlc_args[0]), vlc_args);
- Add the "avcodec-hw" option to the media:

Code: Select all

libvlc_media_add_option(m,"avcodec-hw=any");
and even use the old "ffmpeg-hw" parameter, but none of these options work... So when I play a 1080p movie in my program, CPU usage stays around 20%, while GPU decoding works very well in VLC itself (~5-6% CPU usage with GPU decoding enabled, versus ~20% otherwise, as in my program).
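For reference, here is a self-contained sketch of what I am doing, combining both attempts (the file path is a placeholder and error handling is trimmed):

Code: Select all

#include <vlc/vlc.h>

int main(void)
{
    /* Attempt 1: ask for hardware decoding at instance creation. */
    const char *const vlc_args[] = { "--avcodec-hw=any" };
    libvlc_instance_t *inst =
        libvlc_new(sizeof(vlc_args) / sizeof(vlc_args[0]), vlc_args);
    if (inst == NULL)
        return 1;

    /* "C:\\movie.mkv" is a placeholder path. */
    libvlc_media_t *m = libvlc_media_new_path(inst, "C:\\movie.mkv");

    /* Attempt 2: set the option on the media item instead. */
    libvlc_media_add_option(m, "avcodec-hw=any");

    libvlc_media_player_t *mp = libvlc_media_player_new_from_media(m);
    libvlc_media_release(m);

    libvlc_media_player_play(mp);
    /* ... play for a while and watch CPU usage ... */

    libvlc_media_player_stop(mp);
    libvlc_media_player_release(mp);
    libvlc_release(inst);
    return 0;
}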

I'm using VLC 2.2.0-rc1-20141107-0204 and the matching LibVLC version on Windows 8.1. I have a Pentium B980 with Intel HD 2000 graphics (which supports GPU decoding), and my program uses Qt 5.2.1.

Could you help me solve this?

Jean-Baptiste Kempf
Site Administrator
Posts: 37523
Joined: 22 Jul 2005 15:29
VLC version: 4.0.0-git
Operating System: Linux, Windows, Mac
Location: Cone, France
Contact:

Re: GPU decoding with LibVLC on Windows

Post by Jean-Baptiste Kempf » 10 Dec 2014 14:53

"--avcodec-hw=dxva2" ?
Jean-Baptiste Kempf
http://www.jbkempf.com/ - http://www.jbkempf.com/blog/category/Videolan
VLC media player developer, VideoLAN President and Sites administrator
If you want an answer to your question, just be specific and precise. Don't use Private Messages.

gcdu45
New Cone
Posts: 2
Joined: 08 Nov 2014 13:27
VLC version: vlc 2.2.0-git
Operating System: Windows, Linux
Location: France

Re: GPU decoding with LibVLC on Windows

Post by gcdu45 » 03 Jan 2015 19:04

Thank you, it works perfectly ;)

Jean-Baptiste Kempf
Site Administrator
Posts: 37523
Joined: 22 Jul 2005 15:29
VLC version: 4.0.0-git
Operating System: Linux, Windows, Mac
Location: Cone, France
Contact:

Re: GPU decoding with LibVLC on Windows

Post by Jean-Baptiste Kempf » 03 Jan 2015 20:33

U R welcome :)
Jean-Baptiste Kempf
http://www.jbkempf.com/ - http://www.jbkempf.com/blog/category/Videolan
VLC media player developer, VideoLAN President and Sites administrator
If you want an answer to your question, just be specific and precise. Don't use Private Messages.

jerrylipeng
New Cone
Posts: 3
Joined: 13 Apr 2015 05:01

Re: GPU decoding with LibVLC on Windows

Post by jerrylipeng » 14 Apr 2015 03:35

Hi, where should I use "--avcodec-hw=dxva2": in libvlc_new or libvlc_media_add_option? I tried both, but neither works...
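From the libVLC documentation, instance-wide flags seem to take the "--" prefix while per-media options take a ":" prefix, so my understanding of the two call sites is as follows (assuming m is an existing libvlc_media_t *):

Code: Select all

/* Instance-wide: pass the flag when creating the libVLC instance. */
const char *const args[] = { "--avcodec-hw=dxva2" };
libvlc_instance_t *inst = libvlc_new(1, args);

/* Per-media: options passed to libvlc_media_add_option
   conventionally use a ":" prefix rather than "--". */
libvlc_media_add_option(m, ":avcodec-hw=dxva2");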

