
How to load balance decoding on multiple GPUs?

Posted: 18 Jun 2018 08:51
by laaksonen
Hello,

I am trying to play multiple H264 encoded streams simultaneously. I have two AMD FirePro GPUs (OS: Windows 7) which I'd like to use for decoding those streams.
I've managed to utilize the first GPU for decoding by passing the parameter --avcodec-hw=dxva2.

The problem is that by default avcodec uses only the first GPU it finds, and it seems that after I've opened 5-6 streams, the VLC log starts to fill with avcodec errors:
avcodec decoder error: more than X seconds of late video -> dropping frame (computer too slow ?).

I've checked GPU load with GPU-Z and it reports only 20-30% usage with 12 streams. I've also checked it with HWMonitor, which says the load is up to 85%. HWMonitor's result seems more believable, though I wonder why GPU-Z shows such low usage. If anyone knows of more reliable tools to monitor GPU usage, I'm more than happy to hear about them.

I'd like to balance the decoding load between both available GPUs for further testing. So the actual question is: how can I pass the device I want to use for decoding to libavcodec through libVLC, so that I can assign some of the streams to the second GPU? I checked the VLC documentation, spotted the --avcodec-options parameter, and tried it as --avcodec-options=hw_accel_device=1. hw_accel_device should be the avcodec parameter that selects which device avcodec uses, but it seemed to have no effect; all the decoding was still done by the first GPU.

Re: How to load balance decoding on multiple GPUs?

Posted: 18 Jun 2018 17:48
by Jean-Baptiste Kempf
--avcodec-options="{hw_accel_device=1}" would be the correct syntax. But I doubt this option exists.

You probably need some changes in VLC.
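
For illustration only, here is a minimal sketch, assuming libVLC 3.x, of how such per-media options are passed from an application to the avcodec module. The stream URL is a placeholder, ":avcodec-hw=dxva2" is the option already known to work above, and the hw_accel_device key is the hypothetical one being discussed, which current VLC most likely ignores.

Code:

#include <vlc/vlc.h>

int main(void)
{
    libvlc_instance_t *vlc = libvlc_new(0, NULL);
    /* Placeholder URL for one of the streams. */
    libvlc_media_t *media = libvlc_media_new_location(vlc, "rtsp://example.invalid/stream1");

    /* Per-media options: the first one is what already works for the first GPU.
     * The second one is the hypothetical key from this thread; it would have to
     * be added to VLC's avcodec module before it could do anything. */
    libvlc_media_add_option(media, ":avcodec-hw=dxva2");
    libvlc_media_add_option(media, ":avcodec-options={hw_accel_device=1}");

    libvlc_media_player_t *player = libvlc_media_player_new_from_media(media);
    libvlc_media_player_play(player);
    /* ... run until done ... */
    libvlc_media_player_stop(player);
    libvlc_media_player_release(player);
    libvlc_media_release(media);
    libvlc_release(vlc);
    return 0;
}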

Re: How to load balance decoding on multiple GPUs?

Posted: 19 Jun 2018 07:06
by laaksonen
Thank you for your answer.

How much work would it require to add a new parameter to libVLC and pass it on to avcodec?
I assume adding one wouldn't cause regressions, but I'm quite unfamiliar with the libVLC codebase, so I'm not sure whether it's worth trying.

Re: How to load balance decoding on multiple GPUs?

Posted: 19 Jun 2018 10:04
by Jean-Baptiste Kempf

I believe it also needs to be changed inside libavcodec, but maybe I'm wrong.

I'd say 2 days of work, if you know what you are doing. 2 weeks otherwise :)

Re: How to load balance decoding on multiple GPUs?

Posted: 19 Jun 2018 10:52
by laaksonen
I played with some avcodec parameters by using the ffmpeg command-line tool directly.

Code:

./ffmpeg -hwaccel d3d11va -hwaccel_device 1 -report -f lavfi -i nullsrc -c h264 -f null -
With this command, ffmpeg reported:

Code:

[AVHWDeviceContext @ 0000021e06c6a180] Using device 8086:591b (Intel(R) HD Graphics 630)
Changing -hwaccel_device to 0, it seemed to pick the other GPU:

Code:

[AVHWDeviceContext @ 0000021d7509a580] Using device 10de:134d (NVIDIA GeForce 940MX).
In conclusion, I think avcodec already supports GPU selection. However, hwaccel_device is not recognized when it's passed with avcodec-options. I think this is because those options are only passed to the codec context and are not used at all when the hardware device context is created (which makes sense, since those are completely separate things).
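
For reference, a minimal C sketch of the libavcodec side, as I understand the public API (this is not something VLC exposes today): the adapter index plays the same role as ffmpeg's -hwaccel_device and is consumed when the hardware device context is created, which is then attached to the codec context. The get_format callback and the frame handling a real decoder needs are left out.

Code:

#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

/* device_index is a string like "0" or "1": the D3D11 adapter index,
 * i.e. the same value ffmpeg's -hwaccel_device takes. */
static int open_h264_on_adapter(AVCodecContext **out, const char *device_index)
{
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!codec)
        return AVERROR_DECODER_NOT_FOUND;

    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return AVERROR(ENOMEM);

    /* The GPU is chosen here, when the hardware *device* context is created,
     * not through codec-context options -- which is why --avcodec-options
     * has no effect on it. */
    AVBufferRef *hwdev = NULL;
    int ret = av_hwdevice_ctx_create(&hwdev, AV_HWDEVICE_TYPE_D3D11VA,
                                     device_index, NULL, 0);
    if (ret < 0) {
        avcodec_free_context(&ctx);
        return ret;
    }

    ctx->hw_device_ctx = hwdev;   /* reference is released by avcodec_free_context() */

    ret = avcodec_open2(ctx, codec, NULL);
    if (ret < 0) {
        avcodec_free_context(&ctx);
        return ret;
    }

    *out = ctx;
    return 0;
}

With something like that in place, each stream's decoder could be pointed at adapter "0" or "1", which is essentially the load balancing asked about at the start of the thread.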

Re: How to load balance decoding on multiple GPUs?

Posted: 19 Jun 2018 11:09
by Jean-Baptiste Kempf

hwaccel_device is in ffmpeg, but not avcodec. That's why.

But yes, it's doable. I'd guess 2 days of work.

Re: How to load balance decoding on multiple GPUs?

Posted: 19 Jun 2018 11:16
by laaksonen
Yes, I meant that ffmpeg interprets hwaccel_device and converts it into a parameter avcodec understands, so there must already be functionality to select the GPU in libavcodec, right? Isn't ffmpeg's libavcodec the very same library libVLC uses under the hood? Or have I misunderstood something here?

And thank you for your help with this matter, I really appreciate it.

Re: How to load balance decoding on multiple GPUs?

Posted: 19 Jun 2018 11:40
by Jean-Baptiste Kempf
Yes, but not necessarily as a parameter. Sometimes FFmpeg uses functions that are not available to third-party apps.

But, in theory, it should work.