
When DXVA is enabled, which decoder is used?

Posted: 26 Apr 2010 21:21
by kurkosdr
Hi there,

When I play a video with VLC 1.1.0 pre2 and DXVA is enabled, which decoder is used? Is it the excellent ffmpeg decoder that VLC ships, or some no-name decoder from ATI/NVIDIA that nobody has tested for quality?

Thanks,
kurkosdr

Re: When DXVA is enabled, which decoder is used?

Posted: 27 Apr 2010 11:36
by VLC_help
The decoder source code is from FFmpeg, but it uses the DXVA API to do the decoding steps.
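
Roughly how that split looks through libavcodec's hardware-acceleration interface, as a minimal sketch (this uses the modern hw-device API, which postdates this thread, and the function name is illustrative, not VLC's actual code):

```c
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

/* Sketch: open an H.264 decoder with a DXVA2 device attached.
 * ffmpeg still parses the bitstream; the pixel reconstruction is
 * handed to the GPU driver through DXVA2. */
static AVCodecContext *open_dxva2_decoder(void)
{
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    AVBufferRef *hw_dev = NULL;

    /* Ask libavutil for a DXVA2 device (Windows only). */
    if (av_hwdevice_ctx_create(&hw_dev, AV_HWDEVICE_TYPE_DXVA2,
                               NULL, NULL, 0) < 0) {
        avcodec_free_context(&ctx);
        return NULL;               /* no DXVA2: fall back to software */
    }
    ctx->hw_device_ctx = hw_dev;   /* context takes ownership of the ref */

    if (avcodec_open2(ctx, codec, NULL) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }
    return ctx;
}
```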

Re: When DXVA is enabled, which decoder is used?

Posted: 27 Apr 2010 17:47
by kurkosdr
The decoder source code is from FFmpeg, but it uses the DXVA API to do the decoding steps.
So, the decoding is done by ffmpeg code, but it uses the GPU instead of the CPU?

I was under the impression that DXVA was a decoder embedded in the video card, and the player's decoder wasn't used at all.

Re: When DXVA is enabled, which decoder is used?

Posted: 27 Apr 2010 21:13
by Jean-Baptiste Kempf
The decoder source code is from FFmpeg, but it uses the DXVA API to do the decoding steps.
So, the decoding is done by ffmpeg code, but it uses the GPU instead of the CPU?

I was under the impression that DXVA was a decoder embedded in the video card, and the player's decoder wasn't used at all.
No, decoding is done by the GPU manufacturer's code, not ffmpeg.
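
The handoff point in the ffmpeg code is the get_format() callback: libavcodec lists the pixel formats it can produce, and choosing the DXVA2 one tells it to let the driver decode. A minimal sketch (the callback name is illustrative):

```c
#include <libavcodec/avcodec.h>

/* Sketch of the handoff: libavcodec calls get_format() with the pixel
 * formats it can produce. Returning AV_PIX_FMT_DXVA2_VLD tells it to
 * emit DXVA2 surfaces, i.e. to let the GPU driver do the decoding
 * instead of ffmpeg's own software routines. */
static enum AVPixelFormat pick_dxva2(AVCodecContext *ctx,
                                     const enum AVPixelFormat *fmts)
{
    for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
        if (*p == AV_PIX_FMT_DXVA2_VLD)
            return *p;    /* GPU path offered: take it */

    /* No hardware format offered: keep the software decoder. */
    return avcodec_default_get_format(ctx, fmts);
}

/* Installed before opening the codec: ctx->get_format = pick_dxva2; */
```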

Re: When DXVA is enabled, which decoder is used?

Posted: 29 Apr 2010 14:17
by McLion
Just to be sure ... we are talking about DXVA2, not DXVA, right?
DXVA is not supported, only DXVA2, correct?

Re: When DXVA is enabled, which decoder is used?

Posted: 29 Apr 2010 14:34
by napx
The decoder source code is from FFmpeg, but it uses the DXVA API to do the decoding steps.
So, the decoding is done by ffmpeg code, but it uses the GPU instead of the CPU?

I was under the impression that DXVA was a decoder embedded in the video card, and the player's decoder wasn't used at all.
No, decoding is done by the GPU manufacturer's code, not ffmpeg.
This is exactly why I'm more excited about the possibility of multicore decoding than GPU decoding in VLC. I think it is more in VLC's tradition that the CPU (and therefore the tried-and-true ffmpeg backend) does the work. I also think there are fewer variables to worry about in CPU-land.
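
For reference, the multicore path means keeping ffmpeg's own decoder on the CPU and spreading the work across cores with libavcodec's threading options; a minimal sketch (the helper name is illustrative):

```c
#include <libavcodec/avcodec.h>

/* Sketch of the multicore alternative: decoding stays in ffmpeg's own
 * code, spread across CPU cores with frame/slice threading. */
static void enable_threaded_decode(AVCodecContext *ctx)
{
    ctx->thread_count = 0;                          /* 0 = auto-detect */
    ctx->thread_type  = FF_THREAD_FRAME | FF_THREAD_SLICE;
}
```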

Re: When DXVA is enabled, which decoder is used?

Posted: 29 Apr 2010 14:40
by Jean-Baptiste Kempf
Just to be sure ... we are talking about DXVA2, not DXVA, right?
DXVA is not supported, only DXVA2, correct?
Yes.

Re: When DXVA is enabled, which decoder is used?

Posted: 10 May 2010 18:45
by kurkosdr
Here comes the million dollar question:

How can I be sure that the manufacturer's GPU decoder will be as good as ffmpeg's decoder?

There are at least three different GPU decoder implementations out there (Intel, ATI, NVIDIA), and there is no guarantee that all of them will be as good as ffmpeg, since ffmpeg has no control over the decoding process (when using DXVA, ffmpeg just submits the video for decoding and then receives the decoded frames, right?).
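
That is essentially the shape of the loop. In libavcodec's current API (which postdates this thread) the pattern looks like this sketch; the function name is illustrative:

```c
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

/* Sketch of the submit/receive pattern: compressed packets go in,
 * opaque GPU surfaces come out. Nothing in between is observable,
 * which is why output quality depends entirely on the driver. */
static int decode_one(AVCodecContext *ctx, AVPacket *pkt, AVFrame *sw)
{
    AVFrame *hw = av_frame_alloc();
    int ret = avcodec_send_packet(ctx, pkt);        /* submit */

    if (ret >= 0)
        ret = avcodec_receive_frame(ctx, hw);       /* receive */

    if (ret >= 0 && hw->format == AV_PIX_FMT_DXVA2_VLD)
        ret = av_hwframe_transfer_data(sw, hw, 0);  /* GPU -> CPU copy */
    else if (ret >= 0)
        ret = av_frame_ref(sw, hw);                 /* software frame */

    av_frame_free(&hw);
    return ret;
}
```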

Re: When DXVA is enabled, which decoder is used?

Posted: 11 May 2010 12:42
by Jean-Baptiste Kempf
How can I be sure that the manufacturer's GPU decoder will be as good as ffmpeg's decoder?
You cannot. You can just hope.

Re: When DXVA is enabled, which decoder is used?

Posted: 11 May 2010 18:34
by kurkosdr
How can I be sure that the manufacturer's GPU decoder will be as good as ffmpeg's decoder?
You cannot. You can just hope.
So I guess I should wait for 1.2, which will have multicore support. That would be the best option, IMO, because the trusted ffmpeg decoder is used, and even a Core Duo T2400 should be able to handle the load of 1080p H.264.

(I never liked DXVA2 much. As the YUV -> RGB debacle in ATI/NVIDIA drivers showed us, code from GPU manufacturers is not to be trusted.)
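
For context on that debacle: BT.601 "limited range" video stores luma in 16..235 and chroma in 16..240, and drivers that skipped the range expansion produced crushed or washed-out output. The correct per-pixel math looks like this sketch (illustrative helper, not VLC code):

```c
#include <stdint.h>

static uint8_t clamp255(double x)
{
    return x < 0 ? 0 : x > 255 ? 255 : (uint8_t)(x + 0.5);
}

/* Correct limited-range BT.601 YUV -> RGB for one 8-bit pixel.
 * The driver bugs amounted to skipping (or double-applying) the
 * 16..235 -> 0..255 range expansion below. */
static void yuv_to_rgb_bt601(uint8_t y, uint8_t u, uint8_t v,
                             uint8_t *r, uint8_t *g, uint8_t *b)
{
    double Y = (y - 16) * 255.0 / 219.0;   /* expand 16..235 to 0..255 */
    double U = (u - 128) * 255.0 / 224.0;  /* center chroma on zero */
    double V = (v - 128) * 255.0 / 224.0;

    *r = clamp255(Y + 1.402 * V);
    *g = clamp255(Y - 0.344136 * U - 0.714136 * V);
    *b = clamp255(Y + 1.772 * U);
}
```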

BTW, can I have DXVA2 on while having the YUV->RGB conversions off?

Re: When DXVA is enabled, which decoder is used?

Posted: 12 May 2010 16:03
by Jean-Baptiste Kempf
Yes, you can.