Nvidia GPUs starting with first-generation Maxwell had hybrid (partial) HEVC support: they used part of the GPU's compute/shader capability to help decode. It wasn't until second-generation Maxwell that full hardware HEVC decoding was implemented.
It's a partial solution, but it does work: 1080p HEVC drops to single-digit CPU usage in MPC. 4K is too much for it to handle, though that isn't a common issue for most people.
https://forums.geforce.com/default/topi ... -a-h-265-/
Feature Set E
Similar to feature set D but added support for decoding H.264 with a resolution of up to 4096 × 4096 and MPEG-1/MPEG-2 with a resolution of up to 4080 × 4080 pixels. GPUs with VDPAU feature set E support an enhanced error concealment mode which provides more robust error handling when decoding corrupted video streams. Cards with this feature set use a combination of the PureVideo hardware and software running on the shader array to decode HEVC (H.265) as partial/hybrid hardware video decoding.
GeForce GTX 745, GTX 750, GTX 750 Ti, GTX 850M, GTX 860M, GeForce 830M, 840M, GeForce GTX 970, GTX 980, GTX 970M, GTX 980M, GeForce GTX TITAN X, GeForce GTX 980 Ti
Feature Set F
Introduced dedicated hardware video decoding of HEVC Main (8-bit) & Main 10 (10-bit) and VP9, up to 4096 × 2304 pixels resolution.
GeForce GTX 750 SE, GTX 950, GTX 960
Feature Set G
Introduced dedicated hardware video decoding of HEVC Main 12 (12-bit) up to 4096 × 2304 pixels resolution.
Feature Set H
Cards with this feature set are capable of hardware-accelerated decoding of 8192 × 8192 (8K resolution) H.265/HEVC video streams.
GeForce GTX 1070, GTX 1080, GeForce GTX 1060, NVIDIA TITAN XP, GeForce GTX 1050, GTX 1050 Ti
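To make the feature-set lists above easier to use, they can be collapsed into a small lookup. This is just an illustrative sketch (the `hevc_support` helper and the dictionary layout are my own, not an Nvidia API; GPU names are taken from the lists above, desktop cards only):

```python
# Sketch: map GPU models from the feature-set lists above to their
# HEVC decode capability. Data comes from the post; the function
# itself is illustrative only.
FEATURE_SETS = {
    "E": {"hevc": "hybrid (PureVideo + shader array)",
          "gpus": ["GTX 745", "GTX 750", "GTX 750 Ti", "GTX 970",
                   "GTX 980", "GTX TITAN X", "GTX 980 Ti"]},
    "F": {"hevc": "dedicated, Main/Main 10, up to 4096x2304",
          "gpus": ["GTX 750 SE", "GTX 950", "GTX 960"]},
    "H": {"hevc": "dedicated, up to 8192x8192 (8K)",
          "gpus": ["GTX 1050", "GTX 1050 Ti", "GTX 1060",
                   "GTX 1070", "GTX 1080", "TITAN XP"]},
}

def hevc_support(gpu_name):
    """Return the HEVC decode description for a known GPU, else None."""
    for tier in FEATURE_SETS.values():
        if gpu_name in tier["gpus"]:
            return tier["hevc"]
    return None

print(hevc_support("GTX 970"))  # hybrid decode path
print(hevc_support("GTX 960"))  # dedicated hardware decode
```

So a GTX 970 reports the hybrid path from feature set E, while a GTX 960 gets the dedicated decoder from feature set F.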
What it looks like when active in LAV Video Decoder: