
C++ runtime error w/ H.264

Posted: 14 Jun 2006 11:28
by Guest
Here are my PC stats:
Athlon 64 3400+
GeForce 6800 GT
2 GB PC3200
4 Raptors in RAID 0
Asus K8N-E Deluxe

Alright, well, I've heard that 1080p is very taxing on a CPU, but I've been looking at some benchmarks and I don't understand why I can't play 1080p at all.

720p runs flawlessly, but at 1080p I get like 2 fps with constant freezing, the sound is always way off, and I often get this error:

microsoft visual c++ runtime library
program c:\programfiles etc...\vlc.exe
file h264.c
line 3612
expression best_i != INT_MIN

Any ideas?

Posted: 14 Jun 2006 11:29
by ecaffeine
Oops, guess I wasn't logged in. Anyway, I posted this.

Posted: 14 Jun 2006 21:38
by Guest
Where have you seen benchmarks showing a 3400+ CPU playing high-bitrate 1080p H.264 on the CPU alone?
PureVideo is a different thing, and it isn't supported by VLC.
You can try MPlayer with the -vfm ffmpeg -lavdopts fast:skiploopfilter=all command-line parameters.
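For example, something along these lines (the filename is just a placeholder, use your own file):

mplayer -vfm ffmpeg -lavdopts fast:skiploopfilter=all yourmovie.mkv

The fast and skiploopfilter=all options skip some decoding work (notably the deblocking filter), so the picture can look slightly worse, but playback gets noticeably faster on a weak CPU.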

Posted: 14 Jun 2006 22:26
by ecaffeine
Here are some benchmarks I have seen, even with inefficient codecs:
http://www.digital-digest.com/articles/ ... page4.html

I'm not talking about PureVideo because I have a 6800; only the GeForce 7 series has H.264 acceleration with PureVideo.

Posted: 15 Jun 2006 22:28
by Guest
Quick math, Test System 2: the 1080p result was 17.136 fps, so it would need about 40% more CPU power to play it back smoothly; a 4.5 GHz CPU would be sufficient.
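Spelling that estimate out (the ~24 fps target and ~3.2 GHz test CPU clock are my assumptions, the exact specs are on the linked page):
24 / 17.136 ≈ 1.4, i.e. about 40% short of real-time
1.4 × 3.2 GHz ≈ 4.5 GHz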

And libavcodec isn't that inefficient (sorry about the double negative), so if you want faster playback, go tell that to the libavcodec devs. It is also used in VLC (and in many other open-source players). CoreAVC is very good, and if you really need good playback, buy it, but it won't be VLC compatible.