Will movies always stutter more on pc/monitor than on TV?

z3b2
Blank Cone
Posts: 20
Joined: 08 Mar 2013 15:35

Will movies always stutter more on pc/monitor than on TV?

Postby z3b2 » 24 Mar 2013 21:40

Yeah I know, total noob-question probably.
Someone just told me that most monitors only handle like 60 Hz, while most TVs handle 23.976, 24, 25, 50, 60 ...

... so PC -> TV will give totally smooth video, while PC -> monitor will need pulldown (or something) converting ~24 fps to 60 fps, and therefore always more stutter.
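
From what I understand, that pulldown works roughly like this. Here's a toy sketch of the 3:2 cadence in Python (my own illustration, not any player's actual code):

Code:

    # 3:2 pulldown: mapping 24 fps film frames onto a 60 Hz display.
    # Each film frame is held for alternately 2 and 3 refreshes, so motion
    # that was evenly spaced on film becomes uneven on screen (judder).
    cadence = [2, 3]  # refreshes per film frame, repeating

    schedule = []
    for frame in range(8):  # first 8 film frames
        schedule.extend([frame] * cadence[frame % 2])

    print(schedule)
    # [0, 0, 1, 1, 1, 2, 2, 3, 3, 3, 4, 4, 5, 5, 5, 6, 6, 7, 7, 7]
    # 8 film frames fill exactly 20 refreshes: 20/8 = 2.5 = 60/24.

Every other frame sits on screen 50% longer than its neighbour, which is apparently the stutter in question.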

I had no idea. Is this really true???
I thought lots of people enjoyed movies "on PC"; can it really be such a big downside to that approach?

Thanks :)

BlackWhiteX
Blank Cone
Posts: 14
Joined: 28 Aug 2012 10:11
VLC version: 2.0.5
Operating System: Windows 7 Pro 64bit

Re: Will movies always stutter more on pc/monitor than on TV

Postby BlackWhiteX » 30 Mar 2013 22:17

Just a bit of basic stuff: 8)
The digital output of a playback device, such as a DVD player, is 100 Hz. But TVs nowadays have a lot more than 100 Hz (300-400 and more, I think). The TV makes the output signal look "better" by increasing the Hz rate (which is nothing but the frequency at which the image is "refreshed"). But this doesn't mean the output rate is that high...
On a monitor you usually have 60-75 Hz, which actually means the 100 Hz of the output device gets decreased.
And that is what you said about the pulldown: converting to a lower Hz rate.
By the way, it varies from monitor to monitor...
So if you want to watch a film in good quality... watch it on TV!!! (Obviously this only works with well-recorded sources, such as original films; don't expect movies taped in a cinema to gain much quality from being watched on a television.) :D

Greetings,
BlackWhiteX :wink:

matz
Blank Cone
Posts: 11
Joined: 17 Apr 2013 01:48

Re: Will movies always stutter more on pc/monitor than on TV

Postby matz » 17 Apr 2013 01:57

I think it depends more on your video card and hardware.

matz
Blank Cone
Posts: 11
Joined: 17 Apr 2013 01:48

Re: Will movies always stutter more on pc/monitor than on TV

Postby matz » 19 Apr 2013 07:08

With my home-made movies, I find they stutter a lot more on the media player and play perfectly fine on my PC. I have tried playing the movies on a Western Digital media player and on a Sony Blu-ray player that has a built-in media player.

I cannot figure this one out, suggestions anyone?

Gord
New Cone
Posts: 4
Joined: 14 May 2013 15:03

Re: Will movies always stutter more on pc/monitor than on TV

Postby Gord » 15 May 2013 08:09

Hi all

I can help on this one

The 'refresh rate' on a monitor refers to the vertical scan rate. (Simplified explanation follows.) The image is created (painted) from left to right, line by line, from top to bottom, so there are really two things at work here. Let's use a refresh rate of 60 Hz (60 top-to-bottom scans per second) as an example, and let's further assume that there are 525 horizontal lines on the screen (it just so happens that these are old NTSC TV rates from the analog days). That means the horizontal scan rate is 525 * 60, or 31,500 horizontal scans per second.

In the old days (I mean just after the dinosaurs died off, when vacuum tubes were the state of the art in electronics) these rates were a bit hard to handle, so they did some clever stuff. The vertical scan frequency was kept at 60 Hz, but the horizontal scan rate was set to 15,750 Hz. How did this work, you ask? Interlacing is the answer. Each vertical sweep painted every second horizontal line, in the pattern 1-3-5-7-9 etc.; then on the next vertical sweep a vertical offset was introduced so that lines 2-4-6-8 etc. were painted. This meant the screen was refreshed 60 times per second, the horizontal scan stayed at a reasonable 15,750 Hz, and the actual video signal (the light and dark areas along each horizontal line) could be kept between a few Hz and a reasonable 2 MHz (million Hz) or so. This is called the video bandwidth; in this case we would speak of a 2 MHz bandwidth.
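
To put those numbers in one place, here is the same arithmetic as a quick Python sketch (same NTSC-era figures as above):

Code:

    # NTSC-era scan rates from the example above.
    lines_per_frame = 525
    vertical_hz = 60  # top-to-bottom sweeps per second

    progressive_hscan = lines_per_frame * vertical_hz  # 31500 Hz if every line were painted on every sweep
    interlaced_hscan = progressive_hscan // 2          # 15750 Hz: each sweep paints every second line

    print(progressive_hscan, interlaced_hscan)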

As the dino bones fossilized further, the ability to handle much higher frequencies became more practical and cheaper. TVs didn't change much, though, due to backward compatibility and commercial concerns. Computer monitors did, and today they have far higher video bandwidth than all but the very best TVs. This is quite evident in the way fine detail appears on a proper monitor vs. a TV. Try to read an 8-point font on a 24-inch monitor and compare how that looks on a comparably sized TV. (It's all fuzzy on the TV.) No, that is not your fossilizing eyes playing tricks; it is down to the bandwidth/frequency response of the video components.
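
As a rough illustration of that bandwidth gap (ballpark numbers of my own, ignoring blanking intervals and other details):

Code:

    # Approximate pixel rate of a modern 1920x1080 monitor at 60 Hz,
    # vs. the few-MHz video bandwidth of an analog TV signal.
    width, height, refresh_hz = 1920, 1080, 60
    pixel_rate_mhz = width * height * refresh_hz / 1e6
    print(pixel_rate_mhz, "MHz")  # ~124 MHz of detail, vs ~2-6 MHz on an old TV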

The bottom line is that a computer monitor is a far more capable display device than your humble TV (unless you have a VERY expensive TV that is essentially a computer monitor with a tuner anyway.)

The video stutter that we are all familiar with is much more a product of other limitations. There is a WHOLE BUNCH of data required to make up a picture on a modern monitor. All of that data has to come from somewhere (usually your network card or disk drive) and be funneled through many potential bottlenecks on its way to your display. The most common bottleneck is your internet connection itself. Other limitations may be your processor, your HDD, the amount of memory in your PC, the backplane (motherboard) speeds, etc.
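
To get a feel for how much data that is, here is a quick uncompressed ballpark (my own assumptions: 1080p, 60 Hz, 24 bits per pixel):

Code:

    # Raw (uncompressed) data rate of a 1080p60 picture.
    width, height, refresh_hz, bits_per_pixel = 1920, 1080, 60, 24
    raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
    print(raw_gbps, "Gbit/s")  # about 3 Gbit/s

That is far more than a typical internet connection can carry, which is why video is heavily compressed and why any slow link in the chain shows up as stutter.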

Any vertical refresh rate much beyond 85 Hz is really spec-manship, because it is beyond what the human eye can detect (persistence of vision). What now separates the good from the truly great is the video bandwidth. This is directly responsible for the level of detail and for the monitor's ability to correctly display a fast-moving object. (Think hockey puck.)

Now for the purists... this is a simplified (not perfect) explanation and does not discuss the color burst signal, the SAP, the horizontal sync pulse, the vertical blanking pulse, the horizontal blanking pulse, video blanking, the time signal, various in-band control signals, etc. ad nauseam, nor the nuances of flat-panel displays. Please, let's keep it relatively simple. :-)

Enjoy
Gord

BlackWhiteX
Blank Cone
Posts: 14
Joined: 28 Aug 2012 10:11
VLC version: 2.0.5
Operating System: Windows 7 Pro 64bit

Re: Will movies always stutter more on pc/monitor than on TV

Postby BlackWhiteX » 17 May 2013 20:16

Gord wrote:
The bottom line is that a computer monitor is a far more capable display device than your humble TV (unless you have a VERY expensive TV that is essentially a computer monitor with a tuner anyway.)

But then how do you explain the smooth video on a TV and not on a monitor? :?:

Gord
New Cone
Posts: 4
Joined: 14 May 2013 15:03

Re: Will movies always stutter more on pc/monitor than on TV

Postby Gord » 21 Jul 2013 09:57

Because the TV is designed for much slower refresh rates, the actual "phosphors" (old school) have a much longer persistence. When excited (struck) by the old-style electron beam (or, in newer flat panels, when switched "on"), they take much longer to fade to black when switched off. This is called persistence. The main reason for it was to conceal the flicker inherent to those older devices.

In a higher-quality monitor, thanks to the higher refresh rates supported by the rest of the display hardware, the persistence of the microscopic display elements is shorter. This reduces smearing and blurred detail but increases flicker, unless the refresh rate is raised above the human eye's ability to detect flicker, which is usually somewhere around 75 Hz but depends on the individual. At somewhere around 80 Hz virtually no one can detect any flicker; the screen is seen as a steady light, with only one caveat: interference with lighting. Fluorescent lighting, which in North America flickers at 60 Hz and in the rest of the developed world at 50 Hz, can cause what is known as a beat frequency. This is often perceived as lighter and darker horizontal bands which appear to move up or down the display. The effect can be seen dramatically if a photograph of any display is taken with a decent camera using a fast shutter speed: a dark band will be seen across the screen. The edges of the band will be poorly defined in a photo of a TV, and much more clearly defined on a monitor with a high refresh rate and low-persistence "phosphors" or "pixels".
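
The beat frequency is simply the difference between the two rates; a one-liner makes it concrete (example numbers are my own):

Code:

    # Beat between a monitor's refresh rate and fluorescent-light flicker.
    refresh_hz = 75   # example monitor refresh
    lighting_hz = 60  # North American mains, per the explanation above
    beat_hz = abs(refresh_hz - lighting_hz)
    print(beat_hz, "Hz")  # a 15 Hz band drifting across the screen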

