Hi all
I can help on this one
The 'refresh rate' on a monitor refers to the vertical scan rate. (Simplified explanation follows.) The image is created (painted) from left to right, line by line, from top to bottom, so there are really two things at work here. Let's use a refresh rate of 60 Hz (60 top-to-bottom scans per second) as an example, and let's further assume there are 525 horizontal lines on the screen. (It just so happens that these are the old NTSC TV rates from the analog days.) That means the horizontal scan rate would be 525 * 60, or 31,500 horizontal scans per second. In the old days (I mean just after the dinosaurs died off and vacuum tubes were the state of the art in electronics) those rates were a bit hard to handle, so they did some clever stuff. The vertical scan frequency was maintained at 60 Hz, but the horizontal scan rate was set to 15,750 Hz. How did this work, you ask? Interlacing is the answer. On one vertical sweep the horizontal scans painted every second line, in the pattern 1-3-5-7-9 and so on; on the next vertical sweep a small vertical offset was introduced so that lines 2-4-6-8 and so on were painted. This meant that a field was painted 60 times per second (a complete picture 30 times per second), the horizontal scan was a reasonable 15,750 Hz, and the actual video signal (the light and dark areas along each horizontal line) could be kept between a few Hz and a reasonable 2 MHz (million hertz) or so. This is called the video bandwidth; in this case you would say the signal has a 2 MHz bandwidth.
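If it helps to see the arithmetic and the interlaced line order spelled out, here is a small Python sketch using the example numbers above (60 Hz vertical rate, 525 lines). These are just the illustrative NTSC-style figures from the explanation, not the specs of any particular monitor.

    # Scan-rate arithmetic for the example above (illustrative NTSC-style numbers).
    VERTICAL_RATE_HZ = 60   # top-to-bottom sweeps per second
    TOTAL_LINES = 525       # horizontal lines in a complete picture

    # Non-interlaced (progressive): every line is painted on every vertical sweep.
    progressive_h_rate = VERTICAL_RATE_HZ * TOTAL_LINES   # 31,500 Hz
    # Interlaced: each vertical sweep paints only every second line (one "field"),
    # so the horizontal scan rate is halved.
    interlaced_h_rate = progressive_h_rate // 2           # 15,750 Hz

    print(f"Progressive horizontal rate: {progressive_h_rate} Hz")
    print(f"Interlaced horizontal rate:  {interlaced_h_rate} Hz")

    # Order the lines get painted with interlacing: odd field first, then even field.
    odd_field = list(range(1, TOTAL_LINES + 1, 2))    # 1, 3, 5, 7, 9, ...
    even_field = list(range(2, TOTAL_LINES + 1, 2))   # 2, 4, 6, 8, ...
    print("Odd field starts:", odd_field[:5], " Even field starts:", even_field[:5])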
As the dino bones further fossilized, handling much higher frequencies became more practical and cheaper. TVs didn't change much, though, due to backward compatibility and commercial concerns. Computer monitors did, and today they have far higher video bandwidth than all but the very best TVs. This is quite evident in the way fine detail appears on a proper monitor vs. a TV. Try to read an 8-point font on a 24-inch monitor and compare how that looks on a comparably sized TV. (It's all fuzzy on the TV.) No, that is not your fossilizing eyes playing tricks; it is the bandwidth/frequency response of the video components.
The bottom line is that a computer monitor is a far more capable display device than your humble TV (unless you have a VERY expensive TV that is essentially a computer monitor with a tuner anyway.)
The video stutter that we are all familiar with is much more a product of other limitations. There is a WHOLE BUNCH of data required to make up a picture on a modern monitor. All of that data has to come from somewhere (usually your network card or disk drive) and be funneled through many potential bottlenecks on its way to your display. The most common bottleneck is your internet connection itself. Other limitations may be your processor, your hard drive, the amount of memory in your PC, the backplane (motherboard) speeds, and so on.
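To get a feel for just how much data that is, here is a rough back-of-the-envelope Python sketch. The 1920x1080 resolution, 24 bits per pixel, and 60 frames per second are assumed example values for a "modern monitor", not figures from anyone's actual setup.

    # Rough data rate for an uncompressed picture stream.
    # Resolution, bit depth, and frame rate below are assumed example values.
    WIDTH, HEIGHT = 1920, 1080    # pixels
    BITS_PER_PIXEL = 24           # 8 bits each for red, green, and blue
    FRAMES_PER_SECOND = 60

    bits_per_frame = WIDTH * HEIGHT * BITS_PER_PIXEL
    bits_per_second = bits_per_frame * FRAMES_PER_SECOND

    print(f"One frame:  {bits_per_frame / 8 / 1e6:.1f} MB uncompressed")
    print(f"One second: {bits_per_second / 1e9:.2f} Gbit/s uncompressed")
    # Roughly 3 Gbit/s -- far more than a typical internet connection delivers,
    # which is why video arrives heavily compressed and why the bottlenecks
    # above (network, disk, CPU, memory) matter more than the refresh rate.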
Any vertical refresh rate much beyond 85 Hz is really just specsmanship, because it is beyond what the human eye can detect (persistence of vision). What now separates the good from the truly great is the video bandwidth. This is directly responsible for the level of detail and for the monitor's ability to correctly display a fast-moving object. (Think hockey puck.)
Now for the purists... This is a simplified (not perfect) explanation and does not discuss the color burst signal, the SAP, the horizontal sync pulse, the vertical blanking pulse, the horizontal blanking pulse, video blanking, the time signal, various in-band control signals, etc. ad nauseam, nor the nuances of flat panel displays. Please, let's keep it relatively simple.
Enjoy
Gord