Luke-Jr wrote:
> On Saturday 25 March 2006 14:31, Charles Steinkuehler wrote:
>> Yes, the video cards will output analog interlace video on the VGA connectors...it will be RGB (instead of the more typical YUV used for video), but modern video cards will output just about *ANYTHING* as far as the field/frame structure goes.
>> Remember, various VGA resolutions used to be interlaced, back in the day (when RAMDACs and monitors couldn't support the higher frequencies required for progressive at high resolutions).
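For the curious, here's a rough back-of-the-envelope sketch of that frequency argument, in Python. The numbers below are round 1080-line figures I'm using purely for illustration (not exact VESA/CEA timings), but they show why an interlaced mode only needs about half the horizontal scan rate of the equivalent progressive mode:

# Interlace sends half the scanlines per vertical pass, so the line
# (horizontal) rate -- and with it the pixel clock -- comes out at
# roughly half of what progressive needs at the same resolution and
# refresh rate.

TOTAL_LINES = 1125   # 1080 active lines plus typical vertical blanking

def line_rate_khz(vertical_rate_hz, lines_per_pass):
    """Horizontal scan rate in kHz for a given vertical refresh."""
    return vertical_rate_hz * lines_per_pass / 1000.0

interlaced  = line_rate_khz(60, TOTAL_LINES / 2)  # 60 fields/s, half the lines each
progressive = line_rate_khz(60, TOTAL_LINES)      # 60 full frames/s

print("1080i60 line rate: %.2f kHz" % interlaced)    # ~33.75 kHz
print("1080p60 line rate: %.2f kHz" % progressive)   # ~67.50 kHz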
> But will a modern monitor use the interlaced data properly?
Yes.
> eg, draw the screen field-by-field in a viewable fashion?
Um...actually what a modern (ie: flat-panel LCD or plasma) monitor does is dump the incoming data into an internal frame-buffer that is then scaled to the monitor's native resolution (there are *VERY* few "HD" flat panels that actually have 1920x1080 pixel resolution), in the process kind of glossing over any interlace issues.
There are odd interlace artifacts from this conversion that you won't see on a classic "tube" display (the reason I have a real CRT display next to me right now for testing), and of course interlace itself has its own artifacts. Anyway, from the layman's perspective, it "just works", eg: draws the screen *FRAMES* in a viewable fashion.
The details of exactly how this happens vary from monitor to monitor, but pretty much all displays other than classic CRTs (eg: LCD, plasma, DLP) will be de-interlacing, and likely scaling and frame-syncing the input signal to match the native pixel resolution and display rate (typically 59.94 or 60 Hz for North American equipment).
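If you want a feel for the simplest form of what the panel electronics are doing, here's a toy "weave" de-interlacer in Python/numpy (my own illustration, not any particular scaler chip): it just interleaves the two fields back into one progressive frame, which is exactly where the combing artifacts on motion come from. Real displays typically do something smarter (motion-adaptive bob/weave, etc.) and then scale the result to the panel's native resolution.

import numpy as np

def weave(top_field, bottom_field):
    """Interleave two fields (each half-height) into one progressive frame.

    top_field carries the topmost scanline and every second line after it
    (frame lines 0, 2, 4, ... counting from zero); bottom_field carries
    the lines in between.  Field order varies by source, so don't take
    this particular assignment as gospel.
    """
    half_height, width = top_field.shape
    frame = np.empty((half_height * 2, width), dtype=top_field.dtype)
    frame[0::2] = top_field      # lines 0, 2, 4, ...
    frame[1::2] = bottom_field   # lines 1, 3, 5, ...
    return frame

# Toy usage: two 540x1920 fields -> one 1080x1920 frame
top    = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape)   # prints (1080, 1920)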
--
Charles Steinkuehler
charles@steinkuehler.net