Coding Horror

programming and human factors

Is DVD the new VHS?

I recently took the plunge and upgraded to a plasma television, mostly because I want a decent native resolution for my home theater PC under Windows Media Center Edition 2005.

Analog televisions don't do 640x480 very well, and can barely be coaxed into legible 800x600. HDTV and EDTV sets, however, fare much better as computer displays. Basic EDTV starts at a true ~850x480, and fancier HDTV sets go up from there, all the way to 1024x768 or more. And I can corroborate this: it really works. I chose a plasma television with a VGA port as standard equipment, so I just hooked up the PC's VGA cable and tweaked the available resolutions slightly in the display control panel. Interestingly, the current nVidia drivers support HDTV resolutions and custom timing adjustments natively. My basic EDTV set produces an amazing pixel-sharp HTPC image at its maximum resolution of 852x480. It's no contest compared to the flickery, blurry, sorta-visible 800x600 output I had on the analog set!

There is one unfortunate side effect of this amazing output quality. Compared to the Windows Media HD encoded version of the Terminator 2 Extreme Edition, or even the boring old Windows Media Center Edition calibration videos, DVDs look quite blurry. It's astonishing how good these high bit rate videos look; the difference is truly profound. MPEG2 is starting to show its age, I suppose, but it is sobering to find out firsthand that even 852x480 is more resolution than you need for DVD playback.
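The pixel math bears this out. A quick back-of-the-envelope sketch (assuming the standard NTSC DVD frame of 720x480, and the common WMV HD resolutions of 1280x720 and 1920x1080):

```python
# Compare source resolutions against an 852x480 EDTV panel.
# Resolutions assumed: NTSC DVD stores 720x480; WMV HD titles
# were commonly mastered at 1280x720 or 1920x1080.
sources = {
    "NTSC DVD (MPEG-2)": (720, 480),
    "WMV HD 720p": (1280, 720),
    "WMV HD 1080p": (1920, 1080),
}

panel_w, panel_h = 852, 480
panel_pixels = panel_w * panel_h  # 408,960 pixels

for name, (w, h) in sources.items():
    pixels = w * h
    ratio = pixels / panel_pixels
    print(f"{name}: {w}x{h} = {pixels:,} px ({ratio:.2f}x the panel)")
```

A DVD frame carries about 345,600 pixels, roughly 85% of what the 852x480 panel can show, so DVD playback can't fully exercise the display; the HD sources carry 2x to 5x more pixels than the panel, which is why they look so much sharper even after downscaling.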

Written by Jeff Atwood

Indoor enthusiast. Co-founder of Stack Exchange and Discourse. Disclaimer: I have no idea what I'm talking about. Find me here: http://twitter.com/codinghorror