Coding Horror

programming and human factors

LCD Progress

After revisiting my ongoing three monitor obsession recently, I was compelled to upgrade my current mongrel mix of varying LCD monitor brands and sizes. I settled on three 20" Samsung 204B panels.

Standardizing on a single type of monitor in a multiple monitor configuration has obvious advantages in color consistency and uniform sizing. But LCD technology has also advanced quite a bit since my last LCD purchase. Here's a small chart outlining the relevant specs of every LCD panel I've ever owned:

                Samsung 191T   Rosewill R910J   Samsung 213T   Samsung 204B
Vintage         Mid 2003       Early 2005       Early 2005     Late 2006
Size            19"            19"              21.3"          20"
Price Paid      $660           $320             $840           $350
Resolution      1280x1024      1280x1024        1600x1200      1600x1200
Brightness      250 cd/m2      250 cd/m2        250 cd/m2      300 cd/m2
Contrast        500:1          600:1            500:1          800:1
Viewing Angle   170°           170°             170°           160°
Response Time   25 ms          25 ms            25 ms          5 ms

Yes, there are minor brightness and contrast bumps. But more importantly, there's been a huge improvement in response time. This addresses one of the key criticisms of LCD monitors:

The liquid crystals in an LCD have to change their molecular structure to manipulate light, and that's not a speedy process. As a result, LCD pixels respond much slower than what you may be used to on a CRT monitor, and that can cause ghosting and streaking, especially at high frame rates. The pixel response time of LCDs has improved dramatically over the years, but CRTs still have the edge. What's most worrying about pixel response times, however, is that LCDs with similar pixel response time specs don't always show the same performance in the real world. It's really something you have to check for yourself.

That was written in 2002. LCDs are larger and cheaper now, and getting larger and cheaper every day. The improvement in response time makes watching video and gaming on LCDs nearly equivalent to a CRT. Worries about dead pixels are generally a thing of the past, too. It's fair to say that LCDs have won the hearts, minds, and wallets of the public in 2006. It's difficult to find places that even sell CRTs these days.

I'm glad the era of the CRT is finally over. Not just because LCDs are more svelte and power efficient, but also because LCD monitors are much less finicky than CRT monitors. Getting great image quality with an LCD is dead simple. You only need to do three things:

  1. Always use the DVI port to connect your LCD.

    dvi-and-vga-ports.jpg

    The DVI port is digital, so you get a perfect connection to the LCD every time. Using a VGA port with an LCD is downright pathological-- it means you're converting the digital bits inside your video card to an analog video signal, then converting the analog video signal back to digital bits inside the LCD. You open yourself up to a whole host of nasty analog conversion issues along the way. Don't do this! Use the DVI port! If you own a video card that doesn't have two DVI ports, it's time for a new video card.

  2. Set your monitor's refresh rate to 60 Hertz.

    monitor-refresh-rate-dialog.png 

    Refresh rate has no real meaning to an LCD, but it can still cause problems if it's set to anything higher than 60 hertz. This ought to be set automatically when you connect an LCD panel, but it never is in my experience. So go in and lock the refresh rate down to 60 hertz.

  3. Set the display to the maximum native resolution.

    LCDs work best at their native resolution-- when there is one pixel on the LCD for every pixel on the screen. If you run a 1600x1200 LCD panel with a 1280x1024 desktop, you'll get a blurry, upsized image instead of the perfect digital clarity LCDs are capable of. The maximum native resolution is usually the maximum resolution available in the display dialog, but double check your monitor's manual if you're unsure. An LCD should look perfect. If it looks blurry at all, either you're using an analog VGA input, or you're using the wrong resolution. (There's a quick way to verify both the resolution and refresh rate settings, sketched just after this list.)
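
Here's a minimal sketch of that sanity check, assuming Windows and the pywin32 package; the native resolution below is hard-coded to 1600x1200 as an example, so adjust it for your panel.

```python
# Sanity check for LCD setup on Windows: is the desktop running at the panel's
# native resolution, with the refresh rate locked to 60 Hz?
# Assumes the pywin32 package (win32api / win32con).
import win32api
import win32con

NATIVE_RESOLUTION = (1600, 1200)  # e.g. a Samsung 204B; adjust for your panel

# None = primary display, ENUM_CURRENT_SETTINGS = the mode currently in use
devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)

print(f"Resolution:   {devmode.PelsWidth} x {devmode.PelsHeight}")
print(f"Refresh rate: {devmode.DisplayFrequency} Hz")

if (devmode.PelsWidth, devmode.PelsHeight) != NATIVE_RESOLUTION:
    print("Warning: not running at the panel's native resolution")
if devmode.DisplayFrequency != 60:
    print("Warning: refresh rate is not locked to 60 hertz")
```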

The digital connection between PC and LCD is about as good as it gets, right out of the box. Contrast this with the experience of hooking up a new CRT: to get the best image quality, you had to tweak the refresh rate, the image sizing, the pincushion adjustment, and a half-dozen other tedious, fiddly little analog settings.

But even with the response time issue largely addressed, LCDs do have a few other display peculiarities that linger on:

Viewing angle. When viewed from the side, above, or below, images on LCD monitors become noticeably darker, and colors start to get washed out. CRTs, on the other hand, can be viewed from extreme angles with little loss in actual picture quality. Admittedly, there are few areas where viewing angle makes a big difference for end users, but the limitation is worth noting. If, for example, you want to watch a DVD on your LCD with a group of friends, everyone is going to have to get real cozy with each other on the couch to see things properly. Limited viewing angles might not be a bad excuse to get a little closer to your date, but your buddy that's just over to watch Office Space may object to you rubbing up against his leg like that.

Color reproduction. Although LCD screens claim support for 32-bit color, the displays themselves often aren't capable of accurately reproducing all 16.7 million colors of common 32-bit graphics modes. With a properly calibrated LCD, a casual user probably won't notice the difference, but the limitation will probably give graphic designers fits.

Contrast ratio.  LCDs are back-lit whenever they're on, which means that TFT panels have to orient the liquid crystals to block light if they want to display black. Some light inevitably manages to seep through the cracks, which limits a screen's ability to display a true black.

Some of these peculiarities are only of interest to hardcore graphic designers. I'm assuming that most modern LCDs are capable of displaying the full 32-bit range of color by now. Regardless, color calibration remains as much an art as a science, and adjusting colors is difficult on LCDs.

I've also noticed backlight problems on every LCD I've owned, including the new models that just arrived. Blacks are never quite as deep as they would be on a CRT. And the backlighting is never perfectly uniform. I tend to see gradations in color and brightness on LCDs where there shouldn't be any.

As for viewing angle, this is more of a problem for LCD televisions than monitors. In computing scenarios, we tend to sit much closer to the monitors, and in a fixed location. If you're a hardcore graphic designer, you can buy extremely high end, extremely expensive LCD monitors which attempt to resolve the color and uniformity problems endemic to most LCDs. But these specialty monitors do nothing for viewing angle, and they tend to sacrifice response time, making the ghosting and streaking problems even worse.

Is it possible to calibrate an LCD? You can get some ideas of what you might want to calibrate in CNET's description of their LCD monitor testing methodology. Software like Displaymate or MonitorTest can even walk you through the process. But the earlier good news-- that LCD displays generally don't need to be adjusted to produce good image quality-- is also the bad news here. There's not much you can adjust on a digitally connected LCD, other than the brightness and color temperature. But that's generally enough to calibrate the essentials: brightness and gamma.

After you're done calibrating, it's time to clean all that dust off your LCD. I've been wary of cleaning my LCD panels, because I'm afraid of damaging the anti-glare or glossy (laptop) coatings. Soap and water leaves massive streaks, and caustic window cleaners are clearly out. I was recently turned on to the Monster ScreenClean display cleaning kit, which works wonderfully. It cleans almost effortlessly without streaks, and it's a screen-friendly non-alcohol formulation. It's almost like screen lube.

LCDs still have a way to go before they're perfect substitutes for CRTs. With the recent industry advances in response time, at least LCDs no longer have any glaring weaknesses. Here's hoping that improvements continue to keep pace in viewing angle and backlighting.


Is Your Database Under Version Control?

When I ask development teams whether their database is under version control, I usually get blank stares.

The database is a critical part of your application. If you deploy version 2.0 of your application against version 1.0 of your database, what do you get? A broken application. And that's why your database should always be under source control right next to your application code. You deploy the app, and you deploy the database. Like peanut butter and chocolate, they are two great tastes that taste great together.

At Vertigo, we rolled our own tool to reverse engineer the database into a set of files, which we then check into source control. I've visited other customers that did the same thing. But why write what you can buy? Leon Bambrick lists a number of great tools you can purchase to help you get your database under version control where it belongs. Unfortunately, he omitted one of the best tools: Microsoft's new Team Edition for Database Professionals.
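
For the curious, here's a rough sketch of what that roll-your-own reverse engineering can look like, assuming Python with the pyodbc package; the connection string, output folder, and column-only scripting are illustrative simplifications, not the tool we actually used.

```python
# Rough sketch of a roll-your-own "reverse engineer the database into files"
# tool: script each table's columns out of INFORMATION_SCHEMA into one .sql
# file per table, ready to check into source control.
# Assumes the pyodbc package; connection string and output folder are examples.
import os
import pyodbc

CONN_STR = "DRIVER={SQL Server};SERVER=localhost;DATABASE=MyApp;Trusted_Connection=yes"
OUTPUT_DIR = "database/tables"

os.makedirs(OUTPUT_DIR, exist_ok=True)
cursor = pyodbc.connect(CONN_STR).cursor()
cursor.execute("""
    SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    ORDER BY TABLE_NAME, ORDINAL_POSITION
""")

tables = {}
for table, column, data_type, nullable in cursor.fetchall():
    null_clause = "NULL" if nullable == "YES" else "NOT NULL"
    tables.setdefault(table, []).append(f"    [{column}] {data_type} {null_clause}")

# One CREATE TABLE script per table keeps source control diffs readable.
# (A real tool would also script column lengths, keys, indexes, and procedures.)
for table, columns in tables.items():
    with open(os.path.join(OUTPUT_DIR, f"{table}.sql"), "w") as f:
        f.write(f"CREATE TABLE [{table}] (\n" + ",\n".join(columns) + "\n)\n")
```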

a database under version control

Team Edition for Database Professionals goes far beyond mere reverse engineering of the database into files. You get an industrial-strength database project that you can add to your solution, along with a few other goodies:

  • Create test data. A blank database schema isn't particularly useful to develop against. Now you can distribute your database schema along with one-click synthetic data generation plans. With flexible synthetic data generators, you can avoid dumping production data to developers, or, God forbid, letting developers fend for themselves by creating their own test data. And you can generate 1,000 rows or 100,000 rows. I wish I had a dollar for every time an application I've worked on began to have performance problems because none of the developers had the foresight to test the app with more than a few rows of crappy, manually entered test data. Data generation is a huge boost to development quality.
  • Schema comparison. If we can compare two files in source control, why can't we compare two tables? A robust schema comparison tool is essential. Not sure why staging is different from production? Run a quick schema compare on 'em (a bare-bones version of the idea is sketched just after this list). Did I mention it also creates a real-time update script every time you do a comparison.. which it can execute with one additional click?
  • Data comparison. If your testers are complaining because they entered test data that causes your application to crash, run the data compare tool to determine exactly how their data differs from yours.
  • Database unit testing. If you make a change to the database schema, how do you know if you've broken any applications that rely on that database? You know because you've written unit tests that validate the database. Right? Right?
  • Refactoring. You can rename entities in the database (columns, tables, procs, etc) and have the rename automatically cascade throughout the database.
  • Integrated T-SQL editor. T-SQL is now a first class language construct in the IDE, just like C# and VB.NET. Run queries and view execution plans and stats, all without leaving the cozy confines of good old Visual Studio.
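
Here's the bare-bones schema comparison idea mentioned above, as a sketch-- assuming the pyodbc package and two hypothetical DSNs. The real tool compares far more than column lists, and generates the update script for you.

```python
# Bare-bones schema compare: diff the (table, column, type) sets of two
# databases, e.g. staging vs. production. Assumes the pyodbc package; the
# DSNs below are hypothetical placeholders.
import pyodbc

def schema_snapshot(conn_str):
    """Return a set of (table, column, data_type) tuples for one database."""
    cursor = pyodbc.connect(conn_str).cursor()
    cursor.execute(
        "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.COLUMNS"
    )
    return {tuple(row) for row in cursor.fetchall()}

staging = schema_snapshot("DSN=Staging")
production = schema_snapshot("DSN=Production")

for item in sorted(staging - production):
    print("Only in staging:   ", item)
for item in sorted(production - staging):
    print("Only in production:", item)
```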

It's a great tool.. if you're a Microsoft shop, and you're using SQL Server. I highly recommend downloading the trial edition. But the specifics of the tool aren't important; get your database under version control by any means necessary. Making your database a first-class citizen in source control seems totally obvious in retrospect. Now if only we could convince more developers to actually do it.


Printer and Screen Resolution

A recurring theme in Edward Tufte's books is the massive difference in resolution between the printed page and computer displays. Printed pages lend themselves to vastly greater information density.

Sparklines are one particular technique of Tufte's designed to exploit the greater resolution of the printed page. I was curious just how profound the difference in resolution is between a computer screen and a book, so I scanned in a sparkline from Beautiful Evidence with my aging Epson 1200U scanner.

sparkline-book-size.png

This is what the sparkline looks like on the page, roughly. It's quite small. The actual size will depend on the resolution of the display you're using to view this, of course, but it's in the ballpark at 1280x1024 and 1600x1200.

Here's the very same sparkline at the maximum resolution of my scanner, 1200 DPI:

sparkline-scanned-1200dpi.jpg

sparkline-scanned-text-1200dpi.jpg

Of course, Beautiful Evidence was commercially printed, which is typically very high resolution-- on the order of 2400 DPI. Let's try something a little less sophisticated. Here's a bit of text in 8 point Gill Sans MT that I printed on our decrepit old NEC 870 printer and scanned back in at 1200 DPI:

printer-600dpi-text.jpg

This is output from a 600 DPI printer that was originally released more than 7 years ago.

For comparison, a typical computer display is between 72 and 100 DPI. But that doesn't stop us from trying:

Airline dashboard sparklines

As Tufte promised, the difference in resolution between the most expensive computer display you can buy and a cheap off-brand printer really is astronomical. We have a long, long way to go before computer displays can get anywhere near printer resolutions.
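
To put a rough number on "astronomical", here's the pixel-density arithmetic using the figures quoted above-- about 100 DPI for a good display, 600 DPI for the old laser printer, and 2400 DPI for commercial printing.

```python
# Pixel-density arithmetic for the DPI figures quoted above. Density is per
# square inch, so the gap grows with the square of the DPI ratio.
display_dpi = 100   # a good computer display
printer_dpi = 600   # the old NEC 870 laser printer
press_dpi = 2400    # typical commercial printing

for name, dpi in [("600 DPI laser printer", printer_dpi),
                  ("2400 DPI commercial press", press_dpi)]:
    ratio = (dpi / display_dpi) ** 2
    print(f"{name}: ~{ratio:.0f}x the dots per square inch of a 100 DPI display")

# prints ~36x for the laser printer and ~576x for the commercial press
```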


Moore's Law in Practical Terms

There are two popular formulations of Moore's Law:

The most popular formulation [of Moore's Law] is the doubling of the number of transistors on integrated circuits every 18 months. At the end of the 1970s, Moore's Law became known as the limit for the number of transistors on the most complex chips. However, it is also common to cite Moore's Law to refer to the rapidly continuing advance in computing power per unit cost, because transistor count is also a rough measure of computer processing power.

The number of transistors on a CPU hasn't actually been doubling every 18 months; it's been doubling every 24 months. Here's a graph of the transistor count of each major Intel x86 chip family release from 1971 to 2006:

Intel x86 transistor counts, 1971-2006

The dotted line is the predicted transistor count if you doubled the 2,300 transistors from the Intel 4004 chip every two years since 1971.
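
That dotted line is nothing more than compounding, which you can sketch in a few lines:

```python
# The dotted "predicted" line: start with the Intel 4004's 2,300 transistors
# in 1971 and double every two years.
def predicted_transistors(year, base=2300, base_year=1971, doubling_years=2):
    return base * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1981, 1991, 2001, 2006):
    print(f"{year}: ~{predicted_transistors(year):,.0f} transistors")

# 2006 works out to roughly 425 million -- the right order of magnitude for
# the x86 chips at the end of the graph.
```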

That's why I prefer the second, looser definition of Moore's law: dramatic increases in computing power per unit cost. If you're a stickler for detail, there's an extensive investigation of Moore's law at Ars Technica you can refer to.

But how do we correlate Moore's Law-- the inexorable upward spiral of raw transistor counts-- with performance in practical terms? Personally, I like to look at benchmarks that use "typical" PC applications, such as SysMark 2004. According to page 14 of this PDF, SysMark 2004 scores are calibrated to a reference system: a Pentium 4 2.0 GHz. The reference system scores 100. Thus, a system which scores 200 in SysMark 2004 will be twice as fast as the reference system.

So, what was the first new CPU to double the performance of the SysMark 2004 reference system? The Pentium 4 "Extreme Edition" 3.2 GHz comes awfully close, scoring 197 on the SysMark 2004 office benchmark in this set of Tom's Hardware benchmarks. Let's compare the release dates of these two CPUs:

Pentium 4 2.0 GHz      August 27th, 2001
Pentium 4EE 3.2 GHz    November 3rd, 2003

It took 26 months to double real world performance in SysMark 2004. That tracks almost exactly with the doubling of transistor counts every 24 months.
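
As a back-of-the-envelope check, treat that 197 score as a 1.97x speedup achieved over roughly 26 months and solve for the implied doubling time:

```python
# Back-of-the-envelope check: a SysMark score of 197 is a 1.97x speedup over
# the 100-point reference system, achieved in roughly 26 months
# (August 2001 to November 2003). What doubling period does that imply?
import math

speedup = 1.97
months = 26

implied_doubling_months = months * math.log(2) / math.log(speedup)
print(f"Implied doubling time: {implied_doubling_months:.1f} months")  # ~26.6
```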

This isn't a perfect comparison, since other parts of the PC get faster at different rates. But it's certainly a good indication that CPU transistor count is a fairly reliable indicator of overall performance.


Joining The Prestigious Three Monitor Club

I have something in common with Bill Gates and Larry Page:

Larry Page: I have a weird setup in my office. I have one computer with three monitors: one flat-screen monitor and two regular ones. I have my browser on one screen, my schedule on another and my e-mail on another. I can drag things to different screens. I also have a projector. So if I'm talking with everyone in my office, I can move stuff onto a big screen.

Bill Gates: If you look at this office, there isn't much paper in it. On my desk I have three screens, synchronized to form a single desktop. I can drag items from one screen to the next. Once you have that large display area, you'll never go back, because it has a direct impact on productivity.

We're all members of the three monitor club.

If you're only using one monitor, you are cheating yourself out of potential productivity. Two monitors is a no-brainer. It's so fundamental that I included it as a part of the Programmer's Bill of Rights.

But you can do better.

As good as two monitors is, three monitors is even better. With three monitors, there's a "center" to focus on. And 50% more display area. While there's certainly a point of diminishing returns for additional monitors, I think three is the sweet spot. Even Edward Tufte, in the class I recently attended, explicitly mentioned multiple monitors. I don't care how large a single display can be; you can never have enough desktop space.

Normally, to achieve three monitors, you have to either:

  1. Buy an exotic video card that has more than 2 monitor connections.
  2. Install a second video card.

The first option is difficult because video cards with 3+ monitor connections are quite rare and usually expensive to boot. The second option, adding an additional video card, is easier, but not without some compatibility pitfalls of its own. But there's a third way that may be easiest of all. The Matrox TripleHead2Go is a neat little external device that provides three display support from a single video output. And it's now available in analog VGA and digital DVI editions.

matrox triplehead2go

There is one big caveat, however. In a modern three monitor config, the operating system sees each monitor as an independently controllable desktop. You can set resolution, size, and position of the monitors independently, and windows can intelligently size themselves to each desktop on each monitor.

display properties, three monitors

With the Matrox TripleHead2Go, you're stuck with one mongo giant desktop that spans all your monitors.

display properties, one ultra-ultra-widescreen monitor

This is a very old-school way of implementing multiple monitors. Before Windows was properly aware of multiple displays (think NT 4.0 era), the "giant desktop" was the only way you could get more than one display to work at all. And "giant desktop" has a lot of downsides:

  1. Maximizing a window becomes an exercise in futility.
  2. You may not want your start menu on the leftmost monitor.
  3. With two monitors, "centered" dialogs split the middle.

To avoid the many problems of "giant desktop" fakery, you really need the OS to know that you're using two or three physical monitors, along with their resolutions, positions, and so forth.
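
If you're curious how the OS actually sees your displays, here's a small sketch assuming Windows and the pywin32 package: with a true multi-monitor setup, each display reports its own bounds and position; with a TripleHead2Go you'll see a single, very wide "monitor" instead.

```python
# How does Windows see the attached displays? With a true multi-monitor setup,
# each physical monitor reports its own bounds and position; with a
# TripleHead2Go-style setup you get a single, very wide "monitor" instead.
# Assumes the pywin32 package (win32api).
import win32api

for monitor_handle, _, _ in win32api.EnumDisplayMonitors():
    info = win32api.GetMonitorInfo(monitor_handle)
    left, top, right, bottom = info["Monitor"]
    primary = " (primary)" if info["Flags"] & 1 else ""
    print(f"{info['Device']}: {right - left} x {bottom - top} at ({left}, {top}){primary}")
```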

But the Triplehead2Go has its charms. You don't have to open your computer to install it, for one thing. And it works with computers that can't be opened, such as laptops. The Triplehead2Go abstracts away the multiple monitors at a hardware level and presents itself to the operating system as a giant ultra widescreen monitor.

The Triplehead2Go also has one unexpected strength: video games. 3D acceleration across multiple video cards is tricky at best. And there's no good way to tell a game to use multiple monitors unless it's explicitly coded to do so. The Triplehead2Go device neatly sidesteps both of these limitations by externally simulating one ultra-ultra-wide monitor.

PC Format UK experimented with the TripleHead2Go in a couple of recent and upcoming game titles, such as Armed Assault:

Armed Assault on 3 monitors

Armed Assault also takes advantage of an optional TrackIR head-tracking device. The combination of a three-monitor setup with head tracking is incredibly immersive, and has to be seen to be believed. Watch the video and be amazed.

Three-monitor setups are particularly strong in "simulator" type games where peripheral vision is crucial to gameplay. PC Format UK tried it with the recently released GTR2 racing game, and the results are impressive.

GTR2 on three monitors

There are lots more screenshots of various games running in triple-head mode at Tom's Hardware and Neoseeker.

The traditional "add another video card" method is still the preferred way to gain entry into the prestigious three monitor club. But for laptop users and gamers, the Matrox Triplehead2Go is also a nice option with a few caveats.
