Coding Horror

programming and human factors

Gifts for Geeks, 2011 Edition

Between founding Stack Overflow (and later, running Stack Exchange) and having a child, I haven't had much time to blog about the holidays for a few years now. The last Gifts for Geeks I did was in 2008. Those recommendations are still as valid as ever, but I just couldn't muster the enthusiasm to do it every year.

I've also come to realize, especially after having a child, that the goal in life is not to own a lot of "stuff", but rather, to free yourself from everything except that which is essential, and that which you love.

I'm still working on this, and I probably will be until I die. That said, there are a few essential things I think any self-respecting geek should have, things I use all the time and truly love – and I feel it's my responsibility to let my fellow geeks, and the spouses and significant others of geeks, know about them. Otherwise you might end up with yet another WiFi Detecting Shirt as a gift this year, and that'd just be … sad, for everyone involved. So consider this a public service, and feel free to share this post, lest you show up to work in January and find yourself and all your coworkers wearing WiFi Detecting Shirts.

As I wrote in What's On Your Utility Belt?, I've been carrying LED flashlights since 2005, and just in that time the average LED flashlight has gone from bright, to very bright, to amazingly bright, to ridiculously blinding laser-like bright. You can thank Haitz's Law for that:

[Haitz's Law] states that every decade, the cost per lumen (unit of useful light emitted) falls by a factor of 10, and the amount of light generated per LED package increases by a factor of 20, for a given wavelength (color) of light. It is considered the LED counterpart to Moore's law, which states that the number of transistors in a given integrated circuit doubles every 18 to 24 months. Both laws rely on the process optimization of the production of semiconductor devices.

Or, as I like to call it, "why you will be regularly blinded by flashlights for the rest of your natural life." But on the plus side, it also means that today even inexpensive LED flashlights are plenty bright for all but the most niche applications. You no longer have to pay a big premium to get one that's usefully bright. LED lights are so awesome, in fact, that I own and recommend no less than three form factors:

  1. Fenix HL21 ($35)

    [image: Fenix HL21 headlamp]

    If you do any kind of DIY work at all, at some point you're going to want a focused light exactly where you are looking. If you can get over the "hey, I have this lamp strapped to my head and I look like a dork" factor, headlamps are ridiculously convenient. I had a much less bright (~40 lumens) headlamp and switching to this 90 lumen HL21 was a major improvement. I use this thing all the time. Looking cool is overrated.

  2. Fenix E21 ($35)

    [image: Fenix E21 flashlight]

    The E21 is much smaller than your typical full-size flashlight, but it is every bit as bright as those giant police-baton-like Maglites. It runs off two ubiquitous AA batteries, and has a pleasingly simple design, with an obvious switch in the rear and only two configurable light levels: low (48 lumens) and high (150 lumens). This is a flashlight you could buy your parents without baffling them. We own three, and each of our cars has one in the glove box. This is, in my opinion, what LED lights were meant to be.

  3. Fenix LD01R4 ($40)

    [image: Fenix LD01 keychain flashlight]

    The latest revision of the LD01 is the proverbial Every Day Carry: a compact single-AAA flashlight. As long as you have your keys with you, you'll never be without a reliable, bright-enough light. Twist the cap to balance between runtime and light output; the three modes are 85 lumens for 1 hour, 28 lumens for 3.5 hours, and 9 lumens for 11 hours. Pretty incredible from a single AAA battery! Oh, and I recommend lithium AAA batteries because they run longer and are 1/3 lighter than other types of batteries. Normally I wouldn't care, but the reduced weight is surprisingly noticeable in something you'll have in your pocket all the time.

All these LED lights have one thing in common: batteries. It's unavoidable. Because you're a responsible geek, of course you use modern rechargeable battery technology. And as I wrote in Adventures in Rechargeable Batteries, sophisticated battery chargers are like geek catnip.

[image: LaCrosse AlphaPower battery charger]

This is the LaCrosse BC1000 ($60), and it's a ton of fun to mess around with. Also, it recharges batteries. It might seem a little spendy, but it can do miraculous things like bring old nearly-dead rechargeable batteries back to life. And it comes with a bunch of actually useful accessories in the box:

  • Nylon carrying bag
  • 4 AA and 4 AAA rechargeable NiMH batteries
  • 4 C size battery adapters
  • 4 D size battery adapters

Yep, you can simulate C and D cells by putting the AA and AAA batteries inside the shells. The only battery type not represented here is the 9 volt. I own two of these LaCrosse chargers, and given the stupid number of AA and AAA powered devices in the house I'm thinking of buying a third. If you're a geek, you almost certainly have 99 battery problems, but armed with this baby, recharging ain't one. And don't forget the low self-discharge NiMH batteries, while you're at it.

Ah, the Dremel. I think this Canadian forumgoer expressed it best:

It truly is hard for me to express the joy I feel when I am forced to break out the dremel; the last resort, the "Trojan Horse" of tools. In a dark place when all other tools abandon me and leave me heartbroken, the dremel always provides a loving shoulder to help complete my tasks. The dremel is a very selfless tool, he/she has no purpose to which they cling, yet is always willing to assist its fellow tools in completing theirs...

Drill strip a screw? The dremel can help... The jigsaw leave some nasty edges? dremel can restore them. I like to think of the dremel as the Jesus of tools.

They say Jesus performed many miracles and although it's not thoroughly documented, I believe his first miracle was, in fact, the dremel blueprint (he was a carpenter after all). The good Lord presented me with an image in a dream... I would like to share it.

[image: Jesus with a Dremel]

If you don't own a Dremel, I'm sorry, but I'm going to have to ask you to turn in your geek card. The Dremel is truly the Swiss Army knife of DIY projects. Any DIY project.

I use my Dremel about once every few months, mostly for things that I probably shouldn't even be attempting. But that's the beauty of the Dremel. It doesn't judge; it just helps you get s**t done, by any means necessary. I don't recommend buying a big Dremel kit to start, because it's hard to tell which accessories you'll actually want or need until you begin using this insanely versatile tool. I suggest starting with the entry-level high power Dremel kit ($90).

[image: Dremel 4000 rotary tool kit]

Finally, I have to put in a mention for an updated version of what is probably the most frequently used thing on my keychain, with the biggest bang for the gram other than my front door key -- the Leatherman Squirt PS4 ($24).

[image: Leatherman Squirt PS4]

That's right, you no longer have to face the terrible existential conundrum of choosing between pliers or scissors. The new PS4 model now includes both pliers and scissors. This is nothing less than a Christmas miracle, people! (Oh yeah, and get this awesome tiny carabiner to attach it to your keychain so you can easily detach it when you need to bust it out.)

So that's it this year. Nothing extravagant. Nothing too expensive. No frills. Just essential stuff I love and use regularly. I hope you, or someone you love, will love them too.


Fast Approximate Anti-Aliasing (FXAA)

Anti-aliasing has an intimidating name, but what it does for our computer displays is rather fundamental. Think of it this way -- a line has infinite resolution, but our digital displays do not. So when we "snap" a line to the pixel grid on our display, we can compensate by imagineering partial pixels along the line, pretending we have a much higher resolution display than we actually do. Like so:

[image: aliased vs. anti-aliased 2D lines]

As you can see on these little squiggly black lines I drew, anti-aliasing produces a superior image by using grey pixels to simulate partial pixels along the edges of the line. It is a hack, but as hacks go, it's pretty darn effective. Of course, the proper solution to this problem is to have extremely high resolution displays in the first place. But outside of tiny handheld devices, I wouldn't hold my breath for that to happen any time soon.
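
That "grey pixels simulate partial pixels" trick is easy to demonstrate. Here's a minimal sketch in Python – not any real renderer's code, just the idea: rasterize a line at 4x the resolution with hard on/off pixels, then average each 4x4 block down to a single pixel whose grey level is the fraction of the block the line covered.

    import numpy as np

    def rasterize_line_hard(w, h, x0, y0, x1, y1):
        """Aliased rasterization: each pixel is either fully on or fully off."""
        img = np.zeros((h, w))
        for t in np.linspace(0.0, 1.0, max(w, h) * 4):
            x = int(round(x0 + t * (x1 - x0)))
            y = int(round(y0 + t * (y1 - y0)))
            if 0 <= x < w and 0 <= y < h:
                img[y, x] = 1.0
        return img

    def antialiased_line(w, h, x0, y0, x1, y1, factor=4):
        """Render at factor-x resolution, then box-filter down.
        The averaged result holds fractional coverage: grey pixels."""
        big = rasterize_line_hard(w * factor, h * factor,
                                  x0 * factor, y0 * factor,
                                  x1 * factor, y1 * factor)
        # Average each factor-by-factor block into one output pixel.
        return big.reshape(h, factor, w, factor).mean(axis=(1, 3))

    soft = antialiased_line(16, 16, 1, 2, 14, 9)
    print(sorted(set(np.round(soft, 2).ravel())))  # 0.0 plus fractional greys

This is the same supersampling idea that shows up below as SSAA; the only difference there is that an entire 3D scene, not a single line, gets rendered at the higher resolution.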

This also applies to much more complex 3D graphics scenes. Perhaps even more so, since adding motion amplifies the aliasing effects of all those crawling lines that make up the edges of the scene.

[image: no anti-aliasing vs. 4x anti-aliasing in a 3D scene]

But anti-aliasing, particularly at 30 or 60 frames per second in a complex, state-of-the-art game with millions of polygons and effects active, is not cheap. Per my answer here, you can generally expect a performance cost of at least 25% for proper 4x anti-aliasing. And that is for the most optimized version of anti-aliasing we've been able to come up with:

  1. Super-Sampled Anti-Aliasing (SSAA). The oldest trick in the book - I list it as universal because you can use it pretty much anywhere: forward or deferred rendering, it also anti-aliases alpha cutouts, and it gives you better texture sampling at high anisotropy too. Basically, you render the image at a higher resolution and down-sample with a filter when done. Sharp edges become anti-aliased as they are down-sized. Of course, there's a reason why people don't use SSAA: it costs a fortune. Whatever your fill rate bill, it's 4x for even minimal SSAA.

  2. Multi-Sampled Anti-Aliasing (MSAA). This is what you typically have in hardware on a modern graphics card. The graphics card renders to a surface that is larger than the final image, but in shading each "cluster" of samples (that will end up in a single pixel on the final screen) the pixel shader is run only once. We save a ton of fill rate, but we still burn memory bandwidth. This technique does not anti-alias any effects coming out of the shader, because the shader runs at 1x, so alpha cutouts are jagged. This is the most common way to run a forward-rendering game. MSAA does not work for a deferred renderer because lighting decisions are made after the MSAA is "resolved" (down-sized) to its final image size.

  3. Coverage Sample Anti-Aliasing (CSAA). A further optimization on MSAA from NVidia [ed: ATI has an equivalent]. Besides running the shader at 1x and the framebuffer at 4x, the GPU's rasterizer is run at 16x. So while the depth buffer produces better anti-aliasing, the intermediate shades of blending produced are even better.

Pretty much all "modern" anti-aliasing is some variant of the MSAA hack, and even that costs a quarter of your framerate. That's prohibitively expensive, unless you have so much performance you don't even care, which will rarely be true for any recent game. While the crawling lines of aliasing do bother me, I don't feel anti-aliasing alone is worth giving up a quarter of my framerate and/or turning down other details to pay for it.
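
To make those relative costs concrete, here's a rough back-of-the-envelope model in Python. The quantities are just proxies – "shader runs" for fill rate and "samples stored" for framebuffer memory and bandwidth – and the resolution is an arbitrary example, but it shows why SSAA costs a fortune while MSAA trades most of that shader cost for extra bandwidth.

    # Illustrative cost model only, not a benchmark: compare where
    # 4x SSAA and 4x MSAA spend their work at 1920x1080.
    WIDTH, HEIGHT, SAMPLES = 1920, 1080, 4
    PIXELS = WIDTH * HEIGHT

    configs = {
        # name: (shader runs per pixel, samples stored per pixel)
        "No AA":   (1, 1),
        "4x SSAA": (SAMPLES, SAMPLES),  # shade and store every sample
        "4x MSAA": (1, SAMPLES),        # shade once, store every sample
    }

    for name, (shade, stored) in configs.items():
        print(f"{name:8s} shader runs: {shade * PIXELS:>10,}   "
              f"samples stored: {stored * PIXELS:>10,}")

Neither number maps directly to framerate, of course, but it's clear why even the MSAA "hack" still hurts: you're reading and writing four times the framebuffer data every frame.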

But that was before I learned that there are some emerging alternatives to MSAA. And then, much to my surprise, these alternatives started showing up as actual graphics options in this season's PC games -- Battlefield 3, Skyrim, Batman: Arkham City, and so on. What is this FXAA thing, and how does it work? Let's see it in action:

[images: zoomed fragments of the same scene rendered with no AA, 4x MSAA, and FXAA]

FXAA stands for Fast Approximate Anti-Aliasing, and it's an even more clever hack than MSAA, because it ignores polygons and line edges, and simply analyzes the pixels on the screen. It is a pixel shader program documented in this PDF that runs every frame in a scant millisecond or two. Where it sees pixels that create an artificial edge, it smooths them. It is, in the words of the author, "the simplest and easiest thing to integrate and use".

[image: diagram of the FXAA algorithm]
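
To get a feel for the approach, here is a heavily simplified sketch in Python/numpy. This is emphatically not the real FXAA 3.x shader from the PDF – the real thing estimates edge direction and searches along the edge for its end points – but it captures the pixels-only philosophy: compute a luma value per pixel, flag pixels whose neighborhood contrast exceeds a threshold, and blend only those pixels toward their neighbors.

    import numpy as np

    def fxaa_ish(rgb, contrast_threshold=0.1):
        """rgb: float array of shape (H, W, 3) in [0, 1]. Returns a
        smoothed copy; a toy approximation of FXAA's idea."""
        # Per-pixel luma (a standard green-weighted approximation).
        luma = rgb @ np.array([0.299, 0.587, 0.114])

        # Luma of the four axis-aligned neighbors (image edges clamped).
        pad = np.pad(luma, 1, mode="edge")
        n, s = pad[:-2, 1:-1], pad[2:, 1:-1]
        w, e = pad[1:-1, :-2], pad[1:-1, 2:]

        # Local contrast: max minus min luma over the cross-shaped neighborhood.
        lo = np.minimum.reduce([luma, n, s, w, e])
        hi = np.maximum.reduce([luma, n, s, w, e])
        edge = (hi - lo) > contrast_threshold

        # Blend flagged pixels toward their neighborhood average;
        # leave everything else untouched.
        p = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
        blur = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] + rgb) / 5.0
        out = rgb.copy()
        out[edge] = 0.5 * rgb[edge] + 0.5 * blur[edge]
        return out

Because a filter like this only ever looks at finished pixels, it neither knows nor cares whether an edge came from a polygon, an alpha cutout, or a pixel shader effect – which is exactly where MSAA falls down.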

FXAA has two major advantages:

  1. FXAA smooths edges in all pixels on the screen, including those inside alpha-blended textures and those resulting from pixel shader effects, which were previously immune to the effects of MSAA without oddball workarounds.
  2. It's fast. Very, very fast. Version 3 of the FXAA algorithm takes about 1.3 milliseconds per frame on a $100 video card. Earlier versions were found to be double the speed of 4x MSAA, so you're looking at a modest 12 or 13 percent cost in framerate to enable FXAA -- and in return you get a considerable reduction in aliasing.

The only downside, and it is minor, is that you may see a bit of unwanted edge "reduction" inside textures or in other places. I'm not sure if it's fair to call this a downside, but FXAA can't directly be applied to older games; games have to be specifically coded to call the FXAA pixel shader before they draw the game's user interface, otherwise it will happily smooth the edges of on-screen HUD elements, too.

The FXAA method is so good, in fact, it makes all other forms of full-screen anti-aliasing pretty much obsolete overnight. If you have an FXAA option in your game, you should enable it immediately and ignore any other AA options.

FXAA is an excellent example of the power of simple hacks and heuristics. But it's also a great demonstration of how attacking programming problems from a different angle -- that is, rather than thinking of the screen as a collection of polygons and lines, think of it as a collection of pixels -- can enable you to solve computationally difficult problems faster and arguably better than anyone thought possible.


Bias Lighting

I've talked about computer workstation ergonomics before, but one topic I didn't address is lighting. We computer geeks like it dark. Really dark. Ideally, we'd be in a cave. A cave … with an internet connection.

[image: "You read my doormat" doormat]

The one thing that we can't abide is direct overhead lighting. Every time the overhead light gets turned on in this room, I feel like a Gremlin shrieking "Bright light! Bright light! Oh, how it burns!"

But there is a rational basis for preferring a darkened room. The light setup in a lot of common computing environments causes glare on the screen:

If your room's lit, as most are, by fittings hanging from the ceiling, you'll be wanting to set up your monitor so that you don't see reflections of the lights in it. The flat screens on many modern monitors (like the excellent Samsung I review here) help, because they reflect less of the room behind you. And anti-reflective screen coatings are getting better and better too. But lots of office workers still just can't avoid seeing one or more ceiling fluoros reflected in their screen.

A good anti-reflective coating can reduce most such reflections to annoyance level only. But if you can see lights reflected in your screen, you can probably also directly see lights over the top of your monitor. Direct line of sight, or minimally darkened reflected line of sight, to light sources is going to give you glare problems.

Glare happens when there are small things in your field of vision that are much brighter than the general scene. Such small light sources can't be handled well by your irises; your eyes' pupil size is matched to the overall scene illumination, and so small light sources will appear really bright and draw lines on your retinas. The more of them there are, and the brighter they are, the more work your eyes end up doing and the sooner they'll get tired.

While a darkened room is better for viewing most types of computer displays, it has risks of its own. It turns out that sitting in a dark room staring at a super bright white rectangle is … kind of bad for your eyes, too. It doesn't help that most LCDs come from the factory with retina-scorching default brightness levels. To give you an idea of how crazy the defaults truly are, the three monitors I'm using right now have brightness set to 25/100. Ideally, your monitors shouldn't be any brighter than a well-lit book. Be sure to crank that brightness level down to something reasonable.

You don't want total darkness; what you want is some indirect lighting – specifically, bias lighting. It helps your eyes compensate and adapt to bright displays.

"[Bias lighting] works because it provides enough ambient light in the viewing area that your pupils don't have to dilate as far. This makes for less eyestrain when a flashbang gets thrown your way or a bolt of lightning streams across the screen," he told Ars. "Because the display is no longer the only object emitting light in the room, colors and black levels appear richer than they would in a totally black environment. Bias lighting is key in maintaining a reference quality picture and reducing eye-strain."

Bias lighting is the happy intersection of indirect lighting and light compensation. It reduces eye strain and produces a better, more comfortable overall computing display experience.

[image: monitor with bias lighting glowing behind it]

The good news is that it's trivially easy to set up a bias lighting configuration these days due to the proliferation of inexpensive and bright LEDs. You can build yourself a bias light with a clamp and a fluorescent bulb, or with some nifty IKEA LED strips and double-sided foam tape. It really is that simple: just strap some lights to the back of your monitors.

I'm partial to the IKEA Dioder and Ledberg technique myself; I currently have an array of Ledbergs behind my monitors. But if you don't fancy any minor DIY work, there is a wide array of inexpensive self-adhesive LED strips out there – which also have the benefit of being completely USB powered, and thus can power up and down with your monitor or TV.

Of course, lighting conditions are a personal preference, and I'd never pitch bias lighting as a magic bullet. But there is science behind it, it's cheap and easy to try, and I wish more people who regularly work in front of a computer knew about bias lighting. If nothing else, I hope this post gets people to turn their LCD monitors down from factory brightness level infinity to something a tad more gentle on the old Mark I Eyeball.


On Parenthood

Our son was born March 12th, 2009. He's a little over two and a half years old. Now, I am the wussiest wuss to ever wuss up the joint, so take everything I'm about to say with a grain of salt – but choosing to become a parent is the hardest thing I have ever done. By far. Everything else pales in comparison.

My feelings on this matter are complex. I made a graph. You know, for the children.

[image: graph of my feelings about children]

That one percent makes all the difference.

It's difficult to explain children to people who don't yet have children, because becoming a parent is an intensely personal experience. Every child is different. Every parent is different. Every culture has their own way of doing things. The experience is fundamentally different for every new parent in the world, yet children are the one universally shared thing that binds our giant collective chain letter of human beings together, regardless of nationality and language. How do you explain the unexplainable?

Well, having children changes you. Jonathan Coulton likens it to becoming a vampire.

I was having a conversation with a friend who had recently become a parent, and she reminded me of something I had forgotten about since my daughter was born. She was describing this what-have-I-done feeling – I just got everything perfect in my life, and then I went and messed it all up by having a baby. I don’t feel that way anymore, but the thought certainly crossed my mind a few times at the beginning. Eventually you just fall in love and forget about everything else, but it’s not a very comfortable transition. I compare the process to becoming a vampire, your old self dies in a sad and painful way, but then you come out the other side with immortality, super strength and a taste for human blood. At least that’s how it was for me. At any rate, it’s complicated.

Maybe tongue in cheek, but not that far from the truth, honestly. Your children, they ruin everything in the nicest way.

Before Henry was born, I remembered Scott Hanselman writing this odd blurb about being a parent:

You think you love your wife when you marry her. Then you have a baby and you realize you'd throw yourself under a bus to save your baby. You can't love something more.

Nuts to that, I thought. Hanselman's crazy. Well, obviously he doesn't love his wife as much as I love mine. Sniff. Babies, whatever, sure, they're super cute on calendars, just like puppies and kittens. Then I had a baby. And by God, he was right. I wouldn't just throw myself under a bus for my baby, I'd happily throw my wife under that bus too – without the slightest hesitation. What the hell just happened to me?

As an adult, you may think you've roughly mapped the continent of love and relationships. You've loved your parents, a few of your friends, eventually a significant other. You have some tentative cartography to work with from your explorations. You form ideas about what love is, its borders and boundaries. Then you have a child, look up to the sky, and suddenly understand that those bright dots in the sky are whole other galaxies.

You can't possibly know the enormity of the feelings you will have for your children. It is absolutely fucking terrifying.

When I am holding Henry and I tickle him, I can feel him laughing all the way to his toes. And I realize, my God, I had forgotten, I had completely forgotten how unbelievably, inexplicably wonderful it is that any of us exist at all. Here I am with this tiny, warm body so close to me, breathing so fast he can barely catch up, sharing his newfound joy of simply being alive with me. The sublime joy of this moment, and all the other milestones – the first smile, the first laugh, the first "dada" or "mama", the first kiss, the first time you hold hands. The highs are so incredibly high that you'll get vertigo and wonder if you can ever reach that feeling again. But you peak ever higher and higher, with dizzying regularity. Being a new parent is both terrifying and exhilarating, a constant rollercoaster of extreme highs and lows.

It's also a history lesson. The first four years of your life. Do you remember them? What's your earliest memory? It is fascinating watching your child claw their way up the developmental ladder from baby to toddler to child. All this stuff we take for granted, but your baby will painstakingly work their way through trial and error: eating, moving, walking, talking. Arms and legs, how the hell do they work? Turns out, we human beings are kind of amazing animals. There's no better way to understand just how amazing humans are than the front row seat a child gives you to observe it all unfold from scratch each and every day, from literal square zero. Children give the first four years of your life back to you.

I wasn't sure how to explain meeting new people to Henry, so I decided to just tell him we've met a new "friend" every time. Now, understand that this is not at all the way I view the world. I'm extremely wary of strangers, and of new people in general with their agendas and biases and opinions. I've been burned too many times. But Henry is open to every person he meets by default. Each new person is worth greeting, worth meeting as a new experience, as a fellow human being. Henry taught me, without even trying to, that I've been doing it all wrong. I realized that I'm afraid of other people, and it's only my own fear preventing me from opening up, even a little, to new people that I meet. I really should view every new person I meet as a potential friend. I'm not quite there yet; it's still a work in progress. But with Henry's help, I think I can. I had absolutely no idea my child would end up teaching me as much as I'm teaching him.

Having a child is a lot like running a marathon. An incredible challenge, but a worthwhile and transformative experience. It leaves you feeling like you truly accomplished something for all that effort. After all, you've created something kind of amazing: a person.

Bob: It gets a whole lot more complicated when you have kids.

Charlotte: It's scary.

Bob: The most terrifying day of your life is the day the first one is born.

Charlotte: Nobody ever tells you that.

Bob: Your life, as you know it... is gone. Never to return. But they learn how to walk, and they learn how to talk, and you want to be with them. And they turn out to be the most delightful people you will ever meet in your life.

It's scary and it's wonderful in equal measure. So why not have another baby? Or so we thought.

[image: the Atwood babies]

Turns out, we're having two babies. Both are girls, due in mid-February 2012.

I've been told several times that you should never be crazy enough to let the children outnumber you. I hope to ultimately win the War of the Lady Babies, but when it comes to children, I think all anyone can ever realistically hope for is a peaceful surrender.


Multiple Video Cards

Almost nobody should do what I am about to describe – that is, install and use more than one video card. Nobody really needs that much graphics performance. It's also technically complex and a little expensive. But sometimes you gotta say to hell with rationality and embrace the overkill.

Why? Battlefield 3, that's why.

[image: Battlefield 3, Caspian Border]

I've been a fan of the series from the earliest days of Battlefield 1942, and I lost hundreds of hours to Battlefield 2 and Battlefield: Bad Company 2. I even wrote about the original Battlefield 2 demo on this very blog six years ago. So, yeah, I'm a superfan from way back. As much as I was anticipating Battlefield 3, I have to say the open beta convinced me it is everything I always wanted, and more. Glorious sandbox warfare on an enormous, next-generation destructible battlefield is a beautiful thing.

Since PC was the lead platform for Battlefield 3, it is the rare current game that isn't dumbed down to PS3 and Xbox 360 console levels; it is a truly next-generation engine designed to scale over the next few years of PC performance.

[image: Battlefield 3 graphics quality levels]

This also means it's going to be rough on current PCs; at a minimum, you'll need a fast dual core CPU and a modern video card with 512 MB or more of video memory. It only goes up from there. Way up. Like most games, Battlefield 3 is far more limited by video card performance than CPU performance. This is normally the place where I'd trot out my standard advice urging you to buy one of the new hotness video cards released this holiday season. But unfortunately, due to difficulties with the 40nm to 28nm process transition for ATI and NVIDIA, there aren't any new hotness video cards this year.

So what's a poor, performance-addicted Battlefield superfan to do? Double down and add another video card for more performance, that's what. Both ATI and NVIDIA have offered mature multi-GPU support for a few years now, and they've mostly settled on a simple Alternate Frame Rendering (AFR) strategy, where each video card alternates between frames to share the graphics rendering work.

[image: Alternate Frame Rendering diagram]

The little arrow there is a bridge attachment that you place on both cards so they can synchronize their work. Yes, there is a bit of overhead, but it scales surprisingly well, producing not quite double the performance but often in the area of 1.8x or so. Certainly enough to make it worth your while. You can technically add up to four video cards in this manner, but as with multiple CPUs your best bang for the buck is adding that second one; the third, fourth, and beyond provide increasingly diminished returns.
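
In toy form, the scheduling idea looks something like this. The sketch below is not driver code of any kind – it just shows frame N going to GPU (N mod 2), plus a crude scaling estimate where a flat per-extra-GPU synchronization penalty (my own illustrative 10% assumption) lands you near the 1.8x figure for two cards, with diminishing returns after that.

    def afr_schedule(frame_count, gpu_count=2):
        """Which GPU renders each frame under simple Alternate Frame Rendering."""
        return [frame % gpu_count for frame in range(frame_count)]

    def afr_scaling(gpu_count, overhead_per_extra_gpu=0.10):
        """Crude throughput estimate: ideal scaling minus a flat
        synchronization overhead for every GPU beyond the first."""
        return gpu_count * (1 - overhead_per_extra_gpu * (gpu_count - 1))

    print(afr_schedule(8))            # [0, 1, 0, 1, 0, 1, 0, 1]
    print(round(afr_scaling(2), 2))   # ~1.8x with a second card
    print(round(afr_scaling(3), 2), round(afr_scaling(4), 2))  # diminishing returns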

The good news is that the market crash in BitCoin GPU mining (if you don't know what this is, don't ask… please) means there is a glut of recent video cards up for sale on eBay right now. I have the same AMD Radeon HD 5870 that I've had since early 2010. I picked up another 5870 on eBay for a mere $170. This is a great video card, well ahead of its time when it was originally released, and even now only 10% slower than the fastest video card AMD makes. I simply dropped the second card in my system and installed the bridge connector.

[image: two Radeon HD 5870 cards installed with the bridge connector]

You may recognize this computer as a further tweaked version of my last build (which is still awesome, by the way, and highly recommended). Anyway, for this to work, you'll need to establish a few things about your computer before rushing out and buying that second video card.

  1. A motherboard that has two video-card-capable PCI Express slots. Most aftermarket and enthusiast motherboards have this, but if you bought a system from, say, Dell, it's less clear.
  2. A power supply with enough headroom to drive two video cards. Warning: modern gaming video cards are major power hogs -- they easily pull 100 to 200 watts under load. Each. Sometimes more than that! Be absolutely sure you have a quality power supply rated for a minimum of 600 watts. Each video card will have two power connectors, either 6 or 8 pin. Check that your power supply offers enough connectors, or that you have converters on hand.
  3. A case with sufficient airflow to dissipate the 400 to 800 watts of CPU and GPU heat that you'll be generating. Understand that this is a serious amount of heat while gaming, way more than even the highest of high end PCs would normally produce (see the rough power tally below). Yes, it is possible to do this quietly (at least in the typical idle case), but it will take some engineering work.
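
Here's that tally: a quick back-of-the-envelope sanity check on points 2 and 3. The individual wattages are my own rough assumptions, chosen to be consistent with the ranges above – measure your actual components before you buy anything.

    # Rough power-draw estimate for a dual-GPU gaming box.
    # Every figure here is an illustrative assumption, not a measurement.
    load_estimates_watts = {
        "CPU (quad core, under load)": 125,
        "Motherboard, RAM, drives, fans": 75,
        "Video card #1 (under load)": 200,
        "Video card #2 (under load)": 200,
    }
    psu_rating_watts = 600

    total = sum(load_estimates_watts.values())
    print(f"Estimated peak draw: {total} W")
    print(f"Headroom on a {psu_rating_watts} W supply: {psu_rating_watts - total} W")
    # Essentially zero headroom at peak -- which is why 600 watts is the
    # bare minimum, and nearly all of that draw ends up as heat in the case.

The point of the exercise: with two cards that can each pull 200 watts, a quality 600 watt unit really is the floor, and your case has to exhaust most of that power as heat.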

Beyond that, I found there are some additional peculiarities of multi-GPU systems that you need to be aware of.

  • Make sure that the two cards you use are not only of the exact same family (minor vendor differences are OK) but also have identical clock and memory speeds. It's not supposed to matter, but I found that it did and I had to flash one of my cards to set it to default speeds to match the other card.
  • Do not attempt to overclock your system while getting the multiple GPUs up and running. In particular, be extremely careful not to mess with the bus speed as timings are critical when dealing with two GPUs on the PCI Express bus synchronizing their work. Trust me on this one.
  • Do a clean video driver remove and install the very very latest video drivers after putting the second card in. I recommend Driver Sweeper to remove any traces of your old drivers before rebooting.

Don't say I didn't warn you about this stuff, because I said it would be technically complex in the first paragraph. But after a lot of teething pains, I'm happy to report that multiple GPUs really do work as advertised. I can crank up games to the absolute maximum settings on my 27" monitor and get a nearly constant 60 frames per second. As you can see in the example below, we go from 44 fps to 77 fps in Battlefield: Bad Company 2 – a 1.75x improvement, right in line with that 1.8x scaling figure.

[image: GPU scaling benchmark in Battlefield: Bad Company 2]

Now, Battlefield 3 (beta) is so very bleeding edge that I can't quite get it to max settings even with two GPUs in play. But I can now run very high settings, much higher than I could with a single GPU.

To be honest, it's unlikely I will continue with multiple GPUs through 2012 when the next-generation video cards are released. With every new video card introduction, you're supposed to get about the same performance in the new card as you did with two previous generation video cards working together. So at best this is a sort of sneak preview, cheating time by pretending we have a next-generation video card today. There are obvious efficiencies involved in performing that parallelization on a single GPU die rather than through two distinct video cards sitting on the PCI bus.

There's also the issue of micro-stuttering. I personally haven't found that to be a big problem unless you're pushing the graphics settings beyond what even two cards can reliably deliver. But if the frame rate dips low enough, the overhead of synchronization between the cards can interfere with overall frame rate in a perceptible way.

A single fast video card is always going to be the simpler, easier, and cheaper route. But multiple video cards sure is nifty tech in its own right, and it wasn't too expensive to get started with at $170. In the meantime, I'm having a ball playing with it, and I am dying to test my configuration with the final release of Battlefield 3 on October 25th. Join me if you like!
