Coding Horror

programming and human factors

Building a PC, Part II

Yesterday, we completed a basic build of Scott Hanselman's computer. We built the system up enough to boot to the BIOS screen successfully. Today, we'll complete the build by installing an operating system and burning it in.

The first thing we'll need is hard drives. The Antec P182 case has a well engineered drive mounting system. The bottom cage holds 4 drives, with soft rubber grommets to support each drive, and more importantly, to isolate the case from transmitted, resonant hard drive vibration noise. After all, they are giant hunks of circular metal spinning at 7,200 to 10,000 RPMs.

PC build, hard drive cage

The boot drive is a 10,000 RPM Raptor, which I can't recommend highly enough. The secondary drive is a run of the mill 500 GB model. I slid them in and secured them using the long screws provided with the case.

I connected two SATA cables and threaded them down to the bottom channel through the center cutout. I snapped a modular SATA power cable into the Corsair HX series power supply, and routed that cable around the back, into the hard drive compartment.

PC build, hard drives installed

Houston, we have storage!

But we can't install an OS without an optical drive. Fortunately, DVD drives are dirt cheap; I chose the latest Lite-On DVD drive, in black to match the case. I suppose eventually we'll be buying HD-DVD or Blu-Ray drives, but until the format war is decided, it's DVD all the way for me.

The 5 1/4" drive bays at the top of the P182 require the use of rails, which are provided with the case. I find rail mounting annoying, but since we're only installing a single DVD-R drive, I can deal. It took a bit of trial and error, but I got the rails screwed into each side of the drive and snapped it into the top bay.

PC build, DVD-R drive installed

This is not one of those fancy new SATA DVD drives, so we'll need to break out the old-school Parallel ATA cable included with the motherboard. I snapped in another modular power connector to provide the necessary 4-pin power. As usual I routed the power cable along the back of the motherboard tray to preserve the clean interior layout.

PC build, DVD-R drive connected to 4-pin power and PATA

We're now ready to boot up the machine. Plug in the power cord, connect a keyboard and mouse, then hit the power switch. During boot, press DELETE to enter the BIOS setup screen. Go into the basic settings and verify that all the drives we installed are properly detected by the motherboard.

PC build, booting with drives

Looks good-- all three drives are showing up. From here you may want to adjust a few basic BIOS settings. For example, I always set the floppy drive to "Disabled". You'll definitely want to set the boot order to ensure the right drives are booting first-- in our case, it's DVD-R, Raptor, then the second drive. Beyond those basic settings, mucking around in the BIOS isn't required at this point; we want to test the system with stock settings anyway.

However, I do recommend flashing the motherboard BIOS to the latest version before you go any further. You'd be surprised how often motherboards ship with out-of-date BIOSes. It isn't required, but your life will be easier if you flash to the latest BIOS now, before you complete system setup. A full description of how to flash your motherboard's BIOS is outside the scope of this article, but here's the condensed version:

 

  • Check the manufacturer's website for the latest motherboard BIOS. Be absolutely, positively sure you have the BIOS for the correct motherboard model!
  • Copy the BIOS files to a bootable USB Flash drive.
  • Boot from the flash drive and follow the instructions.

This is a typical BIOS flash scenario. Some vendors do make it easier, though. On my ASUS P5B Deluxe, the flash program is embedded into the BIOS. Others provide programs that allow you to flash the BIOS from within Windows using a friendly GUI.

At any rate, BIOS update or not, we can now install an operating system. I placed my OEM copy of Windows Vista into the DVD tray, rebooted, and selected a 120-day trial of Windows Vista Ultimate.

PC build, installing OEM Windows Vista

Here's one thing I've learned from experience: if your system can't finish a clean install of Windows, it's not stable. Period. It's tempting to blame Microsoft, but the only possible culprit if you have problems at this stage is the hardware (or possibly a scratched DVD). Trust me on this one.

Fortunately, our new system completed the Windows install without a hitch. Remember those driver CDs that came with the motherboard? Throw them right in the trash. They're way out of date by the time the motherboard gets from the factory, to the vendor, and then finally to you. The MSI P6N SLI motherboard we chose is based on the well-regarded NVIDIA 650i SLI chipset, so we have a one stop shop at the NVIDIA drivers page. I downloaded the 650i SLI platform drivers for Windows Vista x86, and the latest 8600 GTS graphics driver.

Now that we have Windows installed, and our platform drivers firmly in place, we know our system is reasonably stable. But we want to confirm that our system is totally stable.

To do that, we'll need to download a few essential burn-in utilities:

 

I run through basic benchmarks first. If the system can't complete a run of 3DMark or PCMark, it's definitely not stable. The rig we just built generated these scores:

 

  • 3DMark2006 (@ 1024x768): 7217
  • PCMark2005: 7353

 

And of course the obligatory Windows Experience results:

PC build, Windows Experience score

These tests aren't just for stability; they're also reality checks. Make sure these scores are in the ballpark for comparable systems. If not, you got something wrong in the build and somehow crippled your system's performance. Fortunately, these numbers check out (although the memory subscore is suspiciously low), and we didn't have any crashes or reboots during the benchmark runs. So far so good.

Now for the real torture test. We'll use:

  1. Four instances of Prime95 (one per core) to load the CPU
  2. Real-Time HDR IBL (RTHDRIBL) to load the GPU
  3. CoreTemp to monitor temperatures

To run four instances of Prime95, create four shortcuts to Prime95.exe using the -A(n) flag, where (n) is the core number. That's documented in this forum thread. Start with "Small FFTs" on the Options | Torture Test dialog in each instance. Then launch RTHDRIBL in a maximized window, and CoreTemp, as pictured here.
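If you'd rather script the setup than create the shortcuts by hand, here's a rough Python sketch of the same idea. It assumes prime95.exe sits in the current directory and that we want one instance per core of our quad-core CPU; adjust the path and count for your own build.

    import subprocess

    PRIME95 = "prime95.exe"   # assumed to be in the current directory
    NUM_CORES = 4             # one instance per core on this quad-core build

    processes = []
    for n in range(NUM_CORES):
        # The -A<n> switch gives each instance its own settings and work
        # files, which is what lets several copies run side by side.
        processes.append(subprocess.Popen([PRIME95, f"-A{n}"]))

    print(f"Launched {len(processes)} Prime95 instances; start Small FFTs in each.")

Once the instances are up, you still pick "Small FFTs" from the Options | Torture Test dialog in each window, exactly as described above.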

PC build, torture testing

Now we need to monitor our patient during the torture test, at least for the first 30 minutes or so.

I use my trusty Kill-a-Watt to determine how much power the system is consuming. I saw 130 watts at the Windows desktop, and during the extreme CPU and GPU torture test, 220 watts.

Kill-a-Watt showing 220w power usage

That gives us a rough idea of how much power dissipation, and therefore heat, we have to worry about. I also use my temperature gun to spot check various heatsinks in the system and make sure they're not getting unusually hot. Here, I'm checking the northbridge heatsink, which gets pretty toasty in modern systems.

PC build, temperature gun on northbridge heatsink

Fancy laser temperature guns are fun, but they're not required. I often use my built-in Mark I finger to touch various items in the computer (but not the bare electrical components, obviously) and make sure they're within normal temperature ranges. You might call me "The PC Whisperer"-- I love nothing more than getting in there and physically touching everything that's at risk of temperature damage:

 

  • CPU heatsink
  • Northbridge
  • Southbridge
  • Video card heatsink
  • Hard drives

You'll know you're in the danger zone when something is too hot to leave your finger on for more than a few seconds. I'm happy to report that all the temperatures on this system check out, both with my temperature gun and my Mk. I finger-- even after hours and hours of torture testing.

Looks like we have a stable, complete system. And when you have a stable, complete system, it's clearly time to overclock it until it breaks. The CPU heatsink remained quite cool throughout the torture test, and CoreTemp confirms relatively low temperatures on each core. This is a very good omen for future overclocking. We'll do that tomorrow.

Discussion

Building a PC, Part I

Over the next few days, I'll be building Scott Hanselman's computer. My goal today is more modest: build a minimal system that boots.

I'd like to dispel the myth that building computers is risky, or in any way difficult or complicated. If you can put together a LEGO kit, you can put together a PC from parts. It's dead easy, like snapping together so many LEGO bricks. Well, mostly. Have you seen how complicated some of those LEGO kits are?

Granted, building computers isn't for everybody. There are plenty of other things you might want to do with your time, like, say, spending time with your children, or finding a cure for cancer. That's why people buy pre-assembled computers from Dell. But if you need fine-grained control over exactly what's inside your PC, if you desire a deeper understanding of how the hardware fits together and works, then building a PC is a fun project to take on. You can easily match or beat Dell's prices in most cases, while building a superior rig -- and you can learn something along the way, too.

Here's the complete set of parts we ordered, per the component list. The CPU and memory boxes aren't shown, unfortunately, because I had already opened those by the time I took this photo. Whoops!

PC build, pile o' parts

All you need to build this PC is a few basic tools. I typically use needle-nose pliers, wire cutters, and a small Phillips screwdriver.

PC build, basic tools

Before we get started, let me share a few key things I've learned while building PCs:

 

  • Computer parts are surprisingly durable. They aren't fragile. You don't have to baby them. So often I see people handle computer parts as if they're sacred, priceless relics. While I don't think you should play "catch" with your new Core 2 Quad processor, it's also not going to explode into flames if you look at it the wrong way. You don't have to tiptoe around the build. Just be responsible and use common sense. I've done some appalling things to computer hardware in my day, truly boneheaded stuff, and I think I've broken all of two or three items in the last 10 years.

     

  • The risk of static discharge is overblown. I never wear anti-static wristbands, and I've yet to fry any components with static electricity. Never. Not once. However, I always touch a metal surface before handling computer components-- and that's a good habit for you to cultivate as well.

     

  • Be patient, and don't force it. Those rare times I've damaged components, it's because I rushed myself and forced something that I thought should fit-- despite all the warning signs. I've learned through hard experience that "maybe I need to use lots of additional force" is never the right answer when it comes to building PCs. Take a deep breath. Count to ten. Refer to the manual, and double-check your work.

I always build up the motherboard first. Place the motherboard on top of the anti-static bag it came in so it's easier to work on. Slot in the CPU and snap in the memory sticks. We're using four sticks here, so every slot is populated. However, if you're only using two sticks of memory, be sure they are in the correct paired slots for dual-channel operation. If you need advice, the motherboard manual is a good reference for basic installation steps.

PC build, motherboard with CPU and memory

Continue building up the motherboard by installing the CPU cooler. I strongly recommend buying an aftermarket CPU cooler based on a heatpipe tower design, as they wildly outperform the stock Intel coolers. This particular model we chose for Scott's build is the Scythe Mine, but I'm also a fan of the Scythe Infinity and Scythe Ninja Plus. (You can see the Ninja Plus on my work rig.)

It's important to install the CPU cooler correctly; otherwise you risk frying your CPU. Refer closely to the heatsink instructions. Don't forget to place a bit of the heatsink paste (included with the cooler) on the surface of the CPU before installing. These larger heatsinks can be quite heavy, so be sure you've followed the installation instructions to the letter and secured it firmly to the motherboard. Check the orientation of the heatsink so the fan blows "out" if possible, i.e., towards the back of the motherboard, where the case exhaust fans usually are.

PC build, motherboard with CPU cooler installed

Now let's build up the case to accept the motherboard. We chose the Antec P182 case for Scott's build. This case is unique; it's a collaborative venture between the well-known case vendor Antec and Silent PC Review, one of my favorite PC enthusiast websites.

This is the second version of the case, which reflects a number of design tweaks over the original P180. It's a little expensive, but the P182 oozes quality and attention to detail. It's probably the single best designed case I've ever worked on. But don't take my word for it; see reviews at AnandTech and SilentPCReview.

PC build, P182 case unpacked

Some cases are sold with power supplies, but the higher end cases, such as the P182, typically are not. For Scott's build, we chose the Corsair HX series power supply, which is a rebranded and tweaked Seasonic. It's considered one of the quietest and most efficient power supplies on the market, which is why it tops the list of recommended PSUs at SilentPCReview.

I opened the opposite side of the case to gain access to the PSU cage from both sides, installed the PSU in the cage, and threaded the power cables up through the opening in the middle.

PC build, power supply mounted

If you have cats, like we do, you have curious cat helpers. Unfortunately, cat helpers aren't all that... helpful.

PC build, my cat helper

Now install the backplate included with the motherboard. Every backplate is different because every motherboard is different. It's held in by pressure; just snap it in firmly around the edges.

PC build, backplate installed

It's finally time to place the motherboard in the case. Clear room in the case compartment by moving any errant cables out of the way and stowing them. Make sure the screw holes on the motherboard line up with the pre-installed screw mount standoffs in the case. In our P182, everything matched up perfectly out of the box.

Angle the motherboard down slowly and line up the ports to the backplate, then gently let the motherboard down to rest against the standoffs. Loosely line up the motherboard screw holes to the motherboard standoffs.

PC build, motherboard installed in case

Find the packet of screws included with the case, and use the appropriate screws to secure the motherboard to the case standoffs.

PC build, screws included with case

Now let's connect the power supply to the motherboard. There are two power connectors on modern motherboards, so be sure you've connected them both. Don't worry, the connectors are keyed; you can't install them incorrectly and blow up your PC. As you can see here, I threaded the power connectors along the back side of the motherboard platform. That's one of the many nifty little design features of the P182 case.

PC build, connecting power to motherboard

Before we can boot up, we need to connect the power and reset switches so they work. This part is a little fiddly. Find the cable with the labeled power, reset, and LED connectors from the case, then refer to the motherboard manual to see where the appropriate motherboard front panel connector pins are.

PC build, referring to front panel connector headers in motherboard manual

Connect each front panel wire to the specific motherboard front panel pins individually. Make sure you connect them to the right location, but orientation of these connectors doesn't matter. This is where the needlenose pliers come in handy unless you have nimble (and tiny) fingers. Why this isn't a universally standard keyed block connector by now is beyond me.

PC build, closeup of motherboard front panel connectors

We need some kind of video output to see if our computer can boot, so let's install a video card. Scott's not a hardcore gamer, so I went for something midrange, a set of two NVIDIA 8600GTS cards. They're an excellent blend of performance and the latest DX10 and high-definition features, while using relatively little power.

Don't forget to connect the 6-pin video card power connector if your video card requires it! This is a common mistake that I've made more than once. Our power supply has modular connectors, so I snapped in one of the two 6-pin power connectors and threaded it up to the video card.

PC build, install video card

We're ready for the moment of truth: does it boot? I attached a power cord to the power supply, hooked up a utility 15" LCD I keep around for testing, and then pressed the power button.

PC build, successful boot

Success! I know "reboot and select proper boot device" doesn't look like much, but it means everything is working. We've just built a minimal PC that boots up. It's a small step that we'll build on tomorrow.

Getting this system from a pile of parts to bootable state took about two hours. Like I promised -- easy! Writing it up is taking almost as long as actually doing it. This was a slow build for me because I was extra cautious with Scott's parts, and I was stopping to take frequent pictures. With some practice, it's possible to build a PC much more quickly-- even in under ten minutes.

Discussion

Defining Open Source

As I mentioned two weeks ago, my plan is to contribute $10,000 to the .NET open source ecosystem. $5,000 from me, and a matching donation of $5,000 from Microsoft.

There are only two ground rules so far:

  1. The project must be written in .NET managed code.
  2. The project must be open source.

The first rule is simple enough; although Mono and Subversion are great open source projects, they aren't written in .NET managed code, and are therefore totally ineligible. But number two is where I hit a roadblock: how do you tell if something that calls itself "open source" is really open source? Many projects think they are-- or at least some users may think they are-- but they really aren't.

NDoc is an example of exactly this kind of tricky misunderstanding, per a comment Chris Nahr left on a related post:

 

If [other people contributing] was his wish he kept it to himself. The source code for NDoc 2.0 was never released -- Kevin claimed licensing issues as the reason. No one else could contribute, except by mailing him bug reports on his (binary-only) alpha builds.

Kevin is now supposedly passing on the Sourceforge administration of the project to two other guys; I hope we'll finally see a public source code release again so that willing engineers actually can contribute once again.

I want to avoid these kinds of problems.

To that end, here are a few criteria we need to evaluate for each nominated project, to ensure that they're not just paying lip service to "open source":

 

  1. The project must use an OSI approved license, or the permissive or reciprocal shared source licenses from Microsoft. (I have to include that rider because part of the OSI's pissing match with Microsoft is not formally recognizing Microsoft's licenses, even though they're absolutely in the spirit of open source.) Pick a license, any license! If your project does not have a license, or if you fail to make it stupid easy for us to determine what license your project uses... your project is ineligible.

     

  2. The project must use a commonly available method of public source control. SourceForge, CodePlex, Google Code, whatever. Other developers should be able to retrieve the read-only public code using a source control tool, and potentially check in changes to the codebase if granted appropriate permissions. If the only way to get to the source code is via HTTP download of a ZIP file... your project is ineligible.

     

  3. The project must provide public evidence that it accepts and encourages code contributions from the outside world. Is a project truly open source if it only has one developer? Is a project truly open source if it has a cabal of three developers who summarily ignore all outside suggestions and contributions? All I'm looking for here is evidence of some kind of community. It doesn't have to be a large one, necessarily, but it has to be there. The spirit of open source is active community development. If you can't show a decent history of checkins from a reasonable variety of contributors... your project is ineligible.
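That last criterion is also the easiest to check mechanically. Here's a rough Python sketch of one way to do it-- assuming the nominated project exposes a public Subversion repository and you have the svn command-line client installed; the repository URL below is made up:

    import subprocess
    import xml.etree.ElementTree as ET
    from collections import Counter

    # Hypothetical repository URL; substitute the nominated project's own.
    REPO_URL = "http://example.org/svn/someproject/trunk"

    # Pull the full checkin history as XML and count revisions per author.
    log_xml = subprocess.check_output(["svn", "log", "--xml", REPO_URL])
    root = ET.fromstring(log_xml)
    authors = Counter(
        entry.findtext("author", default="(unknown)")
        for entry in root.iter("logentry")
    )

    print(f"{len(authors)} distinct committers, {sum(authors.values())} checkins total")
    for author, checkins in authors.most_common():
        print(f"  {author}: {checkins}")

A single name dominating every revision isn't automatically disqualifying, but a healthy spread of committers is exactly the kind of public evidence I'm looking for.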

This maps fairly well to the "four freedoms" of the Free Software Foundation:

 

  • The freedom to run the program, for any purpose.
  • The freedom to study how the program works, and adapt it to your needs.
  • The freedom to redistribute copies so you can help your neighbor.
  • The freedom to improve the program, and release your improvements to the public, so that the whole community benefits.

So now the vetting process begins.

I need your help to figure out how many of the projects nominated in the comments actually meet the criteria I've outlined. I've put a read-only spreadsheet online via Google documents which contains all the projects people nominated in the comments to my original post. But I can't seem to make it editable by the world. I can only invite people in as "collaborators". Supposedly this link allows anyone with a Google account to be a collaborator. Try that first.

Alternately, if there's a better way to collaboratively edit a spreadsheet-like list, I'm open to suggestions!

Discussion

Better Image Resizing

In a previous post, I examined the difference between bilinear and bicubic image resizing techniques. Those are the two options available in most graphics programs for resizing an image.

image resizing options

After some experimentation, I came up with these rules of thumb:

  • When making an image smaller, use bicubic, which has a natural sharpening effect. You want to emphasize the data that remains in the new, smaller image after discarding all that extra detail from the original image.
  • When making an image larger, use bilinear, which has a natural smoothing effect. You want to blend over the interpolated fake detail in the new, larger image that never existed in the original image.

Of course, there are plenty of conditions that might make you want to choose one method over the other, but I think these are reasonable guidelines to start with.
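If you're resizing images in code rather than in a graphics program, the same rules of thumb are easy to apply. Here's a minimal sketch using the Python Imaging Library; the file names are just placeholders.

    from PIL import Image

    img = Image.open("photo.jpg")   # placeholder input file
    w, h = img.size

    # Shrinking: bicubic, for its natural sharpening effect.
    smaller = img.resize((w // 2, h // 2), Image.BICUBIC)

    # Enlarging: bilinear, to smooth over the interpolated detail.
    larger = img.resize((w * 2, h * 2), Image.BILINEAR)

    smaller.save("photo_small.jpg")
    larger.save("photo_large.jpg")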

What I didn't realize when I wrote the original article is that there are other, more advanced resizing algorithms available. Some are specific to particular kinds of images, such as the 2xSAI algorithm which works on pixel art. Compare this shot of Mario vs. Wario using pixel resizing, and the same shot using 2xSAI resizing. It's a dramatic difference, especially since traditional bilinear and bicubic upsizing methods degenerate into a giant blur on pixel art.

Supposedly, one of the best image resizing algorithms on the market is Genuine Fractals. The web site boasts that you can use its fractal-based resizing algorithm to "enlarge your images over 1000% with no loss in image quality". It's probably pure marketing hyperbole, but I was still intrigued. Bilinear and Bicubic are decent, but there has to be room for improvement in there somewhere. I downloaded a trial version of the tool (which requires Photoshop Elements, or Photoshop CS) and gave it a shot.

I took the reference Lena image and blew it up 500%.

Here's a closeup of the results using Bicubic Sharper:

Lena 512 color reference image, bicubic sharp 5x resize

Here's the same closeup using Genuine Fractals:

Lena 512 color reference image, fractal 5x resize

Bicubic wouldn't normally be my choice here, but I chose it because it's technically the most advanced method, and it produces the results closest to the effect that the fractal resizing delivers. Still, the fractal algorithm comes out way ahead; you can't see any pixel resize artifacts in the enlarged image, and the edges are sharp and well defined. It does start to bear an unfortunate resemblance to a watercolor drawing filter, but arbitrarily resizing images to 5 times their original size will always involve tradeoffs of some kind.

Bicubic and bilinear are well understood image resizing algorithms, and they're "good enough" for most image resizing chores. That's why they are provided out of the box in almost all graphics applications and graphics libraries. There's an outstanding article on CodeProject which digs into advanced image resizing algorithms with actual C# code for some spline and fractal resizing algorithms. But before you begin resizing images, consider whether you need those advanced algorithms.

Reducing images is a completely safe and rational operation. You're simply reducing precision and resolution by discarding information. Make the image as small as you want, and you have complete fidelity-- within the bounds of the number of pixels you've allowed. You'll get good results no matter which algorithm you pick. (Well, unless you pick the naive Pixel Resize or Nearest Neighbor algorithms.)

Enlarging images is risky. Beyond a certain point, enlarging images is a fool's errand; you can't magically synthesize an infinite number of new pixels out of thin air. And interpolated pixels are never as good as real pixels. That's why it's more than a little artificial to upsize the 512x512 Lena image by 500%. It'd be smarter to find a higher resolution scan or picture of whatever you need* than it would be to upsize it in software.

But when you can't avoid enlarging an image, that's when it pays to know the tradeoffs between bicubic, bilinear, and more advanced resizing algorithms. At least arm yourself with enough knowledge to pick the best of the bad options you have.

* e.g., if I really needed the Lena image that large, I'm better off hunting down old copies of Playboy and scanning them myself. Or at least that's what I tell my wife...

Discussion

Game Development Postmortems

I've written about the value of project postmortems before. Still, getting a project postmortem going (or, if you prefer your terminology a bit less morbid, a project retrospective) can be a daunting proposition. Game Developer Magazine's postmortem objectives offers a helpful template for conducting a postmortem yourself:

The bulk of the [postmortem] should revolve around the "5 wrongs, 5 rights" concept:

Explain what 5 goals, features or aspects of the project went off without a hitch or better than planned. Were there any phases of development that you thought would be much harder than you had planned? Did a new programmer come on the team and inject great ideas or brilliant programming ability into the effort? Did a new technology become widely adopted by consumers that solved a particularly thorny development problem? Did new development tools become available that let you add better graphics or sound? Did you save money in certain ways you hadn't expected? Cut days/weeks/months off the schedule in some way you hadn't expected to? Did the marketing or PR team get some outstanding preview coverage in a consumer magazine?

Explain what 5 goals, features or aspects of the project were problematic or failed completely. Did the lead programmer leave the company halfway through the project? Did adapting to new technologies (for example, MMX, DirectX, a new graphics library or AI algorithm) create unanticipated problems for the developers? Did your development tools let you down in some way or not live up to expectations? Did hidden costs creep into the project, and if so, where did they come from? Did the schedule slip for some reason? Was the configuration testing or beta testing cycle problematic for some reason? Did features get axed because of scheduling pressures? Did the lead programmer quit? Did the marketing or PR team misrepresent the game to the public, causing false hopes? Be specific in this regard -- postmortems that accentuate the "what went right" and sugar coat the "what went wrong" sections are dismissed by readers and won't be accepted by our edit staff. Be honest, that's all we are asking.

These postmortems are deemed so significant to other game developers that they're often the cover story on the magazine. I think that's exactly the priority level postmortems should have; if you're not learning from your mistakes, or even better, learning from other people's mistakes, then your next project will have a rocky future at best.

Game Developer magazine, Guitar Hero postmortem

Game Developer Magazine isn't available online without subscribing, but many of the postmortems from the magazine are compiled in the book Postmortems from Game Developer: Insights from the Developers of Unreal Tournament, Black and White, Age of Empires, and Other Top-Selling Games.

However, Gamasutra, Game Developer's sister website, does post its postmortems online. As I've mentioned before, one of my very favorite postmortems is for Trespasser, one of the most notorious failures in PC gaming history. But it's instructive to soak in the successes as well as gawk at the trainwrecks. Here are a few Gamasutra postmortems I recommend (registration required):

All the postmortems are worthwhile, and you'll surely recognize a lot of the pitfalls and triumphs they describe. Even if you don't care a whit about gaming, it's instructive to read game development postmortems because game development is such a pressure cooker. It's incredibly challenging software engineering, with unclear goals (e.g., "fun") under intense deadlines. Every project pathology you're likely to see on your software project has probably already materialized on one or more of these games.

Discussion