Coding Horror

programming and human factors

YSlow: Yahoo's Problems Are Not Your Problems

I first saw Yahoo's 13 Simple Rules for Speeding Up Your Web Site referenced in a post on Rich Skrenta's blog in May. It looks like there were originally 14 rules; one must have fallen off the list somewhere along the way.

  1. Make Fewer HTTP Requests
  2. Use a Content Delivery Network
  3. Add an Expires Header
  4. Gzip Components
  5. Put CSS at the Top
  6. Move Scripts to the Bottom
  7. Avoid CSS Expressions
  8. Make JavaScript and CSS External
  9. Reduce DNS Lookups
  10. Minify JavaScript
  11. Avoid Redirects
  12. Remove Duplicate Scripts
  13. Configure ETags

It's solid advice culled from the excellent Yahoo User Interface blog, which will soon be packaged into a similarly excellent book. It's also available as a PowerPoint presentation delivered at the Web 2.0 conference.

I've also covered similar ground in my post, Reducing Your Website's Bandwidth Usage.

But before you run off and implement all of Yahoo's solid advice, consider the audience. These are rules from Yahoo, which according to Alexa is one of the top three web properties in the world. And Rich's company, Topix, is no slouch either-- they're in the top 2,000. It's only natural that Rich would be keenly interested in Yahoo's advice on how to scale a website to millions of unique users per day.

To help others implement the rules, Yahoo created a FireBug plugin, YSlow. This plugin evaluates the current page using the 13 rules and provides specific guidance on how to fix any problems it finds. And best of all, the tool rates the page with a score-- a score! There's nothing we love more than boiling down pages and pages of complicated advice to a simple, numeric score. Here's my report card score for yesterday's post.

yslow performance score

To understand the scoring, you have to dissect the weighting of the individual rules, as Simone Chiaretta did:

Weight 11:
  3. Add an Expires Header
  4. GZip Components
  13. Configure ETags

Weight 10:
  2. Use a Content Delivery Network
  5. Put CSS at the Top
  10. Minify JavaScript
  11. Avoid Redirects

Weight 5:
  9. Reduce DNS Lookups
  6. Move Scripts to the Bottom
  12. Remove Duplicate Scripts

Weight 4:
  1. Make Fewer Requests (CSS)
  1. Make Fewer Requests (JS)

Weight 3:
  1. Make Fewer Requests (CSS background images)

Weight 2:
  7. Avoid CSS Expressions
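YSlow doesn't publish its exact scoring formula, but a weighted average of per-rule grades along these lines reproduces the general behavior. The rule names and weights below come from Simone's breakdown; the per-rule scores in the example are invented purely for illustration.

```python
# Hypothetical sketch of a YSlow-style weighted score.
# Weights are from Simone Chiaretta's breakdown of the plugin;
# the per-rule scores (0-100) below are made up for illustration.

WEIGHTS = {
    "Add an Expires Header": 11,
    "GZip Components": 11,
    "Configure ETags": 11,
    "Use a Content Delivery Network": 10,
    "Put CSS at the Top": 10,
    "Minify JavaScript": 10,
    "Avoid Redirects": 10,
    "Reduce DNS Lookups": 5,
    "Move Scripts to the Bottom": 5,
    "Remove Duplicate Scripts": 5,
    "Make Fewer Requests (CSS)": 4,
    "Make Fewer Requests (JS)": 4,
    "Make Fewer Requests (CSS background images)": 3,
    "Avoid CSS Expressions": 2,
}

def yslow_score(rule_scores):
    """Weighted average of per-rule scores, each graded 0-100."""
    total_weight = sum(WEIGHTS.values())
    return round(sum(WEIGHTS[rule] * score
                     for rule, score in rule_scores.items()) / total_weight)

# A page that aces everything except the caching-related rules:
scores = {rule: 100 for rule in WEIGHTS}
scores["Add an Expires Header"] = 0
scores["Configure ETags"] = 0
print(yslow_score(scores))  # 78
```

Note how hard the three weight-11 rules drag the total down: failing just two of them costs a perfect page 22 points.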

My YSlow score of 73 is respectable, but I've already made some changes to accommodate its myriad demands. To get an idea of how some common websites score, Simone ran YSlow on a number of blogs and recorded the results:

  • Google: A (99)
  • Yahoo Developer Network blog: D (66)
  • Yahoo! User Interface Blog: D (65)
  • Scott Watermasysk: D (62)
  • Apple: D (61)
  • Dave Shea's mezzoblue: D (60)
  • A List Apart: F (58)
  • Steve Harman: F (54)
  • Coding Horror: F (52)
  • Haacked by Phil: F (36)
  • Scott Hanselman's Computer Zen: F (29)

YSlow is a convenient tool, but either the web is full of terribly inefficient web pages, or there's something wrong with its scoring. I'll get to that later.

The Stats tab contains a summary of the total size of your downloaded page, along with the footprint with and without browser caching. One of the key findings from Yahoo is that 40 to 60 percent of daily visitors have an empty cache. So it behooves you to optimize the size of everything and not rely on client browser caching to save you in the common case.

yslow stats

YSlow also breaks down the statistics in much more detail via the Components tab. Here you can see a few key judgment criteria for every resource on your page...

  • Does this resource have an explicit expiration date?
  • Is this resource compressed?
  • Does this resource have an ETag?

... along with the absolute sizes.

yslow components
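Those three per-resource judgments reduce to inspecting a handful of standard HTTP response headers. Here's a minimal sketch; the audit_headers function is hypothetical, not YSlow's actual code, though the header names are standard HTTP.

```python
def audit_headers(headers):
    """Judge a resource the way YSlow's Components tab does,
    based solely on its HTTP response headers."""
    h = {k.lower(): v for k, v in headers.items()}
    return {
        # explicit expiration: an Expires header or a max-age directive
        "has_expiration": "expires" in h or "max-age" in h.get("cache-control", ""),
        # compression: gzip or deflate content encoding
        "is_compressed": h.get("content-encoding") in ("gzip", "deflate"),
        # validator: an ETag header is present
        "has_etag": "etag" in h,
    }

report = audit_headers({
    "Content-Encoding": "gzip",
    "ETag": '"abc123"',
    "Content-Length": "4096",
})
print(report)  # compressed and tagged, but no expiration date
```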

YSlow is a useful tool, but it can be dangerous in the wrong hands. Software developers love optimization. Sometimes too much.

There's some good advice here, but there's also a lot of advice that only makes sense if you run a website that gets millions of unique users per day. Do you run a website like that? If so, what are you doing reading this instead of flying your private jet to a Bermuda vacation with your trophy wife? The rest of us ought to be a little more selective about the advice we follow. Avoid the temptation to blindly apply these "top (x) ways to (y)" lists that are so popular on Digg and other social networking sites. Instead, read the advice critically and think about the consequences of implementing that advice.

If you fail to read the Yahoo advice critically, you might make your site slower, as Phil Haack unfortunately found out. While many of these rules are bread-and-butter HTTP optimization scenarios, it's unfortunate that a few of the highest-weighted rules on Yahoo's list are downright dangerous, if not flat-out wrong for smaller web sites. And when you define "smaller" as "smaller than Yahoo", that's.. well, almost everybody. So let's take a critical look at the most problematic heavily weighted advice on Yahoo's list.

Use a Content Delivery Network (Weight: 10)

If you have to ask how much a formal Content Delivery Network will cost, you can't afford it. It's more effective to think of this as outsourcing the "heavy lifting" on your website-- e.g., any large chunks of media or images you serve up -- to external sites that are much better equipped to deal with it. This is one of the most important bits of advice I provided in Reducing Your Website's Bandwidth Usage. And using a CDN, below a reasonably Yahoo-esque traffic volume, can even slow your site down.

Configure ETags (Weight: 11)

ETags are a checksum field served up with each server file so the client can tell if the server resource is different from the cached version the client holds locally. Yahoo recommends turning ETags off because they cause problems on server farms due to the way they are generated with machine-specific markers. So unless you run a server farm, you should ignore this guidance. It'll only make your site perform worse because the client will have a more difficult time determining if its cache is stale or fresh. It is possible for the client to use the existing last-modified date fields to determine whether the cache is stale, but last-modified is a weak validator, whereas Entity Tag (ETag) is a strong validator. Why trade strength for weakness?
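Mechanically, an ETag is just an opaque validator: the server hands one out, the client echoes it back in If-None-Match, and the server answers 304 Not Modified when they still match. A hypothetical server-side sketch (real servers like Apache derive the tag from inode, mtime, and size by default, which is exactly the machine-specific behavior that breaks on farms; a pure content hash, as below, sidesteps that):

```python
import hashlib

def make_etag(body):
    """Derive an ETag from a content hash -- unlike Apache's default
    inode/mtime/size recipe, this is identical on every server."""
    return '"%s"' % hashlib.md5(body).hexdigest()

def respond(body, if_none_match=None):
    """Return (status, payload, etag) for a conditional GET."""
    etag = make_etag(body)
    if if_none_match == etag:
        return 304, b"", etag   # client's cache is fresh; send nothing
    return 200, body, etag      # first visit or stale; send full body

status1, payload1, etag = respond(b"<html>v1</html>")        # first visit
status2, payload2, _ = respond(b"<html>v1</html>", etag)     # revisit, unchanged
status3, payload3, _ = respond(b"<html>v2</html>", etag)     # content changed
print(status1, status2, status3)  # 200 304 200
```

The middle case is the payoff: the unchanged revisit costs one round trip and an empty body instead of the full payload.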

Add an Expires Header (Weight: 11)

This isn't bad advice, per se, but it can cause huge problems if you get it wrong. In Microsoft's IIS, for example, the Expires header is always turned off by default, probably for that very reason. By setting an Expires header on HTTP resources, you're telling the client to never check for new versions of that resource-- at least not until the expiration date on the Expires header. When I say never, I mean it -- the browser won't even ask for a new version; it'll just assume its cached version is good to go until the client clears the cache, or the cache reaches the expiration date. Yahoo notes that they change the filename of these resources when they need them refreshed.

All you're really saving here is the cost of the client pinging the server for a new version and getting a 304 not modified header back in the common case that the resource hasn't changed. That's not much overhead.. unless you're Yahoo. Sure, if you have a set of images or scripts that almost never change, definitely exploit client caching and turn on the Cache-Control header. Caching is critical to browser performance; every web developer should have a deep understanding of how HTTP caching works. But only use it in a surgical, limited way for those specific folders or files that can benefit. For anything else, the risk outweighs the benefit. It's certainly not something you want turned on as a blanket default for your entire website.. unless you like changing filenames every time the content changes.
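The "surgical, limited" approach amounts to emitting far-future cache headers only for resources whose filenames change along with their content, and forcing revalidation everywhere else. A hypothetical sketch; the /static/ path convention and one-year lifetime are assumptions for illustration, not anything Yahoo or IIS prescribes:

```python
from email.utils import formatdate

ONE_YEAR = 365 * 24 * 3600  # seconds

def cache_headers(path, now=0.0):
    """Far-future caching for versioned static files only; everything
    else must revalidate with the server on each request."""
    # Assumption: files under /static/ embed a version in the filename
    # (e.g. site-v42.css), so a far-future Expires is safe -- a content
    # change means a new filename, never a stale cache.
    if path.startswith("/static/"):
        return {
            "Cache-Control": "public, max-age=%d" % ONE_YEAR,
            "Expires": formatdate(now + ONE_YEAR, usegmt=True),
        }
    return {"Cache-Control": "no-cache"}  # always ping; expect 304s back

print(cache_headers("/static/site-v42.css")["Cache-Control"])
print(cache_headers("/index.html")["Cache-Control"])
```

Everything outside the versioned folder pays only the cheap 304 round trip described above, which is the right trade for content that actually changes.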

I don't mean to take anything away from Yahoo's excellent guidance. Yahoo's 13 Simple Rules for Speeding Up Your Web Site and the companion FireBug plugin, YSlow, are outstanding resources for the entire internet. By all means, read it. Benefit from it. Implement it. I've been banging away on the benefits of GZip compression for years.
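The gzip payoff is easy to demonstrate for yourself. The repetitive sample markup below is invented, but the savings are typical of real HTML, which is full of repeated tags and attributes:

```python
import gzip

# Invented sample page: repetitive markup, like most real HTML.
html = ("<div class='post'><h2>Title</h2><p>Lorem ipsum dolor sit amet.</p></div>\n" * 200).encode()

compressed = gzip.compress(html)
saved = 100 * (1 - len(compressed) / len(html))
print("%d bytes -> %d bytes (%.0f%% saved)" % (len(html), len(compressed), saved))
```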

But also realize that Yahoo's problems aren't necessarily your problems. There is no such thing as one-size-fits-all guidance. Strive to understand the advice first, then implement the advice that makes sense for your specific situation.

Discussion

Discipline Makes Strong Developers

Scott Koon recently wrote about the importance of discipline as a developer trait:

Every month a new programming language or methodology appears, followed by devotees singing its praises from every corner of the Internet. All promising increases in productivity and quality. But there is one quality that all successful developers possess. One trait that will make or break every project.

Discipline.

An undisciplined developer will not be able to ship on time and will not write code that is easy to maintain. A disciplined developer will not only enable the success of a project, but will raise the level of productivity in others. Software architects and developers do themselves a disservice when they attribute their success to whatever methodology they have adopted. It really boils down to how disciplined you are.

It's an interesting coincidence, because I recently gave a presentation to a group of developers on the topic of source control, and I found myself repeating that very same word throughout the presentation: discipline. Discipline. Discipline! I repeat it because the mere presence of a great source control system doesn't obligate anyone to use it in a structured, rational way. No. That takes discipline.

And not many shops, at least in my experience, have the right discipline. All too often, what I see in source control looks more like this Windows desktop:

messy windows desktop

Instead of a nice, structured set of projects with logical branches and tags, what ends up in source control is a hairy furball of crazily named folders with no logical structure at all. Just like the average user's desktop.

And it doesn't matter what language you use, either. You can write FORTRAN in any language.

So I'm inclined to agree with Scott. Without discipline, things like tools and languages are irrelevant. But repeating the word "discipline" isn't exactly helpful, either. Perhaps what entry-level developers need is a programming mentor who isn't afraid to personally advocate the necessary discipline, someone hard, someone like Dave Cutler, or perhaps someone with the right motivational techniques to inspire discipline, like Gunnery Sergeant Hartman:

If you ladies leave my island, if you survive recruit training, you will be a weapon. You will be a minister of death praying for war. But until that day you are pukes. You are the lowest form of life on Earth. You are not even human f**ing beings. You are nothing but unorganized grabastic pieces of amphibian s**t. Because I am hard you will not like me. But the more you hate me the more you will learn. I am hard but I am fair. There is no racial bigotry here. I do not look down on ni**ers, kikes, wops or greasers. Here you are all equally worthless. And my orders are to weed out all non-hackers who do not pack the gear to serve in my beloved Corps. Do you maggots understand that?

You can find the same advice stated in more prosaic terms in McConnell's Code Complete:

It's hard to explain to a fresh computer-science graduate why you need conventions and engineering discipline. When I was an undergraduate, the largest program I wrote was about 500 lines of executable code. As a professional, I've written dozens of utilities that have been smaller than 500 lines, but the average main-project size has been 5,000 to 25,000 lines, and I've participated in projects with over 500,000 lines of code. This type of effort requires not the same skills on a larger scale, but a new set of skills altogether.

In a 15-year retrospective on work at NASA's Software Engineering Laboratory, McGarry and Pajerski reported that the methods and tools that emphasize human discipline have been especially effective (1990). Many highly creative people have been extremely disciplined. "Form is liberating", as the saying goes. Great architects work within the constraints of physical materials, time, and cost. Great artists do, too. Anyone who has examined Leonardo's drawings has to admire his disciplined attention to detail. When Michelangelo designed the ceiling of the Sistine Chapel, he divided it into symmetric collections of geometric forms, such as triangles, circles, and squares. He designed it in three zones corresponding to the three Platonic stages. Without this self-imposed structure and discipline, the 300 human figures would have been merely chaotic rather than the coherent elements of an artistic masterpiece.

Discipline takes many forms and permeates every aspect of software development. Start small. Say your database schema contains three primary key table columns named list_id, ListId, and list_value. There should be a Gunnery Sergeant Hartman on your development team who will.. gently.. remind the team that it might be a good idea to fix problems like this before they become institutionalized in all your future code.

You don't necessarily have to have a strict, rigid military code of conduct. Even though software engineering is a young field, there are a lot of accepted conventions that make up modern software development. All it takes to benefit from those conventions is a little old-fashioned discipline. And if it doesn't start with you, then who?

Discussion

Measuring Font Legibility

If you think of fonts as a bit of design esoterica, consider this New York Times article on the new Clearview typeface that will appear on all new highway road signs here in the United States:

The problem sounded modest enough: Add more information to the state's road signs without adding clutter or increasing the physical size of the sign itself. But with the existing family of federally approved highway fonts -- a chubby, idiosyncratic and ultimately clumsy typeface colloquially known as Highway Gothic -- there was little you could add before the signs became visually bloated and even more unreadable than they already were. "I knew the highway signs were a mess, but I didn't know exactly why," Meeker recalled.

Around the same time Meeker and his team were thinking about how to solve the problem of information clutter in Oregon, the Federal Highway Administration was concerned with another problem. Issues of readability were becoming increasingly important, especially at night, when the shine of bright headlights on highly reflective material can turn text into a glowing, blurry mess. Highway engineers call this phenomenon halation and elderly drivers, now estimated to represent nearly a fifth of all Americans on the road, are most susceptible to the effect.

I've always considered road signs a rich field for study, as signage design has many parallels to modern GUI design in computer science. The accompanying slideshow for the article provides this image which illustrates the halation problem.

Font legibility, road sign halation

You could improve readability by simply making the font bigger. But this would result in billions of dollars spent on larger signs that increase visual clutter on the roadways. The Clearview font is an attempt to fundamentally improve readability with better design -- a completely redesigned typeface, optimized for highway use.

Here's a detailed comparison of the old FHWA typeface, Highway Gothic, and the new Clearview:

font legibility, clearview vs. highway gothic

This isn't just aesthetics-- it also results in a practical benefit for drivers. That's the best kind of design, and like all the best designs, they provide the data to prove it:

Intrigued by the early positive results, the researchers took the prototype out onto the test track. Drivers recruited from the nearby town of State College drove around the mock highway. From the back seat, Pietrucha and Garvey recorded at what distance the subjects could read a pair of highway signs, one printed in Highway Gothic and the other in Clearview. Researchers from 3M came up with the text, made-up names like Dorset and Conyer -- words that were easy to read. In nighttime tests, Clearview showed a 16 percent improvement in recognition over Highway Gothic, meaning drivers traveling at 60 miles per hour would have an extra one to two seconds to make a decision.

I've talked before about font legibility experiments, where fonts designed for the web allowed users to read faster. This isn't opinion; it's backed by actual experimental data. There was a more recent experiment from the same source that also found even more benefits from the newest typefaces designed around RGB anti-aliasing. That's why I think Microsoft's font rendering strategy is ultimately smarter than Apple's.

So before you write off a design exercise as seemingly trivial as font choice, consider whether that tiny bit of design could improve the user experience, if only a little. And more importantly-- how would you measure the improvement?

Discussion

Trojans, Rootkits, and the Culture of Fear

Scott Wasson at The Tech Report notes that two of his family members fell victim to the eCard email exploit that has been making the rounds lately:

I just dropped off a package containing my dad's laptop at the FedEx depot this afternoon. I spent parts of several days this week recovering his data, wiping the drive, and reinstalling the OS and key apps. My dad's a tech-savvy guy, but in a moment of weakness, he opened one of those greeting card spam messages recently and his computer became infected with a trojan. The thing had installed a proxy for IE7 and rerouted all DNS queries to a compromised server, and then covered most of its tracks via a rootkit. I wiped the drive and started over because I didn't think I could be sure otherwise that the trojan was entirely removed from his system.

I went through the same thing with my wife's PC not long ago. She also knows better than to open attachments, but the greeting card thing caught her off guard somehow. Took her a while to admit that she'd gone through the steps of opening the email, clicking the link, downloading the payload, and running the executable. I lost a day's work, at least, to rebuilding that machine from the ground up.

Were it not for tools like Rootkit Revealer, I might not have even been able to detect the trojans. One of them seemed to be attacking our antivirus software and trying to stop the Revealer process, even.

I could get mad at my relatives for making a mistake, but it's hard to see the point. The really frustrating thing is that they both had reason to believe a greeting card might be coming their way at the time and reason to be a little frazzled: my dad had brain surgery recently. These email-based attacks prey on those who might not be operating at 100% for whatever reason. That makes me white-hot mad.

Which makes me wonder: if it can happen to some fairly tech-savvy folks like these, how widespread is this problem? And what happens when your computer gets infected and you don't have a close relative who's a PC expert? The trojan on my wife's PC wasn't detected by Windows Defender, Avast! antivirus, or the Windows Malicious Software Removal Tool.

I feel his pain. I went through a similar experience on one of my machines recently, which I documented in How to Clean Up a Windows Spyware Infestation. I'm sure I'd be even angrier if this had happened to someone more vulnerable, like my wife, or my father. But there are a few hard lessons to be learned here:

  1. Stop Running As Administrator

    To answer the question Scott posed at the bottom of his post, the problem is incredibly widespread; it's a Windows security epidemic. The only real long-term solution for the Windows security epidemic is to stop running as Administrator. Vista's UAC is a marginally effective half-step at best. Why not emulate the UNIX operating systems, which seem to be immune to most infections to date? When was the last time you heard of a Linux or FreeBSD user running anti-virus software? Or a Mac OS X user? A handful of antivirus programs exist for the Mac, but they're largely snake oil, as they have little to protect against.

    If you take the advice to run as a non-administrator, you may find that the standard user route is painful, too. I received an email from James Boswell that describes the difficulty:

    You and many others have been advocating the use of admin users and standard users on Windows. I'm an experienced Windows developer too, and regularly build machines, but I've always had admin access for a single user. This time, I am putting a Vista Home Premium 64 bit machine together for my son and thought I'd take your advice but I have really struggled with multi users.

    When logged in as the standard user, installations of software are hit and miss. For example, 3DMark06, Shockwave 10 and Gamespot Download Manager failed to install correctly as standard user (with admin priv. when prompted for password). All 3 failed installs required me to switch user to the admin and repeat the installs. Plus many installations require me to enter my password several times, not just during the install, but when the program runs for the first time (usually for firewall access or updates).

    All of this is very unhelpful, because my son will no doubt want to install software during his use of the computer, and so will come to me saying "Dad, I want to install {Counter Strike | some web plugin | a screensaver} and Vista is bugging me again" so I will look at what he's installing and type my password in to approve access, and then I go back to what I was doing. But I will now be waiting for the "DAD!!!… it doesn't work" follow up because the install failed.

    I will then have to switch user to admin, repeat my son's actions to access the install program, wait for install to finish, run the app to approve any firewall or other permissions, and then log off. I'm most definitely for the responsible parental control of PCs, but this is a monumental and entirely unnecessary waste of my time.

    This is partially the fault of Windows software developers who fail to test as standard user. It's disappointing, but understandable, since running as Administrator has long been institutionalized in Windows. It's also a particular problem for users who need to install lots of software for whatever reason. In contrast, my wife runs fine as limited user, but she almost never installs software of any kind. I hope more Windows developers are testing their software when running as a standard user, and in time running as a standard user will become as easy(ish) as it is on a Unix based OS.

  2. Traditional Anti-Virus Doesn't Work Any More

    The blacklist approach used by anti-virus vendors simply doesn't scale to today's threat environment. Blacklists are never particularly effective. But it's getting to the point where the illusion of protection afforded by a traditional anti-virus solution is worse than no protection at all:

    Let's suppose somebody who is involved with incident response at a typical US public University collected a few recent malware samples from the compromised machines, and then submitted all the samples to VirusTotal for scanning against all current anti-virus and anti-virus-like products. What do you think the average detection rate is?

    Let me give you the answer: it is 33%. In other words, the average detection rate of malware from these "solutions" was 33%, with the maximum at 50% and the minimum at 2%. Keep this number in mind, that shiny anti-virus product you just bought might be protecting you from just 2% of currently active and common malware (not some esoteric and custom uber-haxor stuff)!

    I have to conclude what many security pundits were blabbing about for years: "mainstream" anti-virus is finally DEAD. It's a weak excuse for defense-in-depth, in about the same sense as wearing an extra shirt provides "another security layer" in a gun fight.

    Not only does anti-virus cripple your machine's performance, it doesn't even protect you adequately! Even if your anti-virus or anti-malware solution is catching an incredibly optimistic 90% of threats, all it takes is one new, undetected threat to get through and your machine is thoroughly 0wned.

    And I do mean 0wned. These aren't your father's happy99.exe trojans. Today's threats have evolved into very sophisticated beasts. I got a chill when Scott mentioned so casually that the payload of the eCard trojan is a rootkit that redirects all DNS queries to a compromised DNS server. That's a worst case scenario which is becoming increasingly common. Good luck detecting a threat which subverts the very kernel of the operating system. Traditional programming techniques don't work; you need to fight fire with fire and hire kernel hackers of your own to pit against theirs. This leads to a kind of software armageddon that nobody can really "win": you're left with a wake of destroyed operating systems and thoroughly defeated users.

  3. The Mainstreaming of Virtual Machine Sandboxes

    Running as non-administrator should be absolutely standard, as it is one of the few security techniques which has a proven track record. But, with sufficient desire and initiative, naive or malicious users can still subvert the limited user account. If users want to see the dancing bunnies bad enough -- or, in Scott's case, they want to see the eCard someone "sent" them -- they'll type the administrator password in and escalate. Forget about protecting users from malicious threats. Now you have to deal with a far more difficult problem: how do you protect users from themselves? I think virtualization is the only rational way to protect users from themselves-- and that's why virtualization is the next great frontier for computer security.

    Full-machine virtualization as seen in Virtual PC 2007 and VMWare is one way to achieve this, and it's a completely natural use for the obscene amount of local processing power we have on our desktops. But there's also software virtualization, which isolates all disk access from individual applications. Earlier this year, Google acquired GreenBorder technologies, which used software virtualization to isolate the browser from the disk and completely prevent any malware attacks. Their product is no longer distributed while they do whatever it is they're doing as a part of Google, but for context, you can read a review of their original product, GreenBorder Pro, at PC Magazine. Note the "does not need signature updates" notation in the review. With virtualization, you stop caring about blacklists and signature updates; you're protected against any possible threat, now or in the future.

    Well, except for the rare threats that target the virtualization layer, but that's a much tougher nut to crack.

Most of all, I dislike the culture of fear that permeates Windows security software marketing. I don't think it's ethical to scare users into buying your security software product-- and it also creates a huge conflict of interest between the security software vendors and the virus, malware, and trojan creators. After all, why would we buy anti-virus software if, like Mac OS X users, we had almost no risk of being infected by a virus, malware, or trojan? Windows security software vendors need the threats-- and the more credible and fearsome the threats, the better-- to make money. They have no economic incentive to support an environment where threats are ineffective. The status quo of weak Windows security suits them just fine. It sells their products.

detail of Edward Munch painting, The Scream

I believe we can solve the Windows security epidemic without using fear as a marketing tactic. We need to stop relying on the illusory, expensive protection of anti-virus blacklists, and start implementing better solutions. We already have the ability to run as a limited user account today. It's too bad the powers that be at Microsoft didn't have the guts to pull the trigger on limited user accounts as a standard setup in Vista. But that shouldn't stop us. We should have the guts to pull the trigger ourselves. And if we add a little virtualization to the mix, I think we can almost completely eliminate most security threats. Windows anti-virus software is considered mandatory today. But I'd love to see a day where, as on OS X and every other Unix operating system variant, anti-virus software is viewed as unnecessary, even superfluous.

Discussion

Dell XPS M1330 Review

Although I wasn't unhappy with my ASUS W3J laptop, which I've owned for a little over a year now, it was never quite the ultraportable to match my beloved, dearly departed three pound Dell Inspiron 300M. That's why I recently purchased a Dell XPS M1330 laptop.

Dell XPS M1330

I've been eyeing laptops with LED displays and solid-state hard drives for a while now, long before I ever saw the Dell M1330. But the fact that...

  • it offers the required LED display and SSD drive options
  • and it's a sub-four-pound ultralight
  • and it offers a non-integrated graphics option, which is incredibly rare for an ultralight
  • and it has ridiculously good design for a Dell

... sort of pushed me over the edge. Plus there are all these rave reviews of the M1330 coming in from PC Magazine, Notebook Review, and CNET. And I do mean rave reviews. It's not often you see the jaded PC Magazine reviews dish out this kind of praise:

It's been a while since Dell delivered a laptop that possessed so many awe-inspiring features. The Dell XPS M1330 is a monumental step in that it takes the best things from other great ultraportables and combines them into a single entity. My only peeve is that the weight can get up there with the nine-cell battery. Otherwise, this ultraportable should easily sit at the top of any laptop shopping list.

It's strange, in a way, because the M1330 isn't much of an upgrade from the W3J in terms of absolute hardware specifications. The display sizes are almost the same, both offer 2.0 GHz dual-core CPUs, and the M1330 is even a downgrade in one area: I ordered it with a hard drive that's less than half the size of the W3J. It's more of a sidegrade than a pure upgrade. The resulting Windows Experience benchmark scores are almost the same for both laptops, too.

Of course, the first thing I did after getting the machine was format the hard drive. It's a sad fact of life in the PC ecosystem, but if you want a machine clean of bloatware and useless, paid-endorsement installed craplets (including Google Desktop, I might add), you have to raze it to the ground yourself immediately after unboxing it.

This is a particularly egregious problem on the 32 GB solid state hard drive, because it had a 10 GB "restore" partition, and a 6 GB "media direct" partition pre-installed from the factory. Nothing like booting up a system with an already-limited 32 GB storage device and finding you only have 16 GB of disk space available. Way to go, Dell.

After formatting and beginning a clean install of Vista, I ran into a little problem where the machine would bluescreen immediately on startup after the install. I found that switching the hard drive interface from AHCI back to standard fixed that problem. According to the BIOS warning, this precludes the use of Intel's Robson onboard Flash memory cache, but with a solid state hard drive in play I don't think that's much of a loss. UPDATE: it's a better idea to install the proper AHCI driver during the Vista install process, because that's the only time you can make the switch! Copy the "Intel SATA driver" to a USB flash drive, and specify alternate driver during the drive selection phase. The only drivers you'll need for a clean 32-bit Vista install are video, sound, and wired network-- all available from the Dell XPS M1330 driver download page. Everything else is included in the default set of Vista drivers. Beware, though, because 64-bit drivers aren't available for the video card yet.

I've only had the machine since Tuesday, so I'm not really in a position to provide a comprehensive review. But after being one of the fortunate few to actually receive an M1330, I have to agree with the glowing reviews. This is an outstanding machine for people like me who think laptops were meant to be portable first and foremost.

Perhaps the most striking thing about the machine is the 32 gigabyte solid-state hard drive. It's blazingly fast and completely silent. I have gotten so used to the low, metronomic rumbling of hard drives when my computers are working that the complete absence of sound in normal operation is rather strange. All you can hear is a bit of very quiet, high-pitched electronic buzzing, and only if you put your ear very close to the machine.

The downside, of course, is that it's only 32 GB in size. It's definitely a little tight. I wasn't too worried, because when I priced this option-- and it's not a cheap option at $500-- I was already using less than 32 GB of disk space on my current ASUS W3J laptop, which has a fairly typical 80 GB laptop hard drive. I tend to run a minimalistic laptop configuration; with Vista Ultimate, Visual Studio 2005, Office 2007, Streets and Trips, Photoshop Elements, and a few other things installed, I have almost 12 GB of disk space free on the 32 GB solid state drive. It's not quite as bad as it sounds; I carry a 100 GB external USB 2.5" hard drive in my bag as a matter of course, for virtual machines and other large items. TreeSize was always one of my key utilities; on this machine, it's my new best friend.

Treesize on a 32 GB SSD in Windows Vista
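TreeSize's core job-- recursively tallying how many bytes each directory tree consumes-- can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the actual tool:

```python
import os

def tree_size(path):
    """Recursively total the bytes of all files under a directory, TreeSize-style."""
    total = 0
    for entry in os.scandir(path):
        if entry.is_file(follow_symlinks=False):
            total += entry.stat(follow_symlinks=False).st_size
        elif entry.is_dir(follow_symlinks=False):
            total += tree_size(entry.path)
    return total
```

Point it at a drive root and sort the top-level directories by the result, and you have the gist of what a disk space analyzer shows you.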

32 GB of space is enough to get by as a primary hard drive, but it definitely makes you realize how spoiled we've all become with our standard ginormous physical hard drives. Hard drive space is one of those things we stopped worrying about years ago; 500 GB desktop drives and 100 GB laptop drives are dirt cheap. But with a smallish SSD, you have to start caring about disk space again. On a machine with 2 GB of memory, that mandatory 2 GB hibernate file on disk, plus the 1.5+ GB swap file, starts to sting a bit. You can imagine what this would be like on a 64-bit operating system with 4 GB of memory-- you'd be dropping almost a sixth of your disk space on pure overhead!
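The overhead math is easy to check, assuming the hibernate file matches RAM size and the swap file stays around 1.5 GB:

```python
disk_gb = 32.0

# 2 GB of RAM: 2 GB hibernate file plus the ~1.5 GB swap file
overhead_2gb = 2.0 + 1.5
print(f"2 GB RAM: {overhead_2gb / disk_gb:.0%} of the disk")  # ~11%

# 4 GB of RAM on a 64-bit OS: the hibernate file alone doubles
overhead_4gb = 4.0 + 1.5
print(f"4 GB RAM: {overhead_4gb / disk_gb:.0%} of the disk")  # ~17%, almost a sixth
```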

Size (and, well, price) is the only thing keeping solid state hard drives from being a no-brainer on laptops. It'll be a lot easier to stomach the size restriction when 64 gigabyte solid state hard drives are more widely available. And they're even faster:

Samsung claims the respective read and write performance on the SSD drive have been increased by 20 and 60 percent: the 64 GB unit can read at 64 MB/s, write at 45 MB/s, and consumes just half a watt when operating -- and merely one tenth of a watt when idle. In comparison, a mechanical 80 GB 1.8-inch hard drive reads at 15 MB/s, writes at 7 MB/s, and eats 1.5 watts either operating or when idle.
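Taking those quoted figures at face value, the practical gap is easy to put in concrete terms-- say, reading a CD-sized 700 MB disc image off each drive:

```python
file_mb = 700  # a hypothetical CD-sized disc image

# sequential read rates quoted above, in MB/s
for name, read_mb_s in [("64 GB SSD", 64), ("1.8-inch HDD", 15)]:
    seconds = file_mb / read_mb_s
    print(f"{name}: {seconds:.0f} s to read {file_mb} MB")  # 11 s vs. 47 s
```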

After using a machine with a solid-state hard drive for a few days, it's clear to me that solid-state hard drives are absolutely the future for all laptops, and possibly even for desktops in some scenarios. You boot up faster, you shut down faster, and launching applications feels instantaneous. On top of all that, it uses almost no power and produces virtually zero noise or heat. They just need to get the prices down and the sizes up, which will come naturally enough in time. As William Gibson said, the future is already here-- it's just unequally distributed.

It's hard to quantify these sorts of things, but I also greatly prefer the aesthetics of the M1330 over my old W3J. For one thing, the wedge shape is much more natural; the keyboard descends to meet your hands and the desktop, and it's always angled up in traditional keyboard form. I'm not sure if it's my imagination or not, but the feel of the keys is better too. One thing I can quantify is that the horrible touchpad arrangement on the W3J, where the sides and bottom are hard-coded to be scroll areas, thankfully does not exist on the M1330. I love touchpads, and it killed me to have a crappily implemented one. That was my one major beef with the W3J.

The XPS M1330 is a proper spiritual successor to my all-time favorite Inspiron 300M. It's not quite the flyweight 3 pound champion the 300M was, but it's far more powerful and much more technologically advanced. It's also prettier, with its remarkably un-Dell-like svelte, sleek design. Be careful, though, because many of the things that make the M1330 so great are, bizarrely, add-on options-- like the solid-state hard drive, the LED display, the discrete NVIDIA 8400M GS graphics, and even the Bluetooth adapter. My only real criticism is the slot-load DVD writer; it's clever, but clever in an unnecessary way. Who still uses ye olde DVDs or CDs in this era of cheap 4 GB flash drives, broadband, and ubiquitous gigabit ethernet? I do wish they had dropped the optical drive to reduce the weight a bit further, but it's a minor complaint. Overall, I love the M1330, and I'd recommend it unconditionally to anyone who shares my preference for an uncompromising, ultralight laptop.
