Coding Horror

programming and human factors

My Work PC, or, Taking Your Own Advice

I recently had the opportunity to rebuild my work PC. It strongly resembles the "Little Bang" D.I.Y. system I outlined in my previous post on the philosophy of building your own computer.

Wumpus PC at Vertigo, under desk

Wumpus PC at Vertigo, closeup

See, I do take my own advice.

Here's a quick breakdown of the components and the rationale behind each. Every aspect of this system has been a blog post at one point or another.

  • ASUS Vento 3600 case (green)

    Is there anything more boring than a beige box? The Vento is a little aggravating to work on, and it's a bit bulky. But it's unique, a total conversation starter, and the sparkly green model fits the Vertigo color scheme to a T. I even built my wife a PC using the red Vento. The 3600 has been discontinued in favor of the 7700; the newest version is, sadly, much uglier.

  • MSI P6N SLI, NVIDIA 650i chipset

    The 650i is a far more economical variant of the ridiculously expensive NVIDIA 680i chipset, offering the same excellent performance. Dual PCI Express slots, for two video cards, are a must in my three-monitor world. It also has a fairly large, passive thin-fin northbridge cooler; the quality of the chipset cooling matters, because modern motherboard chipsets can dissipate upwards of 20-30 watts all by themselves. And it still runs blazingly hot, even at idle.

  • Intel Core 2 Duo E6600

    The Core 2 Duo is Intel's best processor in years. I opted for the E6600 because I have an unnatural love for large L2 caches, but even the cheapest Core 2 Duo runs rings around the competition. And all the Core 2 Duos overclock like mad. This one is running at 3 GHz with a very minor voltage bump for peace of mind (see the quick arithmetic after this list).

  • Antec NeoHE 380 watt power supply

    Great modular cable power supply, with around 80% efficiency at typical load levels (there's a quick illustration of what that means after this list). It's extremely quiet, per the SPCR review. It's a myth that you "need" a 500 watt power supply, but 380 W is about the lowest model you can buy these days. The quality of the power supply is far more important than any arbitrary watt number printed on its side.

  • Scythe Ninja heatsink

    The Ninja, despite the goofy name, offers superlative performance. It is easily one of the all-time greatest heatsinks ever made, and still a top-rank performer. It's quite inexpensive these days, too. As you can see, I am running it fanless. The Ninja is particularly suited for passive operation because of its widely-spaced fins. It's easily cooled passively, even under an overclocked, dual Prime95 load, by the 120 mm exhaust fan directly behind it. (Disclaimer: I have a giant heatsink fetish.)

  • Dual passive GeForce 7600 GT 256 MB video cards

    The 7600 GT was the runaway champ in the video card power/performance analysis research I did last summer. The model I chose is a passively cooled, dual slot design from Gigabyte (model NX76T256D). It offers outstanding performance, it runs cool, it has dual DVI, and the design is clever. I liked this card so much, I bought two of them. Not for SLI (although that's now an option) but for more than two monitors. It's inexpensive, too, at around $115 per card.

  • 2 GB of generic DDR2-800

    I don't believe in buying expensive memory. It's not worth it, unless you're an extreme overclocker. I buy cheap, reasonable quality memory. Even the cheap stuff overclocks fairly well, at least for the moderate overclocks I'm shooting for.

  • 74 GB 10,000 RPM primary hard drive; 300 GB 7,200 RPM secondary hard drive

    I cannot emphasize enough how big the performance difference is between 10,000 RPM drives and 7,200 RPM drives. I know it's a little expensive, but the merits of the faster drive, plus the flexibility of having two spindles in your system, makes it well worth the investment.
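A bit of back-of-the-envelope arithmetic behind two of those choices. The 9x multiplier is the E6600's stock multiplier; the power figures are round numbers chosen for illustration, not measurements from this particular box:

    E6600 at stock:      266 MHz FSB x 9 = 2.4 GHz
    E6600 overclocked:   333 MHz FSB x 9 = 3.0 GHz

    PSU at 80% efficiency: drawing 200 W at the wall delivers about 160 W to the components;
    the remaining 40 W is lost as heat inside the power supply.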

And it's quiet, too. The entire system is cooled by three fans: one 120 mm exhaust, the 80 mm fan in the power supply, and an optional 80 mm fan I installed in the front of the case to keep hard drive temperatures down. Airflow in the hard drive area is quite limited on the Vento.

One of the advantages of a D.I.Y. system is that you can perform relatively inexpensive upgrades instead of buying an entirely new computer. The most recent one was plopping in the new motherboard/CPU/memory/heatsink combo. With that upgrade, I now have a top-of-the-line dual-core PC running at 3.0 GHz-- and it only cost me $650 to get there.


Building a Computer the Google Way

If you're ever in Silicon Valley, I highly recommend checking out the Computer History Museum. Where else can you see a live demonstration of the only known working PDP-1 in existence, and actually get to play the original Spacewar on it? I did. It was incredible. I got chills. And my wife was bored beyond belief, but I love her all the more for soldiering through.

Beyond the special exhibits, the Visible Storage area is where the real action is at in the museum. It takes up the majority of the floor space, and it contains every computer I've ever heard of. Among the artifacts in visible storage is one of Google's original servers from 1999:

Google Server at the Computer History Museum, rack from afar   Google Server at the Computer History Museum, closeup of rack

Google Server at the Computer History Museum, placard: With limited funds, Google founders Larry Page and Sergey Brin initially deployed this system of inexpensive, interconnected PCs to process many thousands of search requests per second from Google users. This hardware system reflected the Google search algorithm itself, which is based on tolerating multiple computer failures and optimizing around them. This production server was one of about thirty such racks in the first Google data center. Even though many of the installed PCs never worked and were difficult to repair, these racks provided Google with its first large-scale computing system and allowed the company to grow quickly and at minimal cost.

If Google's first production server resembles a hastily cobbled together amalgam of off-the-shelf computer parts circa 1999, well, that's because it is. Just like Google's original servers at Stanford. If you think this rack is scary, you should see what it replaced.

Instead of buying whatever pre-built rack-mount servers Dell, Compaq, and IBM were selling at the time, Google opted to hand-build their server infrastructure themselves. The sagging motherboards and hard drives are literally propped in place on handmade plywood platforms. The power switches are crudely mounted in front, the network cables draped along each side. The poorly routed power connectors snake their way back to generic PC power supplies in the rear.

Some people might look at these early Google servers and see an amateurish fire hazard. Not me. I see a prescient understanding of how inexpensive commodity hardware would shape today's internet. I felt right at home when I saw this server; it's exactly what I would have done in the same circumstances. This rack is a perfect example of the commodity x86 market D.I.Y. ethic at work: if you want it done right, and done inexpensively, you build it yourself.

Even today, Google is serious about exerting total control over the servers in their now-massive server farms. They build their own high-efficiency power supplies, and conduct fascinating, public research on disk failure (pdf). Current estimates put Google's server farm at around 450,000 machines-- and they're still custom built, commodity-class x86 PCs, just like they were in 1999.

Like Google, I demand total control over every part of my PC. I've always built my own. Building your own PC isn't for everyone, but if you're willing to add a little elbow grease, the D.I.Y. approach can result in a higher quality, better performing PC-- often at a substantial cost savings.

Here's a chart I put together based on my research for the Scott Hanselman Ultimate Developer Rig Throwdown:

          | D.I.Y. "Big Bang"                                 | D.I.Y. "Little Bang"          | Mac Pro                       | Dell XPS 710             | Dell Dimension 410
CPU       | Intel Core 2 Quad 2.4 GHz                         | Intel Core 2 Duo 2.4 GHz      | 2 x Intel Core 2 Duo 2.66 GHz | Intel Core 2 Duo 2.4 GHz | Intel Core 2 Duo 2.4 GHz
Memory    | 4 GB, DDR 800                                     | 2 GB, DDR 800                 | 1 GB, DDR ECC 667             | 2 GB, DDR 667            | 2 GB, DDR 667
Mobo      | P965 premium                                      | P965 budget                   | Intel 5000X                   | unknown                  | unknown
Drives    | 2 x 150 GB 10k RPM (RAID 0), 2 x 750 GB (RAID 1)  | 500 GB                        | 250 GB                        | 500 GB                   | 500 GB
Video     | 2 x 512 MB X1950 Pro                              | 256 MB X1950 Pro              | 256 MB 7300 GT                | 256 MB 7900 GS           | 256 MB 7900 GS
Case      | Antec P180                                        | Antec P180                    | Apple                         | XPS                      | Dimension
Other     | Premium PSU, premium heatsink                     | Premium PSU, premium heatsink | OS X, bundled software        | Windows Vista            | Windows Vista
Price     | $3,500                                            | $1,400                        | $2,499                        | $2,039                   | $1,400

If you're willing to factor out the cost of the operating system, the D.I.Y. "Little Bang" system offers more bang for the buck than any of its peers. And the "Big Bang" is off the charts, if you have the budget.

The lower-end Dell system looks quite similar, but closer inspection reveals otherwise:

  • Dell's use of non-standard case connectors and power supply connectors prevents future upgrades using standard commodity parts.
  • The OEM parts used in Dell machines are generally of inferior quality to their retail equivalents. OEM parts are impressive on the surface, but cut corners to lower costs. For example, the use of slower DDR 667 memory; cut-down, featureless OEM motherboards; video cards with lower clocks and slower memory.
  • Absolutely no overclocking potential.
  • Limited internal case expansion for additional hard drives and video cards.

The Mac Pro is a beautifully designed machine, but it has some quirks, too:

  • Quad-core in a single socket, a la the "Big Bang" system, makes more sense than this dual-dual arrangement. Obviously Apple will produce a dual-quad for a total of 8 cores any day now. But there is a serious point of diminishing returns with additional CPUs unless you're doing something highly specific and highly parallelizable, like raytracing or rendering.
  • Requires expensive DDR2 buffered ECC RAM, because it's a server motherboard.
  • Zero overclocking options.
  • 667 MHz memory? Not that it matters very much to bottom-line performance, but support for different FSB speeds would be nice.
  • The default video card and hard drive are totally pedestrian, and will limit overall performance unless replaced.

If you don't have the time or inclination to build your own desktop PC, the Dells and the Mac Pro are perfectly valid choices. The prices are reasonable; the configurations flexible. There's absolutely nothing wrong with buying pre-built, as long as you spec carefully. But by the time I'm done setting up my D.I.Y. "Little Bang" system, it'll be faster, quieter, and more power efficient than any of the pre-built systems-- for the same money, or less. This is possible because the D.I.Y. system is uniquely mine; I choose exactly what goes in it, and exactly how it's configured.

Pre-built might work for typical users. But pre-built didn't work for Google. And pre-built doesn't work for me.

We aren't typical users. We're programmers. The x86 commodity PC is the essential, ultimate tool of our craft. It's the end product of 30 years of computer evolution. And it's still evolving today, with profound impact on the way we code. If you treat your PC like an appliance you plug into a wall, you've robbed yourself of a crucial lesson on the symbiotic relationship between software and hardware. The best way to truly understand the commodity PC is to gleefully dig in and build one yourself. Get your hands dirty and experience the economics of computer hardware first hand-- the same economics that have shaped the software industry since the very first line of code was stored in memory.

Who knows, you might even enjoy it.


Software Internationalization, SIMS Style

Internationalization of software is incredibly challenging. Consider this Wikipedia sandbox page in Arabic, which is a right-to-left (RTL) language:

Wikipedia sandbox in Arabic

Compare that layout with the Wikipedia page on internationalization and localization in English. Now consider how you'd implement switching between English and Arabic in MediaWiki, the software that powers Wikipedia:

  • Every bit of static text on the page has to come out of a unicode string resource file, indexed per-culture.
  • Images that happen to contain text, or are otherwise culture-specific, must also be placed in a resource file and indexed per-culture.
  • Numbers, currency, and dates must be displayed (and validated) differently depending on what country your audience lives in.
  • You could detect the country your users are in and automatically assume which language they speak. But this is obviously problematic in countries where multiple languages are spoken. Or, you can allow users to manually choose a language the first time they access your application. This is slightly easier in web applications, because you can absorb the ambient language setting from the browser's HTTP headers. (A minimal sketch of these mechanics follows this list.)
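To make those mechanics concrete, here's a minimal sketch in Python. The resource strings, culture codes, and date patterns are purely illustrative (this is not how MediaWiki actually does it); a real application would pull strings from gettext catalogs or resource files and lean on a proper locale library for formatting:

    from datetime import date

    # Per-culture string resources; every bit of static text is looked up by key.
    RESOURCES = {
        "en": {"greeting": "Welcome", "save": "Save"},
        "ar": {"greeting": "أهلاً وسهلاً", "save": "حفظ"},
    }

    # Date conventions differ per culture: month-first vs. day-first, for example.
    DATE_PATTERNS = {
        "en": "{m}/{d}/{y}",
        "ar": "{d}/{m}/{y}",
    }

    def translate(key, culture):
        """Look up a static string for the culture, falling back to English."""
        return RESOURCES.get(culture, RESOURCES["en"]).get(key, RESOURCES["en"][key])

    def format_date(d, culture):
        """Format a date according to the culture's conventions."""
        pattern = DATE_PATTERNS.get(culture, DATE_PATTERNS["en"])
        return pattern.format(d=d.day, m=d.month, y=d.year)

    def pick_culture(accept_language, supported=("en", "ar")):
        """Absorb the browser's Accept-Language header and pick a supported culture."""
        for entry in accept_language.split(","):
            code = entry.split(";")[0].strip().split("-")[0].lower()
            if code in supported:
                return code
        return "en"

    culture = pick_culture("ar-EG,ar;q=0.9,en;q=0.8")
    print(translate("greeting", culture), format_date(date(2007, 6, 18), culture))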

It's a lot of work.

Beyond the purely mechanical grunt work of translation, there are deeper cultural issues to consider, such as avoiding offensive images, colors, or concepts for certain cultures – and how the concepts you're trying to express in the software will map to a given culture. As noted in a related Larry Osterman post, these deeper cultural considerations are collectively known as localization:

[localization] is a step past translation, taking the certain communication code associated with a certain culture. There are so many aspects you have to think about such as their moral values, working styles, social structures, etc... in order to get desired (or non-desired) outputs. This is one of the big reasons that automated translation tools leave so much to be desired - humans know about the cultural issues involved in a language, computers don't.

The Sims has a unique solution that sidesteps the software internationalization problem: its developers invented an entirely new, completely artificial language, Simlish. Simlish renders your cultural background irrelevant. When you redefine language as gibberish, it's equally meaningless to everyone. Or is it? Somehow, The Sims is playable without a lick of translation or localization, without any comprehensible language of any sort.

Signs in The Sims games often do not contain text; they consist entirely of graphics. For instance, the stop sign in The Sims is a red octagon with a flat, white hand. In The Sims 2 it becomes a white bar instead. The sign for a grocery store depicts a cornucopia, and that of a restaurant shows a hamburger or a place setting.

In The Sims, most text is only distinguishable at very close zooms. On book covers, newspapers and Nightlife's "Sims Must Wash Hands" sign, the lettering is all nonsense characters that bear about as much resemblance to Latin characters as they do to Cyrillic. Almost no actual characters from any known alphabet are used. The game uses the Simoleon sign (closely resembling §) as the currency symbol.

The Simlish alphabet

When Sims are writing novels or term papers, dingbats from the Wingdings font appear as text on the screen. The notebooks used for homework contain writing composed of random lines.

Characters in The Sims don't just write in Simlish – they speak it, too:

When The Sims was originally designed, Will Wright wanted the language the Sims spoke to be unrecognizable but full of emotion. That way, every player could construct their own story without being confined to a Maxis-written script (to say nothing of the mind-numbing repetition). We experimented with fractured Ukrainian, and the Tagalog language of The Philippines. Will even suggested that perhaps we base the sound on Navajo, inspired by the code talkers of WWII. None of those languages allowed us the sound we were looking for – so we opted for complete improvisation.

Simlish is, by definition, meaningless. And yet it's surprisingly easy to figure out what a Sim is talking about, even without any visual point of reference or a facial expression to read. The intonation and context of the sounds is enough to extract meaning. Try these two Simlish MP3 samples (one, two) and hear for yourself.

Simlish even extends to music. Last year, Maxis paid many original artists to re-record their songs with Simlish lyrics:

Each artist rerecorded one of their songs with new vocal tracks, replacing English lyrics with nonsensical Sim-speak. Simlish words don't have any real meaning, so the artists were free to come up with whatever sounded good, as long as English didn't seep in. The result isn't that different from what bands like the Cocteau Twins and Vas already do. The idea is to transcend words and use the human voice to express pure emotion.

Charlotte Martin, whose song "Beautiful Life" finds its way onto the University soundtrack, took things a step further than some of the other artists. She didn't just sing gobbledygook, she made sure all the Simlish words were consistent with their counterparts in the English version. "It still had the same meaning, I just had to write it in an alien language," Martin said. In rewriting the song, Martin said it changed the way she thinks about lyrics, letting her come at her creation from a more technical standpoint, paying closer attention to syllables and rhythm.

Probably the funniest example of this is the Pussycat Dolls' re-recording of "Don't Cha" in Simlish.

Listen to "Don't Cha" in Simlish (mp3). Singing in gibberish almost makes a Pussycat Dolls song more intelligible. It's brilliant. Doba, baby, doba!

Another example is Lily Allen's "Smile". Compare the original version of "Smile" with the Simlish re-recording of "Smile". It works well for that cheeky little song, but it's a little weirder when a morose band like Depeche Mode re-records a song in Simlish.

When you hear Simlish, you expect to hear meaningless gibberish. But instead, you hear something else, something unexpected. The absence of language isn't limiting; it's liberating. You move beyond language, from expressing with words to expressing visually, aurally, emotionally:

For songstress Abra Moore, whose song "Big Sky" was used in the game, singing in Simlish gave her a new perspective on her music. "It's like jazz for me; I just take to it like a duck to water," Moore said. "It was very liberating creatively." The experience made such an impression on Moore that she said she'd consider recording a song in Sim-like scat on a future album. She perceives the emotional lyrics, divorced of a specific meaning, in almost a spiritual light. She's fascinated that fans try to interpret the nonsensical lyrics. It represents the essence of human nature, Moore said, to take meaning from something that has no meaning.

Spoken words and music are dense with multiple levels of audible meaning. We probably can't take such Simlish liberties with applications and web sites, which are anchored on the flat, one-dimensional medium of text. The challenges of i18n and l10n are unavoidable for us. But as the Sims shows us, there's a lot to be said for following human conventions which work across all languages and cultures.


Dude, Where's My 4 Gigabytes of RAM?

Due to fallout from a recent computer catastrophe at work, I had the opportunity to salvage 2 GB of memory. I installed the memory in my work box, which brings it up to 4 gigabytes of RAM-- 4,096 megabytes in total. But that's not what I saw in System Information:

Vista System Information, 4 GB installed, 32-bit operating system

Only 3,454 megabytes. Dude, where's my 4 gigabytes of RAM?

The screenshot itself provides a fairly obvious hint why this is happening: 32-bit Operating System. In any 32-bit operating system, the virtual address space is limited, by definition, to the size of a 32-bit value:

2^32 = 4,294,967,296
4,294,967,296 / (1,024 x 1,024) = 4,096

As far as 32-bit Vista is concerned, the world ends at 4,096 megabytes. That's it. That's all there is. No más.

Addressing more than 4 GB of memory is possible in a 32-bit operating system, but it takes nasty hardware hacks like 36-bit PAE extensions in the CPU, together with nasty software hacks like the AWE API. Unless the application is specifically coded to take advantage of these hacks, it's confined to 4 GB. Well, actually, it's stuck with even less-- 2 GB or 3 GB of virtual address space, at least on Windows.
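If you're curious which world your own process lives in, a quick Python check of the native pointer size makes the ceiling concrete. The figures it prints are theoretical address-space limits, not what Windows will actually hand a process:

    import struct

    # Size of a native pointer, in bits, for the running interpreter.
    pointer_bits = struct.calcsize("P") * 8

    # The theoretical ceiling of a flat address space that wide, in bytes.
    address_space_bytes = 2 ** pointer_bits

    print(f"{pointer_bits}-bit process")
    print(f"theoretical address space: {address_space_bytes / (1024 ** 3):,.0f} GB")
    # A 32-bit process prints 4 GB; a 64-bit process prints 17,179,869,184 GB (16 exabytes).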

OK, so we're limited to 4,096 megabytes of virtual address space on a 32-bit operating system. Could be worse.* We could be back in 16-bit land, where the world ended at 64 kilobytes. Brr. I'm getting the shakes just thinking about segments, and pointers of the near and far variety. Let us never speak of this again.

But back to our mystery. Where, exactly, did the other 642 megabytes of my memory go? Raymond Chen provides this clue:

In the absence of the /PAE switch, the Windows memory manager is limited to a 4 GB physical address space. Most of that address space is filled with RAM, but not all of it. Memory-mapped devices (such as your video card) will use some of that physical address space, as will the BIOS ROMs. After all the non-memory devices have had their say, there will be less than 4GB of address space available for RAM below the 4GB physical address boundary.

Ian Griffiths offers a more detailed explanation:

To address 4GB of memory you need 32 bits of address bus. (Assuming individual bytes are addressable.) This gives us a problem - the same problem that IBM faced when designing the original PC. You tend to want to have more than just memory in a computer - you need things like graphics cards and hard disks to be accessible to the computer in order for it to be able to use them. So just as the original PC had to carve up the 8086's 1MB addressing range into memory (640K) and 'other' (384K), the same problem exists today if you want to fit memory and devices into a 32-bit address range: not all of the available 4GB of address space can be given over to memory.

For a long time this wasn't a problem, because there was a whole 4GB of address space, so devices typically lurk up in the top 1GB of physical address space, leaving the bottom 3GB for memory. And 3GB should be enough for anyone, right?

So what actually happens if you go out and buy 4GB of memory for your PC? Well, it's just like the DOS days - there's a hole in your memory map for the IO. (Now it's only 25% of the total address space, but it's still a big hole.) So the bottom 3GB of your memory will be available, but there's an issue with that last 1GB.

And if you think devices can't possibly need that much memory-mapped IO, I have some sobering news for you: by this summer, you'll be able to buy video cards with 1 GB of video memory.
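Putting the numbers from my own screenshot together makes the hole tangible:

    4,096 MB   total 32-bit physical address space
    -  642 MB  claimed by memory-mapped devices, BIOS ROMs, and other reservations
    = 3,454 MB left over for RAM -- exactly what System Information reports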

To be perfectly clear, this isn't a Windows problem-- it's an x86 hardware problem. The memory hole is quite literally invisible to the CPU, no matter what 32-bit operating system you choose. The following diagram from Intel illustrates just where the memory hole is:

Intel system memory map

The proper solution to this whole conundrum is to use a 64-bit operating system. However, even with a 64-bit OS, you'll still be at the mercy of your motherboard's chipset and BIOS; make sure your motherboard supports using 4 GB or more of memory, as outlined in this MSKB article.

2^64 = 18,446,744,073,709,551,616
18,446,744,073,709,551,616 / (1,024 x 1,024) = 17,592,186,044,416 megabytes, or 16 exabytes

In case you're wondering, the progression is giga, tera, peta, exa.

Although the performance benefits of 64-bit are somewhat dubious on the desktop, a 64-bit OS is absolutely essential if you run applications that need to use more than 2 GB of memory. It's not common, but we're getting there.

The memory hole for IO still exists in the 64-bit world, but most modern BIOSes allow you to banish the IO memory hole (pdf) to some (for now) ridiculously high limit when you're running a 64-bit OS. Don't get too excited, though. The user-mode virtual address space in 64-bit Windows is a mere 8 terabytes. Suffice it to say that we won't be running out of physical or virtual address space on 64-bit operating systems for the foreseeable future. It's the final solution, at least for the lifetime of everyone reading this blog post today.

Here's one parting bit of advice: if, like me, you're planning to stick with a 32-bit operating system for the next few years, don't waste your money on 4 GB of RAM. You won't be able to use it all. Buy 3 GB instead. Every motherboard I'm aware of will happily accept 2 x 1 GB and 2 x 512 MB DIMMs.

* Could be raining.


Getting the Most Out of PNG

When it comes to image formats on the internet, it's generally a three-way tie between JPEG, GIF, and PNG. Deciding which image format to use is relatively straightforward; you choose lossy JPEG when you're saving continuous-tone photographic images, and you choose between lossless GIF and lossless PNG when you're saving images with large blocks of the same or similar colors. See my comparison of GIF/PNG and JPEG if you're not clear on what the difference is. But the choice between GIF and PNG is no contest. PNG is a more modern and vastly improved version of GIF that (almost) completely obsoletes it. You should always choose PNG over GIF, except in the following two circumstances:

  • You want an animated graphic. PNG doesn't support animation. GIF does.
  • Your image is extremely small, on the order of a few hundred bytes. In my experience, GIF filesizes are smaller in this scenario.

In every other way, PNG is the natural heir to GIF. It's copyright-free, it can store all bit depths, it can represent alpha channels, and it offers more efficient compression. But as great as PNG is, there are a few things you should know about PNG to get the most out of it.
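As a rough rule of thumb in code, you can even automate the JPEG-versus-PNG call by checking whether a handful of colors dominate the image. This is only a heuristic sketch: it assumes the Pillow imaging library is installed, the thresholds are arbitrary choices of mine, and the file names are made up:

    from collections import Counter
    from PIL import Image

    def suggest_format(path, top_colors=64, coverage=0.5):
        # Suggest PNG when a few colors dominate (flat graphics, screenshots),
        # and JPEG when color is spread thin (continuous-tone photos).
        img = Image.open(path).convert("RGB")
        pixels = list(img.getdata())
        counts = Counter(pixels)
        dominant = sum(count for _, count in counts.most_common(top_colors))
        return "PNG" if dominant / len(pixels) >= coverage else "JPEG"

    print(suggest_format("screenshot.png"))        # large flat areas -> PNG
    print(suggest_format("vacation-photo.jpg"))    # continuous tone  -> JPEG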

Let's start with a representative image. I took a quick screenshot of this website, along with all the browser chrome, transparency, and shadows. ClearType font rendering is on, and there's a nice mix of text, graphics, and UI. It's a perfect candidate for the lossless PNG file format, because there are large areas of the same colors and hard transitions between them. We want nice, crisp transitions between the white and dark areas of the screenshot.

small screenshot of Coding Horror in IE7

The actual size of the screenshot is 1233 x 946. When I save this file directly from Paint Shop Pro as a 24-bit PNG file, I get the following file sizes:

PNG, interlaced        288 KB
PNG, non-interlaced    212 KB

So here's our first lesson: never save interlaced PNG files.

  • Interlaced PNGs are 35% larger for the single purpose of progressive rendering.
  • Progressive rendering is confusing; the image gets less and less blurry over time. As Philip Greenspun so aptly pointed out, readers can't tell when an image is "done".
  • Standard PNGs have a perfectly acceptable progressive rendering solution without interlacing. They render in obvious and simple fashion, from top to bottom.

212 KB is an impressively small filesize for such a large and detailed image. It's a testament to the efficiency of the PNG image format. But we can do better. If we run Ken Silverman's PNGOUT* on the files, we can crunch them down even smaller:

PNG, interlaced        190 KB
PNG, non-interlaced    190 KB

First, note that PNGOUT strips out any interlacing. If you have interlaced PNG images, you can expect a very substantial reduction in file size. But even without interlacing, PNGOUT reduces the file size by 22 KB, or nearly 10 percent. I know it doesn't sound like much, but PNG is by definition lossless compression. JPEG is lossy, so as file sizes decrease, more and more of the image is lost. With PNG, we haven't lost any detail in our images, we've just made them smaller. Folks, this is free bandwidth! It doesn't get much better than that.

To see how effective PNGOUT really is, I ran it on a subset of my /images folder. The trick here is that these images are already optimized; I run OptiPNG on every file in this folder periodically.

                   OptiPNG    PNGOUT
267 PNG files      4.40 MB    4.04 MB

It took a while to run, but we get a further 9% reduction in PNG image size beyond what OptiPNG could do. How is this possible?

I thought the name Ken Silverman sounded familiar. Ken, the author of PNGOUT, is the wunderkind behind the Build rendering engine used in the original Duke Nukem 3D, which he wrote at the age of 18.

Duke Nukem: always bet on Duke

Ken is so good, even John Carmack – the author of Doom and Quake, who is widely regarded as a programming god – respects him. No wonder his little PNG optimizer decimates all the other ones. Always bet on Duke.

If you're running a website of any size, and you use PNG images in any quantity, you should run them through PNGOUT to reduce their size. PNGOUT can also convert your existing GIF images to the superior PNG format along the way. And it's so easy to do; here's the Windows command prompt syntax to optimize all PNG images in a folder:

for %i in (*.png) do pngout "%i" /y

The PNGOUT optimization process isn't particularly speedy, but it hardly matters. This one-time optimization could reduce your image bandwidth usage from 10 to 30 percent. That's an offer I can't refuse.
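If you'd rather script the whole sweep and see exactly how much you saved, here's a small Python sketch that shells out to PNGOUT. It assumes pngout is on your PATH and passes the same /y overwrite switch as the one-liner above; the "images" folder name is just an example:

    import subprocess
    from pathlib import Path

    def optimize_folder(folder):
        # Run PNGOUT over every .png in a folder and report the total savings.
        before = after = 0
        for png in Path(folder).glob("*.png"):
            before += png.stat().st_size
            # /y answers "yes" to the overwrite prompt, mirroring the one-liner above.
            subprocess.run(["pngout", str(png), "/y"], check=False)
            after += png.stat().st_size
        saved = before - after
        print(f"{before:,} bytes -> {after:,} bytes ({saved / max(before, 1):.1%} saved)")

    optimize_folder("images")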

(* thanks to Kevin Breitenstein for pointing this out to me)
