Coding Horror

programming and human factors

Anisotropic Filtering

I've talked about Bilinear vs. Bicubic filtering before in the context of 2D images, but bilinear filtering is a key ingredient in 3D graphics, too. When a texture is applied to a polygon, the texture may be scaled up or down to fit, depending on your screen resolution. This is done via bilinear filtering.
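
In case you've forgotten how bilinear filtering works, here's a bare-bones sketch of the idea-- my own single-channel illustration, not how any real GPU implements it: sample the four nearest texels and blend them by their fractional distances.

  // Bilinear filtering in a nutshell: grab the four texels surrounding the
  // sample point and weight them by how close the point is to each.
  // (Grayscale only, purely for illustration.)
  function bilinearSample(texels, width, height, u, v) {
    var x = u * (width - 1);                  // u, v are in [0, 1]
    var y = v * (height - 1);
    var x0 = Math.floor(x), y0 = Math.floor(y);
    var x1 = Math.min(x0 + 1, width - 1);
    var y1 = Math.min(y0 + 1, height - 1);
    var fx = x - x0, fy = y - y0;
    var top    = texels[y0 * width + x0] * (1 - fx) + texels[y0 * width + x1] * fx;
    var bottom = texels[y1 * width + x0] * (1 - fx) + texels[y1 * width + x1] * fx;
    return top * (1 - fy) + bottom * fy;
  }

Sampling dead center of a 2x2 checkerboard-- bilinearSample([0, 255, 255, 0], 2, 2, 0.5, 0.5)-- returns 127.5, the average of all four texels, which is exactly the smoothing you see when a texture is scaled.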

A full discussion of 3D graphics is way outside the scope of this post-- plus I don't want to bore you to death with concepts like trilinear filtering and mip-mapping. But I do want to highlight one particular peculiarity of bitmap scaling in 3D graphics. As you rotate a texture-mapped polygon away from the viewer, simple bilinear filtering and mip-mapping cause the texture to lose detail as the angle increases:

Texture without anisotropic filtering

Now, some detail loss with distance is intentional. That's essentially what mip-mapping is. If we didn't mip-map into the distance, the image would look extremely noisy:

Comparison images: texture without mip-mapping vs. texture with mip-mapping

The problem with simple mip-mapping and bilinear filtering is that they're too simple. Much more detail should be retained into the distance. And that's what anisotropic filtering does:

Texture with anisotropic filtering

Because you're typically viewing most of the polygons in the world at an angle at any given time, anisotropic filtering has a profound impact on image quality. Here are some screenshots I took from the PC game FlatOut which illustrate the dramatic difference between standard filtering and anisotropic filtering:

Comparison screenshots: FlatOut detail crops with standard filtering vs. 16x anisotropic filtering

These are detail elements cropped from the full-size 1024x768 screenshots: standard, anisotropic.

Proper anisotropic filtering is computationally expensive, even on dedicated 3D hardware. And the performance penalty increases with resolution. ATI was the first 3D hardware vendor to introduce some anisotropic filtering optimizations-- some would say shortcuts-- in their cards, which allowed much higher performance. There is one small caveat, however: at some angles, textures don't get fully filtered. ATI effectively optimized for the common angles you'd see in 3D level geometry (floor, walls, ceiling) at the cost of the others.

For better or worse, these optimizations are now relatively standard even on nVidia cards. I think it's a reasonable tradeoff for the increased image quality and performance.

In my opinion, anisotropic filtering is the most important single image quality setting available on today's 3D hardware. It's like Freedom Rock: make sure you've turned it up, man!
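
If you're wondering what "turning it up" looks like from application code, here's a minimal WebGL sketch-- purely illustrative, not anything FlatOut or the display drivers actually do-- that sets up mip-mapped trilinear filtering and then asks for as much anisotropy as the hardware will give:

  // Minimal WebGL sketch (illustrative only): mip-mapped trilinear filtering
  // plus anisotropic filtering cranked to the hardware maximum.
  var canvas = document.createElement('canvas');
  var gl = canvas.getContext('webgl');

  var texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);

  // Placeholder 2x2 texture; a real app would upload an image here.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 2, 2, 0, gl.RGBA, gl.UNSIGNED_BYTE,
                new Uint8Array(2 * 2 * 4));

  // Mip-mapping with trilinear filtering: detail intentionally falls off
  // with distance, which is the "too simple" baseline described above.
  gl.generateMipmap(gl.TEXTURE_2D);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);

  // Anisotropic filtering is exposed as an extension; "turn it up" to 16x,
  // or whatever the card's maximum happens to be.
  var aniso = gl.getExtension('EXT_texture_filter_anisotropic');
  if (aniso) {
    var max = gl.getParameter(aniso.MAX_TEXTURE_MAX_ANISOTROPY_EXT);
    gl.texParameterf(gl.TEXTURE_2D, aniso.TEXTURE_MAX_ANISOTROPY_EXT, Math.min(16, max));
  }

Games typically expose the same setting through the driver control panel or an in-game options menu; whichever way you get there, crank it up.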


The Zen of Mustard and Pickles

A co-worker and I went over to Scott's house today at around 1pm PST to pick something up for work. Scott just got a new television, so he demoed it for us, flipping through the channels, comparing HD signals to regular signals and so forth. As we were doing this we happened across this exact episode of the Maury Povich show featured on Boing Boing:

The Maury Povich Show: My Fear of Mustard and Pickles is Ruining my Life

Quite a Zen moment. It could be worse. Your fear of mustard and pickles could be ruining your life.


The Impossibly Small PC: Nano-ITX

VIA's Nano-ITX fits an entire PC motherboard into an impossibly small 12 by 12 centimeter format:

VIA Nano-ITX motherboard

This board has been gestating for a while at VIA; according to the nanoitx blog, that's evidently due to interference and heat problems presented by the extremely small form factor. But a retail board was sighted in Japan's Akiba district in late November, so it can't be too far off now.

Format         Size (cm)
Nano-ITX       12 x 12
Mini-ITX       17 x 17
Micro-ATX      24.4 x 24.4
Flex-ATX       22.9 x 19.1
Standard ATX   30.5 x 24.4

PC motherboard form factor comparison

If you've ever purchased a motherboard, chances are it was standard ATX size. The ITX specifications are essentially a VIA creation, but detailed information on the other form factors is available at formfactors.org.

The Nano-ITX will be available in two versions: a fanless 800 MHz model for ~$300, and a 1 GHz version that requires a small fan for ~$350. You'd need a few more pieces to make a complete mini-PC, though (roughly totaled up below):

  • 40 GB 2.5" IDE drive, $65
  • 512 MB SODIMM, $45
  • Slim DVD drive, $45
  • Nano-ITX case and power supply, ~$100 (?)
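
Back-of-the-envelope, the 1 GHz version of that parts list adds up like so-- keeping in mind the case and power supply price is my guess:

  // Rough build total for a 1 GHz Nano-ITX mini-PC, using the estimates above.
  // The case/power supply figure is a guess, so treat the sum as a ballpark.
  var parts = {
    'Nano-ITX board, 1 GHz': 350,
    '40 GB 2.5" IDE drive': 65,
    '512 MB SODIMM': 45,
    'Slim DVD drive': 45,
    'Case and power supply': 100
  };
  var total = 0;
  for (var name in parts) {
    total += parts[name];
  }
  console.log('Estimated total: $' + total);  // roughly $605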

It'd be easy to build a PC smaller than a Mac Mini using a Nano-ITX board. Just don't expect a lot of power; the VIA CPUs aren't exactly barn-burners. The VIA boards are fine for basic web browsing and productivity tasks with undemanding users, but if you need more than that, you'll want a Pentium M Mini-ITX motherboard.


Getting Back to Web Basics

Every few years, Jakob Nielsen takes websites to task with a Top Ten Web Design Mistakes article. Although things have clearly improved since the original 1996 list, I'm particularly concerned that in the competitive frenzy to get all JavaScripted up for Web 2.0, we may be defeating the very simplicity that made the web so popular. Nielsen shares this concern:

This year's list of top problems clearly proves the need to get back to Web design basics. There's much talk about new fancy "Web 2.0" features on the Internet industry's mailing lists and websites, as well as at conferences. But users don't care about technology and don't especially want new features. They just want quality improvements in the basics:
  • text they can read;
  • content that answers their questions;
  • navigation and search that help them find what they want;
  • short and simple forms (streamlined registration, checkout, and other workflow); and
  • no bugs, typos, or corrupted data; no linkrot; no outdated content.

Anytime you feel tempted to add a new feature or advanced technology to your site, first consider whether you would get a higher ROI by spending the resources on polishing the quality of what you already have. Most companies, e-commerce sites, government agencies, and non-profit organizations would contribute more to their website's business goals with better headlines than with any new technology (aside from a better search engine, of course).

Of course, Web 2.0 isn't just JavaScript. But according to Paul Graham, JavaScript is one of the three key characteristics that define Web 2.0:

One ingredient of its meaning is certainly Ajax, which I can still only just bear to use without scare quotes. Basically, what "Ajax" means is "Javascript now works." And that in turn means that web-based applications can now be made to work much more like desktop ones.

In fact the new generation of software is being written way too fast for Microsoft even to channel it, let alone write their own in house. Their only hope now is to buy all the best Ajax startups before Google does. And even that's going to be hard, because Google has as big a head start in buying microstartups as it did in search a few years ago. After all, Google Maps, the canonical Ajax application, was the result of a startup they bought.

I've visited quite a few Ajax sites that committed the cardinal sin of the web: they broke the back button. Nothing demonstrates an utter disregard for the user quite like breaking the back button. Going "back" is the second most common user activity after clicking a hyperlink. Didn't we learn our lesson with <frame>? Frame-based layouts are so widely reviled for breaking the address bar and the back button that they've been banished to freak-in-a-sideshow status. And yet when an Ajax app breaks the back button, it's suddenly no big deal-- just an acceptable side effect of all that cool client-side processing?

Well, it is a big deal, and it isn't acceptable. I hate to single out everyone's favorite whipping boy, but it's the most recent example:

  1. Visit http://www.live.com
  2. Click the Add Content link
  3. Search for anything; I used "news"
  4. Now click the back button

I don't care how many fancy client-side features your site has-- if you break the back button, you broke the internet for your users. I can't emphasize this enough.
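
It doesn't have to be this way. The usual workaround is to push every significant client-side state change into the URL fragment, so the browser's history has something to step back through. Here's a minimal sketch-- hypothetical code of my own, not how live.com works:

  // Store Ajax view state in the URL fragment so each step becomes a real
  // history entry that the back button can return to.
  function loadContent(name) {
    // Hypothetical stand-in for whatever Ajax call fetches and renders the view.
    document.title = 'Viewing: ' + name;
  }

  function showPanel(name) {
    window.location.hash = name;  // creates a history entry
    loadContent(name);
  }

  // Poll the fragment so back/forward navigation re-renders the right view.
  // (Some newer browsers fire an onhashchange event, but polling is the
  // lowest-common-denominator approach.)
  var lastHash = window.location.hash;
  setInterval(function () {
    if (window.location.hash !== lastHash) {
      lastHash = window.location.hash;
      loadContent(lastHash.replace('#', ''));
    }
  }, 250);

It's extra bookkeeping, sure, but it keeps "back" meaning what users expect it to mean.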

A big part of the web's ease of use is basic visibility-- if you can see it, you can click it. Nothing hidden. Nothing up our sleeves. But the minute you throw a drop-down menu on your page, you've broken that contract with the user. That's why drop-down menus don't belong on the web. And yet we can't seem to get away from the damn things.

Amazon, of all places, has an incredibly annoying DHTML menu on their home page. Just mouse over the "See All 32 Product Categories" tab. Is it really necessary for this tab to spawn an aggravating DHTML JavaScript popup, complete with its own click-interrupting animation? It's disconcerting to accidentally mouse over this area and have a popup blasted in your face. Why not just let me click the link and see the categories, like every other web page I've ever visited?

Worse, many sites' implementations of drop-down menus are erratic and inferior to the menus in the operating system. Consider the 3leaf site: why can't I click on the Services menu when all the others are clickable? Why doesn't the pointer change to indicate that I'm clicking on a hyperlink? The future of drop-down menus is uncertain even in Windows; shoehorning a marginal GUI convention onto today's web is asking for trouble. Better to avoid these problems altogether by ditching drop-down menus entirely.
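
If you absolutely must collapse a long list of links, the gentler approach is a plain hyperlink that still goes somewhere, with click-to-expand layered on top. A rough sketch, assuming hypothetical markup-- an anchor with id "categories-link" pointing at a real categories page, and a list with id "categories-list":

  // Progressive enhancement: the link works as an ordinary hyperlink without
  // JavaScript; with JavaScript, a click toggles the list in place instead.
  // No hover-triggered popups, and nothing breaks if the script never runs.
  var link = document.getElementById('categories-link');
  var list = document.getElementById('categories-list');
  if (link && list) {
    list.style.display = 'none';
    link.onclick = function () {
      list.style.display = (list.style.display === 'none') ? '' : 'none';
      return false;  // cancel navigation only when the inline toggle is available
    };
  }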

Shaun Inman's site, which was inexplicably nominated for a best-designed bloggie award, is truly painful to visit. It reads like a laundry list of Nielsen's complaints:

  • The font is tiny by default, on both IE and Firefox
  • Visited links are shown in unreadable strikeout font
  • Crazy, confusing top and bottom navigation slide-out panels

Shaun Inman seems to be a well-regarded web designer-- shouldn't he know better? Or is it the Web 2.0 kool-aid? If that's what it takes to get a bloggie design award, I'm sure hoping this guy starts a blog, because he's a shoo-in for 2006.

I don't want to turn this into a rant session, so I'll stop here. Clever JavaScript on your web page does not exempt you from good web design. Instead of spending all this time exerting maximum cleverness to transcend the weaknesses of the web medium, it might be a better idea to play to the web's strengths-- such as speeding up how fast your pages load, or avoiding recent ill-advised design trends. And whatever you do, don't break the freaking back button.


I Heart Cheatsheets

I'm a huge fan of Beagle Brothers style cheat sheets, because nothing promotes the illusion of mastery like a densely packed chart of obscure reference information:

Beagle Brothers reference charts: Apple II colors and ASCII values; Peeks, Pokes, and Pointers; Tips, Tricks, and Techniques; 6502 Instructions

Just throw some of those babies up on your walls and people will know that they're clearly dealing with a coding genius!

VisiBone makes great modern equivalents of these classic references, and they're available in dense card, dense foldout, large print book, and huge wall chart formats. The VisiBone regular expression reference is the best concise regex reference I've found to date -- and there's a free online version, too. It's limited to JavaScript regex syntax, but that's 98 percent of what most people will need.
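
For a taste of what that JavaScript syntax covers, here's a throwaway example of my own-- not something lifted from the VisiBone card:

  // Anchors, character classes, quantifiers, and capturing groups -- the
  // staples any concise regex reference needs to cover.
  var logLine = 'GET /index.html 200';
  var match = logLine.match(/^(\w+)\s+(\S+)\s+(\d{3})$/);
  if (match) {
    console.log(match[1]);  // "GET"
    console.log(match[2]);  // "/index.html"
    console.log(match[3]);  // "200"
  }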

I've seen the CSS Cheat Sheet before, but I didn't realize there's an entire set of cheatsheets freely available from the same site:

These are all conveniently provided in PDF or PNG formats. If you're a Microsoft .NET programmer, there's no shortage of cheatsheets for you as well:

If you don't see the one you want, here are a few other sites that aggregate cheatsheets. Many of them are open-source / UNIX oriented, but they run the gamut.

Know of any other good cheatsheets not listed here? Share in the comments!
