Coding Horror

programming and human factors

Visual Studio .NET 2003 and 2005 Keyboard Shortcuts

I've been trying to improve my use of keyboard shortcuts in Visual Studio .NET. Here are the ones I use most often, what I consider my "core" keyboard shortcuts:

Go to declaration -- F12
Debug: step over -- F10
Debug: run to cursor -- ctrl + F10
Debug: step into -- F11
Debug: step out -- shift + F11
Toggle a breakpoint -- F9
Go to next item in task list or search results -- F8
Go to previous item in task list or search results -- shift + F8
Switch to code view -- F7
Switch to designer view -- shift + F7
Run with debugging -- F5
Stop debugging -- shift + F5
Run without debugging -- ctrl + F5
Move to previous edit point (back) -- ctrl + -
Switch to the Task List -- ctrl + alt + K
Switch to the Immediate window -- ctrl + alt + I
Switch to the Output window -- ctrl + alt + O
Find -- ctrl + F
Find in all files -- ctrl + shift + F
Replace -- ctrl + H
Incremental find (it's pure sex!) -- ctrl + I

For a while I struggled to find a keyboard shortcut that sets the input focus back to the code window after doing a ctrl+alt+O or ctrl+alt+K. As it turns out, that key is

Esc

But enough about me. Which keyboard shortcuts do you use most often?

The best early reference for keyboard shortcuts in VS.NET 2003 was Mastering Visual Studio .NET; Appendix C contains an excellent reference table of all the keyboard shortcuts in .NET.

As good as that reference table is, you can generate a better keyboard reference yourself using my improved keyboard shortcut enumerator macro.*

screenshot of the keyboard shortcut enumerator macro output

It groups the results by scope and sorts by primary keyboard key so related key accelerators are all displayed together (eg, F5, ctrl+F5, shift+F5, etc).
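
The grouping and sorting logic is simple enough to sketch outside the IDE. This isn't the macro itself (which is VB and walks the DTE automation model's command bindings); it's just a minimal Python illustration of the scope-plus-primary-key sort, fed a few hand-picked bindings in the "scope::keys" format Visual Studio reports:

    # Minimal sketch of the macro's grouping idea -- not the actual VB macro.
    # Bindings use the "scope::keys" format Visual Studio reports, e.g. "Global::Shift+F5".
    from collections import defaultdict

    bindings = [
        ("Debug.Start", "Global::F5"),
        ("Debug.StopDebugging", "Global::Shift+F5"),
        ("Debug.StartWithoutDebugging", "Global::Ctrl+F5"),
        ("Edit.IncrementalSearch", "Text Editor::Ctrl+I"),
    ]

    def primary_key(keys):
        # The last token of the first chord: "Ctrl+F5" -> "F5".
        return keys.split(",")[0].split("+")[-1].strip()

    grouped = defaultdict(list)
    for command, binding in bindings:
        scope, keys = binding.split("::", 1)
        grouped[scope].append((primary_key(keys), keys, command))

    for scope in sorted(grouped):
        print(scope)
        # Sorting on the primary key puts F5, ctrl+F5 and shift+F5 next to each other.
        for _, keys, command in sorted(grouped[scope]):
            print("  %-20s %s" % (keys, command))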

The macro works in VS.NET 2003 and VS 2005, and unless you are a total Visual Studio Ninja™, I guarantee you'll find at least a few keyboard shortcuts in there that you didn't know about. I also did a diff on the resulting files to see which keyboard shortcuts have changed in Visual Studio 2005.

* Originally based on an earlier macro I discovered.


Flickr Hacks

There's so much buzz around Flickr right now it's practically deafening. Or maybe I should say blinding, because Flickr is a collaborative photo sharing service. I was perplexed as to why Yet Another Photo Sharing Website was so hot until I started browsing the myriad hacks and tools available for this site. Flickr has a web API, and there's a .NET wrapper around that API available at Flickr.NET. It's truly astonishing: a case study in what having an open API and community-driven content can do for your business. Here are some of the cooler Flickr hacks (warning-- heavy use of Flash ahead).

  • Flickr color picker. Shows all pictures for any given color you click in the color wheel. Quite mesmerizing.
  • Flickr postcard browser. Photos are "tagged" in Flickr with various descriptive words by the users. This is a quick way to browse around a specific tag.
  • Flickr related tag browser. This is like the postcard browser, but it also shows related tags that are frequently associated with whatever tag you're browsing in a ring around the pictures. Fantastic for browsing around and getting a sense of what tags are in use.
  • FlickrGraph. Flickr also contains social networks-- users who mark each other's photos as "favorites". This tool lets you map out the relationships between users in graphical form.
  • Flickr Replacer. A neat bookmarklet that takes any highlighted word on the web page and replaces it with an image representing that word (via tags, of course). Perfect for getting your rebus on.
  • Spell with Flickr. Spells a word of your choice using Flickr images representing each of the letters.

As you can see from the above sampling, Flickr is all about tags. There's a neat page on Flickr that shows the most popular tags at any given moment.
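
If you'd rather poke at the API directly than go through the Flickr.NET wrapper mentioned above, a tag search is a single REST call. Here's a rough Python sketch (flickr.photos.search is the real method name, but the API key below is a placeholder -- you have to register for your own):

    # Rough sketch of a raw Flickr REST call; requires your own API key.
    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    params = urllib.parse.urlencode({
        "method": "flickr.photos.search",   # search photos by tag
        "api_key": "YOUR_API_KEY",          # placeholder -- register for a real key
        "tags": "graffiti",
        "per_page": "10",
    })
    url = "https://www.flickr.com/services/rest/?" + params

    with urllib.request.urlopen(url) as response:
        root = ET.fromstring(response.read())

    # The response wraps <photo> elements carrying id, owner, and title attributes.
    for photo in root.iter("photo"):
        page = "https://www.flickr.com/photos/%s/%s" % (photo.get("owner"), photo.get("id"))
        print(photo.get("title"), page)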

Amateur photographers take far better pictures, on the whole, than I could have ever possibly imagined. After seeing this, who needs professional photographers? Still, there's a big gap between the good and the great pictures. My biggest frustration with Flickr is that there's no rating system for the pictures. You can only browse pictures by user or by keyword. I have a hard time coming up with tag keywords (frogs? dogs? clouds? graffiti?), and I don't really know any Flickr users. I'd rather just subscribe to a feed of highly rated pictures. Sort of an AmIHOTorNOT for photos, but hopefully without the prurience and desperation.

This is, of course, only the tip of the iceberg. There's an exhaustive list of all the Flickr hacks at The Great Flickr Tools Collection.


Compression and Cliffs

I set up a number of Windows XP SP2 Virtual PC base images today. A WinXP SP2 clean install, after visiting Windows Update, is 1.70 gigabytes. Building up a few baseline images like this can chew up a substantial amount of disk space and network bandwidth. So, taking a page from Jon Galloway's book, I decided to see what I'd get if I compressed the virtual hard drive file. My results?

App               Size            Time taken (approx)
WinZip 9.0 SR-1   880 megabytes   3 minutes
7Zip 4.20         739 megabytes   22 minutes

(All apps were used with out-of-the-box defaults.) I do end up with a file that is 16% smaller, but it takes 7.3 times longer. That sure doesn't seem like a very good deal to me. Now, in fairness to Jon, his only goal was to squeeze a largish 10 GB VHD image into a single 4.7 GB DVD-R; compression time wasn't a criterion.

Although this is my first exposure to 7zip, I've run these kinds of comparisons before with ZIP and RAR and reached the same conclusion. There are certainly differences in algorithmic efficiency, but no matter what compression algorithm you choose, beyond a certain compression level performance falls off a cliff. After you hit that point, you'll spend obscene amounts of time for rapidly diminishing benefits. Nowhere is this better illustrated than in Wim Heirman's Practical Compressor Test results:

GIMP source compression results, compression ratio vs. time

Note that the scale on the bottom of the graph is logarithmic. This is the only comparison I could find that properly expresses compression as the trade-off it really is: you can have efficiency, or you can have speed. That's why, except for the truly obsolete algorithms, you see the "diagonal line" effect on this graph: better compression algorithms always take longer. Sometimes a lot longer. If you're holding out for the Hyper Turbo Extreme Compression X algorithm, you may be waiting a while. Consider RAR, which offers the best blend of compression and speed currently available:

Level   Time (secs)   Compression ratio   Time factor   Gain
-m1     5.7           22.1%               1x            -
-m2     28.3          14.5%               5x longer     7.6% smaller
-m3     40.2          13.4%               7x longer     8.7% smaller
-m4     40.2          13.1%               7x longer     9.0% smaller
-m5     46.7          12.5%               8x longer     9.6% smaller

When it takes 5 times longer for barely 8% more compression, you've fallen off a cliff. But it still might be worth the extreme cost, depending on your goals. For most usages, it boils down to these three questions:

  1. How often will your data be compressed?
  2. How many times will it be transferred?
  3. How fast is the network?

Decompression time in this scenario is usually a tiny, relatively constant fraction of the compression time, so it's not a factor. Wim provides a helpful calculator to assist in making this decision:

total time = compression time + n * (compressed file size / network speed + decompression time)

For instance, if you compress a file to send it over a network once, n equals one and compression time will have a big influence. If you want to post a file to be downloaded many times, n is big, so long compression times weigh less in the final decision. Finally, slow networks do best with a slow but efficient algorithm, while fast networks call for a speedy, possibly less efficient one.
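
Plugging some numbers into that formula makes the cliff concrete. Here's a quick Python sketch using the Virtual PC image from above; the 10 megabit network speed, one-minute decompression time, and download counts are made-up illustration values:

    # Wim's total-time formula, with illustrative (made-up) network and decompression numbers.
    def total_time(compress_s, compressed_mb, decompress_s, network_mbps, n):
        transfer_s = compressed_mb * 8 / network_mbps    # megabytes over a megabit/s link
        return compress_s + n * (transfer_s + decompress_s)

    # From the table above -- WinZip: 880 MB in 3 minutes; 7Zip: 739 MB in 22 minutes.
    for n in (1, 50):
        winzip = total_time(180, 880, 60, 10, n)
        sevenzip = total_time(1320, 739, 60, 10, n)
        print("n=%d  winzip: %.0f s  7zip: %.0f s" % (n, winzip, sevenzip))
    # For a single transfer the slower compressor never catches up; by fifty
    # downloads, the smaller file has more than paid back the extra 19 minutes.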

Of course, there are still a few variables Wim's page hasn't considered. Most notably, he only compresses a single file (the GIMP source TAR file), which has two consequences:

  1. Filetype-specific compression can perform far better than the generic compression considered here. Compression tailored to file contents (eg, lossless audio compression) is generally a huge win.
  2. When compressing groups of files, programs that can do "solid" archiving will always outperform those that can't. Solid archiving means that the files are compressed as one giant binary blob and not as a bunch of individual files. This provides a much higher level of overall compression due to data that is (generally) repeated between files; see the sketch after this list.
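
The solid-archiving effect is easy to demonstrate with a toy example. The sketch below just uses zlib on synthetic "files" (made-up content: the same incompressible boilerplate plus a tiny unique footer each); a real solid archiver like RAR or 7-Zip does the same thing on a much larger scale:

    # Toy illustration of solid vs. per-file compression -- not a benchmark.
    import random
    import zlib

    random.seed(0)
    boilerplate = bytes(random.randrange(256) for _ in range(4000))   # incompressible filler
    files = [boilerplate + (b"unique footer for file %d\n" % i) for i in range(10)]

    # Non-solid: each file compressed on its own, the way a classic ZIP does it.
    per_file = sum(len(zlib.compress(f, 9)) for f in files)

    # Solid: all files concatenated into one blob and compressed together, so the
    # boilerplate repeated between files collapses into cheap back-references.
    solid = len(zlib.compress(b"".join(files), 9))

    print("per-file: %d bytes   solid: %d bytes" % (per_file, solid))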

No one set of benchmarks offers a complete picture. Most other compression benchmark pages tend to focus on absolute compression ratios to the detriment of all other variables, which is a little crazy once you've fallen off the cliff. On Wim's page, the slowest three times are 198 (7zip), 47 (rar), and 43 (bzip2) seconds respectively. Some of the more extreme space-optimized compression algorithms can take several hours to compress the same file!


x86 Uber Alles

I guess John Gruber isn't as savvy as he thought he was:

Apple Announces Switch to Intel Chips

After seeing NT slowly shed its MIPS, Alpha, and PPC versions, you have to wonder: will our children be using architectures that emulate some form of x86?

Even with a flash new CPU powering it, Apple's OS X has some performance issues of its own to resolve. A recent AnandTech article by Johan De Gelas documents OS X running Apache and MySQL 5-10 times slower than Linux on the same hardware. This is evidently due to a bizarre "worst of both worlds" kernel architecture in which applications have no access to the kernel-level threads commonly used for performance reasons on Windows and Linux:

The server performance of the Apple platform is, however, catastrophic. When we asked Apple for a reaction, they told us that some database vendors, Sybase and Oracle, have found a way around the threading problems. We'll try Sybase later, but frankly, we are very sceptical. The whole "multi-threaded Mach microkernel trapped inside a monolithic FreeBSD cocoon with several threading wrappers and coarse-grained threading access to the kernel", with a "backwards compatibility" millstone around its neck sounds like a bad fusion recipe for performance.


The Game Controller Family Tree

Remember when anything you could possibly imagine could be controlled with a single stick and a single button? Trace the evolution of human interaction in video gaming from 1980 to the present through this remarkable visual game controller family tree.

video game controller family tree

This tree isn't quite as complete as I would like (where is the Odyssey2, for example?), but it's got the major players. I had half forgotten how weird the controllers with keypads were, and that the Atari Jaguar was the last console to adopt this oddball convention. Input conventions have certainly changed a lot over the last 20 years; the most profound change-- from a single digital axis to dual analog axes-- was prompted by the broad switch from 2D to 3D games around 1997.

If you're feeling particularly retro, you can pick up a modern USB incarnation of the no-frills Atari 2600 joystick that started it all for a mere five bucks.
