Friday, October 12, 2007

Keep Your Friends Close, Your Enemies Closer, and Your Work Closest of All

As coffee and tea were being proffered after dinner at a friend’s house one night, I noticed that our hosts had loaded the dishwasher and started it up. It was a gathering of good friends, and so the dishes were done well before the group was ready to call it a night.

This friend has a large, spacious kitchen, with storage galore and an attractive island in the middle. Said island also houses the dishwasher, which our host then approached upon realizing that the dishes were done. He opened the dishwasher, and started to take items out a few at a time, moving around the kitchen as he put them in their respective locations.

What’s wrong with this picture? Nothing, really, except that my friend was taking more time to put away the dishes than was really necessary. I suggested an alternative: instead of taking random groups of items from the dishwasher, then placing them in the appropriate locations in the kitchen immediately, why not take everything out of the dishwasher first, and then, when everything was laid out on that generous island, sort them out by where they belong and take them in these clusters to their designated places?

This tweak in dishwasher unloading sped things up noticeably. Instead of repeatedly crouching over the dishwasher, pulling things out, walking around the kitchen, and putting them where they belong, all of the crouching and pulling was done at the beginning. Walking clusters of items from the island to their designated cabinet, drawer, or cupboard resulted in fewer trips to and from the kitchen’s center island.

What happened here? Some would say common sense; in this case, however, there is also computer science beneath the common sense. By doing all of the crouching-pulling-out first, and then doing all of the walking-to-cupboard activities in clusters next, we were taking advantage of locality of reference.

Locality of reference is a fairly simple premise: individual tasks or units of work tend to cluster around a certain group of items. We stay “within” this group of items for as long as we’re performing that particular task. When we move on to something else, we shift to a different group of items.

The upshot of this principle is that, if we identify the items that we need in order to perform a particular task, we can perform that task more quickly if we keep those items close by for the duration of that task. Here’s where computer science shoots back to common sense: for as long as you need something, keep it close by; the closer something is to you, the more quickly you can get to it, and so doing any work that involves that item proceeds more quickly.

Unloading a kitchen item from a dishwasher can be broken down into:
  1. Crouching over the open dishwasher
  2. Taking the item out of the dishwasher
  3. Transporting the item to its proper location in the kitchen
  4. Placing the item in that location
At first, our dinner host was performing each of these steps for random batches of items — whatever he could carry — from the dishwasher. Note how he wasn’t taking advantage of locality of reference: he had to shuttle between the dishwasher and multiple locations in the kitchen for every batch of items, thus incurring travel time that made the overall task take longer. The “fix” that was suggested took into account the fact that multiple items typically belonged to a particular area of the kitchen. Multiple trips for individual items became a single trip for a group of items: the “referenced” items were kept together (“local”) as much as possible, both when taking them out of the dishwasher (one “locality”) and when putting them away in their designated kitchen location (the other “locality”).
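The difference between the two strategies can be sketched in a few lines of code. This is a toy model, not anything our host actually ran: the item names, destinations, and batch size are all made up for illustration, and a “trip” is simply a visit to one destination.

```python
from collections import defaultdict

# Hypothetical dishwasher contents: each item mapped to where it belongs.
items = {
    "mug": "cupboard", "plate": "cabinet", "fork": "drawer",
    "glass": "cupboard", "bowl": "cabinet", "spoon": "drawer",
    "cup": "cupboard", "knife": "drawer", "saucer": "cabinet",
}

def trips_per_batch(items, batch_size=3):
    """Original approach: carry whatever fits in two hands, visiting
    each destination in the batch before returning for more."""
    names = list(items)
    trips = 0
    for i in range(0, len(names), batch_size):
        batch = names[i:i + batch_size]
        # One trip per distinct destination in this (arbitrary) batch.
        trips += len({items[name] for name in batch})
    return trips

def trips_after_sorting(items):
    """Suggested approach: unload everything onto the island first,
    group by destination, then make one trip per destination."""
    clusters = defaultdict(list)
    for name, place in items.items():
        clusters[place].append(name)
    return len(clusters)  # one trip per cluster

print(trips_per_batch(items))      # 9: each arbitrary batch spans all 3 destinations
print(trips_after_sorting(items))  # 3: one trip per destination
```

With these particular (worst-case) batches, sorting first cuts nine trips down to three — the same ratio of savings our host saw at the island.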

Locality of reference plays a significant role in many areas of computer science. For instance, it is the foundational principle behind caches in computer architecture — small blocks of fast memory which hold the most-referenced data at any given time. Faster memory access for the data we need most frequently translates to faster performance.
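A tiny simulation makes the cache connection concrete. The sketch below models a hypothetical direct-mapped cache (the sizes are invented for illustration) and compares the hit rate of a sequential scan, which has good spatial locality, against scattered random accesses, which have almost none.

```python
import random

def hit_rate(addresses, cache_lines=16, line_size=8):
    """Simulate a tiny direct-mapped cache: each address maps to exactly
    one line; a hit means that line already holds the address's block."""
    cache = [None] * cache_lines
    hits = 0
    for addr in addresses:
        block = addr // line_size      # which memory block this address is in
        line = block % cache_lines     # which cache line that block maps to
        if cache[line] == block:
            hits += 1
        else:
            cache[line] = block        # miss: fetch the block into the cache
    return hits / len(addresses)

sequential = list(range(1024))         # good spatial locality
random.seed(0)
scattered = [random.randrange(1 << 20) for _ in range(1024)]  # poor locality

print(hit_rate(sequential))  # 0.875: only the first access to each block misses
print(hit_rate(scattered))   # near zero: almost every access misses
```

The sequential scan hits on seven of every eight accesses because each fetched block serves the next seven addresses — exactly the payoff the cache was designed around.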

In operating systems, virtual memory allows programs to work as if they have more memory available to them than actually exists on the physical machine. Due to locality of reference, a program is very unlikely to need absolutely all of that memory at any given time; instead, it moves from one section of its memory space to another as it does its work. The net result is that an operating system can temporarily shunt the (currently) unused sections of memory to disk, automatically moving them back when the program goes into a new unit of work (thus needing a new memory “locale”).
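The same effect can be demonstrated with a toy paging model. The sketch below is an illustrative simulation, not how any particular OS is implemented: it counts page faults under least-recently-used (LRU) replacement for a program that works in phases (good locality) versus the same references interleaved (poor locality).

```python
from collections import OrderedDict

def page_faults(page_refs, frames=4):
    """Count page faults under LRU replacement with a fixed number
    of physical frames (a toy model of OS demand paging)."""
    resident = OrderedDict()   # pages currently in physical memory
    faults = 0
    for page in page_refs:
        if page in resident:
            resident.move_to_end(page)        # refresh recency on a hit
        else:
            faults += 1
            if len(resident) >= frames:
                resident.popitem(last=False)  # evict least recently used
            resident[page] = None
    return faults

# Good locality: the program works within one small set of pages,
# then moves on to the next (distinct phases of work).
local = [p for phase in ([0, 1, 2] * 20, [3, 4, 5] * 20) for p in phase]
# The same references interleaved: no phase structure, poor locality.
interleaved = [p for pair in zip(local[:60], local[60:]) for p in pair]

print(page_faults(local))        # 6: each page faults only on first touch
print(page_faults(interleaved))  # 120: the working set no longer fits, so every access faults
```

Both traces touch the same six pages the same number of times; only the ordering differs. With phases, each page faults once and then stays resident; interleaved, the working set exceeds the four frames and the program thrashes.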

But the bottom line is that locality of reference may be found in any task that we undertake, regardless of whether that task involves a computer. If we are on the lookout for the “localities” within that task, and find ways to restructure what we do to stay within those localities as much as possible, then we may be able to get things done just a little (or a lot) more quickly.

Of course, this whole discussion would never have come up had the guests also helped by lending more hands to put the dishes away...but that has to do with parallelism. That can show up in daily life too, but we’ll get to that some other time.
