Here’s a fun “riddle me this, Batman” scenario: Why does it take, on average, less time to get a burger/hotdog/chicken sandwich from the school cafeteria’s in-house grill during peak hours than during off-hours, when there are easily 5–10 times more people waiting?
Consider the Riddler defeated if you answered “Because there are more people working the grill.” Consider yourself a computer scientist if you aren’t put off by the apparently blatant common sense behind this answer. In case you haven’t already noticed, the explanation behind many “common sense” premises turns out to be computational sense, and this scenario is no exception.
It turns out that the seemingly obvious notion that more workers means more work getting done has a few gotchas. “More workers” in computer science is typically associated with parallelism, and one of the biggest gotchas in parallelism is Amdahl’s law, named after the computer architect Gene Amdahl.
Simply put, Amdahl’s law states that the speedup gained from increased parallelism (i.e., adding more workers) is limited by the amount of work that can actually be parallelized. For example, two people at the french-fry station won’t churn out french fries twice as quickly as one, since that vat of oil doesn’t work any faster when two people are watching it. Instead, what you want is to increase the number of french-fry vats, and you get the near-ironic result that a single person suddenly churns out french fries twice as quickly.
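Amdahl’s law can be written as a one-line formula: if a fraction p of the work can be parallelized across n workers, the overall speedup is 1 / ((1 − p) + p / n). Here’s a minimal sketch in Python (the function name and the grill-flavored numbers are ours, purely for illustration):

```python
def amdahl_speedup(p, n):
    """Overall speedup when a fraction p of the work is parallelizable
    and n workers share that parallel fraction (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# Suppose only half the grill work can be parallelized (p = 0.5).
# Two workers don't come close to doubling the output...
print(amdahl_speedup(0.5, 2))     # ~1.33x

# ...and even a thousand workers can't beat the 1 / (1 - p) = 2x ceiling.
print(amdahl_speedup(0.5, 1000))  # just under 2x
```

Note how the serial fraction (1 − p) dominates: no matter how large n grows, the speedup never exceeds 1 / (1 − p), which is exactly why adding grill workers, rather than grill equipment, hits a wall.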
Note how Amdahl’s law doesn’t necessarily rain on the parallelism parade — it just cautions us to make good decisions regarding when parallelism pays off. And in the case of a fixed cafeteria grill setup, the “parallelism” that brings about genuine speedup lies in additional equipment, not additional workers.
So if parallelism isn’t truly the answer behind our grill’s peak-hour productivity, then what is? Another possible answer also comes from computer architecture’s bag of tricks: pipelines.
In a computing pipeline, a particular task is broken up into distinct stages. If each stage can be performed by a different worker, then multiple tasks of the same type can overlap, meaning that such tasks finish at a faster perceived rate than if the work were performed in strict sequence.
When you think about it, a grill’s tasks, which include preparing burgers, hot dogs, chicken strips, french fries, and the like, can fit the pipeline model. A burger, for example, requires the following stages:
- Cook the meat
- Assemble the sandwich (bun, fixings, etc.)
- Wrap the sandwich
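The payoff of those stages can be shown with a little arithmetic. The stage times below are made up for illustration, as are the function names; the model simply compares one worker doing everything in sequence against one worker per stage with burgers overlapping:

```python
# Hypothetical per-burger stage times in minutes (made-up numbers).
STAGES = {"cook": 3.0, "assemble": 1.0, "wrap": 0.5}

def sequential_time(n):
    """One worker performs every stage of every burger in strict sequence."""
    return n * sum(STAGES.values())

def pipelined_time(n):
    """One worker per stage; burgers overlap. Once the first burger has
    filled the pipeline, a new burger emerges every 'slowest stage'."""
    slowest = max(STAGES.values())
    return sum(STAGES.values()) + (n - 1) * slowest

print(sequential_time(10))  # 45.0 minutes for 10 burgers
print(pipelined_time(10))   # 31.5 minutes for the same 10 burgers
```

Notice that each individual burger still takes 4.5 minutes of work either way; the pipeline only changes how quickly burgers come off the line, and its rate is bounded by the slowest stage (here, cooking).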
With a different worker at each stage, three burgers can be in flight at once: while one burger’s meat cooks, a second is being assembled and a third is being wrapped, so burgers come off the line at a faster perceived rate. Note how we said perceived rate. In the end, it still takes the same amount of time to make a single hamburger, pipeline or not. The trick lies in overlapping the work that’s being done on multiple hamburgers (or other grill products). That leaves one last issue to address, if you’re a stickler for details: why did the riddle say that it takes less time to get a sandwich at peak hours, when pipelining doesn’t really change the amount of time it takes to produce an individual sandwich?
Again, the answer may reek of “plain old” common sense — “less time” in this case is less time as perceived by the person waiting for the food. With more people behind the grill, and thus more overlapping stages in the pipeline, the time from order to food becomes less and less dependent on the number of orders that came before me. The “time-to-food” includes less waiting time: having a well-populated pipeline makes it seem more and more like the folks at the grill have only me as their customer.
One last real-world detail from this analysis: you might then think that a totally empty grill — i.e., a grill where I’m the only customer — will produce a sandwich just as quickly as the pipelined, peak-hour grill. While this might be true in general, note how, for the specific case of a cafeteria grill, you might still win out at peak hours. This is because, with a single person at the grill performing all of the tasks involved with making a sandwich, you may incur overhead as that person moves from one stage to another. In a pipeline, workers can churn right along on their single task, without changing places or switching equipment. So a peak-hour cafeteria grill with more workers may still get you your food faster than an off-hour grill with a single worker.
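This switching overhead is easy to model, too. The numbers below are made up, and the functions are just our sketch: a lone worker pays a small penalty each time they change stages (washing hands, moving stations, swapping utensils), while a staffed pipeline has a worker already in place at every stage:

```python
# Hypothetical stage times in minutes: cook, assemble, wrap (made-up).
STAGE_TIMES = [3.0, 1.0, 0.5]
SWITCH_OVERHEAD = 0.5  # paid per stage change, single worker only

def single_worker_one_burger():
    """A lone worker does all stages, switching between them."""
    switches = len(STAGE_TIMES) - 1  # two stage changes for three stages
    return sum(STAGE_TIMES) + SWITCH_OVERHEAD * switches

def pipelined_one_burger():
    """One worker per stage, each already at their station: no switching."""
    return sum(STAGE_TIMES)

print(single_worker_one_burger())  # 5.5 minutes
print(pipelined_one_burger())      # 4.5 minutes
```

Even with zero customers ahead of you, the single-worker burger takes longer here, which is the real-world wrinkle that lets a busy, fully staffed pipeline beat an empty grill.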
Of course, in the realm of computer architecture, we do have the advantage that additional workers don’t cost an additional chunk of salary, with benefits. This whole “employee cost” mentality is, of course, the primary reason why many operations don’t happily hire more workers even though they know that their customers would be happier and get served faster. But somewhere along the way, there must be an inflection point: at a certain level of excellent service, wouldn’t more customers eventually show up, perhaps offsetting the cost of the additional workers? Certainly — but that entails a completely different kind of “computational sense.” :)