18.19, Friday 14 Feb 2003

The computational cost of any activity can be measured in instructions processed. In the early days, when the model was central mainframes and dumb terminals, whatever you did had a very low local cost and a high remote cost (but not very remote, because the mainframe wouldn't be very far away). Next, microcomputers: computation on the desktop. Whatever you do is processed locally. Now, when I hit a website, there's a certain cost locally, but there's also computation at every switch and router and on the webserver itself. (I'm only talking about direct-cause computation here, so the instructions processed to create the website in the first place, or even the chips themselves, don't count.)
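To make that accounting concrete, here's a toy tally (in Python) of what the direct-cause cost of a single page load might look like. Every figure is invented for illustration, not measured; the point is only the shape of the sum: local work, plus work at each hop along the way, plus work on the server.

    # Hypothetical instruction counts for one page load. All numbers
    # are placeholders, not measurements.
    local_machine = 50_000_000   # assumed: browser, TCP/IP stack, rendering
    per_hop       = 200_000      # assumed: forwarding work at one switch/router
    webserver     = 20_000_000   # assumed: request handling, page generation

    hops = 15  # assumed number of network hops between me and the server

    # Direct-cause cost: only the instructions this request triggers,
    # not the ones that built the site or the chips.
    total = local_machine + hops * per_hop + webserver

    print(f"Direct-cause instructions for one page load: ~{total:,}")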

I'd like to see a graph of computation consumed per person per second: how that's changed over the years, and where that computation occurs (on an axis from local to very remote). (And incidentally, the metaphors are hard here. Is computation consumed? Spent? If it's not used, you can't save it up. Maybe a better graph would be the percentage of the total computational capacity of the globe. I wonder what percentage of the maximum one consumes in reading this sentence?)
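A back-of-envelope version of that last question, with wholly made-up numbers standing in for figures nobody actually has. What matters is how the ratio is built, not the values.

    # Both figures below are assumptions for the sake of illustration.
    global_capacity_ips = 1e17  # assumed: total instructions per second, worldwide
    my_reading_ips      = 5e7   # assumed: instructions my reading this sentence causes, per second

    # Fraction of the globe's computational capacity consumed by one reader.
    fraction = my_reading_ips / global_capacity_ips

    print(f"Share of global capacity: {fraction:.2e} ({fraction * 100:.10f}%)")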