Dawn of a New Age

Computing used to be a shared affair: one mainframe, many users. But this was limiting, like cooking Thanksgiving dinner with only one oven. An oven has a fixed number of racks and can cook at only one temperature at a time. Likewise, the mainframe took turns executing instructions for different users, a scheme known as time-sharing. People made do because a mainframe was outrageously expensive. But what if the oven could be shrunk down? Less raw material means it would be cheap to produce, so everyone could afford a miniature oven of their own. And thus engineers set out to create the microwave oven of computers: the PC.

Fast-forward to 2005, and a strange barrier is forming in the chip-making industry. In the 1990s, the easiest way to improve performance was to increase the clock speed, but above roughly 3 gigahertz a processor runs so hot that it needs a substantial cooling solution if it aspires to run for more than a few minutes. Another strategy is to shrink all of the components in order to cram more transistors onto a chip. This has been an extremely successful technique and should progress nicely for at least another decade. Moore's Law predicts that the number of transistors on an equally priced chip will double every two years, but it is unrealistic to think this can continue forever. There comes a point where a component is as small as it can be, or so fragile that even the slightest disturbance could cripple it. When this limit of miniaturization will be reached is open to speculation; by then, an entirely new form of electronics may have come to fruition.
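The doubling in Moore's Law compounds quickly, which is easy to see with a little arithmetic. A minimal sketch, where the 2005 baseline of 230 million transistors is an illustrative assumption, not a figure from this essay:

```python
# Sketch of Moore's Law: transistor count doubling every 2 years.
# The ~230 million baseline for a 2005 chip is an assumed figure
# for illustration only.

def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward, doubling every `doubling_period` years."""
    periods = (year - start_year) / doubling_period
    return start_count * 2 ** periods

baseline = 230_000_000  # assumed 2005-era desktop CPU
for year in (2005, 2015, 2025):
    print(year, f"{transistors(baseline, 2005, year):,.0f}")
```

Five doublings over a decade is a 32-fold increase, which is why even a distant physical limit arrives sooner than intuition suggests.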

So, as you might expect, the two major chip manufacturers, Intel and Advanced Micro Devices, have essentially run into a wall. Their solution is to put multiple processor cores onto one chip, and to design computers that coordinate multiple chips. My photograph is of a dual-core CPU made by Intel. A dual-core chip isn't terribly special in and of itself, but it represents a colossal change in personal computer architecture. Why is this so significant? Because soon no one will be running just a PC; they'll be running a supercomputer. I wish it had been presented to me this way, because I didn't fully appreciate it until I heard about the PlayStation 3. It uses a special chip made by IBM, christened the Cell, with nine processing cores. IBM has been building supercomputers for decades, and now they want you to have a chihuahua-sized one for under a grand.

Remember the users taking turns on the mainframe? Now your applications don't even have to take turns on one processor. It even becomes practical to run multiple operating systems concurrently. The multi-core and multi-chip paradigm postpones the physical limits of chip design, perhaps for a long while yet.
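The difference between taking turns and running side by side can be sketched in a few lines. This is a minimal illustration using Python's standard-library multiprocessing module; the workload function is a made-up CPU-bound task, not anything from the essay:

```python
# A minimal sketch of work spread across cores instead of taking turns.
# `busy_sum` is an invented CPU-bound workload for demonstration.
from multiprocessing import Pool

def busy_sum(n):
    """A CPU-bound task: sum the integers 0..n-1 the slow way."""
    total = 0
    for i in range(n):
        total += i
    return total

if __name__ == "__main__":
    # On a multi-core chip, these four tasks can run simultaneously,
    # one per core, rather than time-slicing on a single processor.
    with Pool(processes=4) as pool:
        results = pool.map(busy_sum, [10_000] * 4)
    print(results)  # four identical partial sums
```

On a single-core machine the same code still works, but the operating system is back to interleaving the four tasks, just as the old mainframe interleaved its users.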

Oddly, many think the future of computing is the mainframe again. Forget home-cooking your meals in the microwave; just grab some hot grub from the cafeteria. We would shift to "thin clients", like cell phones, with limited memory and processing power. I, for one, will weep the day that I lose my personal supercomputer.