So, on Tuesday we did bandwidth and Wednesday, memory. After that, I thought we needed a break and I told a fish story. But for Friday we were back on track with storage. What's left? Of course! Today must be CPU Day. Moore's law and all that. Intel. GHz processor speeds. Chips as hot as the surface of the sun.
Nah. As interesting as it is, I don't think I've been paying attention. From the 8008 to the Pentium, it's all one big blur: Intel with their silly code names, Microsoft with their careful balancing of features and bloat, making sure that the product of their complexity and Intel's speed remains close to unity. I wrote my own word processing program many years ago to run on the 9825 and then the 9845. My own spell checker, too, whose dictionary was built from my own material, guaranteeing it would never flag anything I considered spelled correctly*. Yes, the new stuff is better, if not necessarily as idiosyncratic. But a thousand times better? A million times better? Hell no! I'll give 'em ten times, but that's it.
One major benefit of CPU speed is that it seems to render pretty much everything obsolete. I can't tell you how much wonderful equipment can be had for a song because it doesn't run at the gigahertzage one expects in new stuff. Scientific computing is also a great beneficiary of this increased speed, because some real problems don't bloat with increased computing power; they just get easier to solve. SETI, my favorite, is in that category. Because the SETI@Home program changed its system recently, I don't know by how many orders of magnitude the average computer's processing rate has increased since the program started. Probably between one and two, or the difference between a sentence for embezzlement and one for serial murder. When you try to do signal processing in multiple dimensions (frequency, Doppler shift, and celestial coordinates), you need all the CPU you can get. (That's even more important if, as in SETI, you're not sure what you're looking for!)
OK, so it's not CPU day. What is it, then?
It's limits day.
Consider bandwidth. At 50MB/second, you can watch the best movies made with no loss of resolution, either video or audio. For print media, that's fast enough to download in seconds what it would take hours to years to read. Do we need more bandwidth? Silly human - of course we do! At least until it's great enough for a completely immersive, fully spherical experience. Then what?
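The arithmetic behind that claim is easy to check. A quick sketch, with assumed figures not taken from the column (a high-definition movie stream of roughly 25 megabits per second, and a novel of about a megabyte of plain text):

```python
# Back-of-the-envelope check on the 50 MB/s bandwidth claim.
# Assumed figures: ~25 Mb/s for a high-definition video stream,
# ~1 MB for the plain text of a novel. Both are illustrative guesses.

bandwidth_mb_per_s = 50           # link speed, in megabytes per second
movie_stream_mb_per_s = 25 / 8    # 25 megabits/s video, converted to megabytes/s

# Streaming with no loss of resolution works whenever the link outruns the stream.
assert bandwidth_mb_per_s > movie_stream_mb_per_s

# Print media: a novel that takes many hours to read downloads almost instantly.
novel_mb = 1
download_seconds = novel_mb / bandwidth_mb_per_s
print(f"novel downloads in {download_seconds * 1000:.0f} ms")
```

At these numbers the link has more than an order of magnitude of headroom over the movie, and the novel arrives in a fraction of a second, which is the point: hours to years of reading, delivered in moments.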
Consider memory. At 1GB per computer, apparently the most important thing Microsoft can think of to do with it is provide Windows Vista with "visual effects." If memory weren't essentially free, would people want to bother with that? Maybe. But even so, once there is enough memory to drive a big enough display to completely fill the visual field, then what?
Consider storage. I, a major fan of music with somewhat eclectic tastes, have managed to fill almost 100GB of disk storage with audio material. That is about an order of magnitude more than most people, which I suppose is why demand for MP3 players will be driven more by video in the future. I cannot imagine filling a terabyte drive with music MP3s. What will I do when that's the smallest drive available? Buy it and fill it with music I might, some day, listen to. And hope to live forever.
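To see why a terabyte of MP3s strains the imagination, here is a rough sketch. The per-song size and track length are assumptions for illustration (about 5 MB per song, four minutes each), not figures from the column:

```python
# Rough, assumed numbers: an average MP3 of ~5 MB
# (roughly four minutes at 160 kb/s).

mb_per_song = 5
minutes_per_song = 4

songs_in_100_gb = 100_000 // mb_per_song      # the ~100 GB collection above
songs_in_1_tb = 1_000_000 // mb_per_song      # a terabyte drive

days_to_hear_100_gb = songs_in_100_gb * minutes_per_song / (60 * 24)
days_to_hear_1_tb = songs_in_1_tb * minutes_per_song / (60 * 24)

print(f"{songs_in_100_gb:,} songs in 100 GB, ~{days_to_hear_100_gb:.0f} days of nonstop listening")
print(f"{songs_in_1_tb:,} songs in 1 TB, ~{days_to_hear_1_tb:.0f} days of nonstop listening")
```

Under these assumptions a full terabyte is on the order of 200,000 songs, well over a year of continuous playback without repeating a track. Hence the hope of living forever.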
For decades the pundits have been predicting a decrease in the exponential rate of growth in computer capabilities. They are correct, of course, in that eventually exponential growth will slow down. But looking at the three items above, I can easily see orders of magnitude improvement in all three without any theoretical showstoppers. How will we use it?
Consider CPU speed. Whether or not needed by real people, this is the one area where Science is calling, and the call will probably be heeded. By the time the real limits of parallelism, dielectric constant, heat dissipation, leakage, tunneling, and lithography are reached, quantum computers will be ready to go. Yes, they will take a while, but it will also take a while to run out of ingenuity for that list of issues that are supposed to hamper progress with conventional CPUs.
Will we have intelligent or even conscious hardware, or maybe a "prime radiant" in our lifetimes? If not, it probably won't be for lack of progress in hardware!
*Perhaps you've noticed my odd mixture of traditional and creative spelling. Then again, if you've grown up with spell checkers, perhaps you haven't. I once had someone complain to me that I "mispelled 'Gabarone'," the capital of Botswana. Most people would find this person picky and the error forgivable. Not I! Botswana is entitled to "Gaborone," and I thanked him, along with pointing out that he had misspelled "misspelled."