Computers have clearly been getting a lot better--faster, lighter, more portable--for a long time. In fact, they have been getting better at an accelerating rate.
In 1965, Gordon Moore, who went on to co-found Intel, made the observation now known as "Moore's Law": the number of transistors on a computer chip roughly doubles every two years. (His original estimate was a doubling every year; he revised it to every two years in 1975.) Moore's Law has held true for more than forty years, and many people now use it as a rule of thumb for the acceleration of computer technology as a whole; in other words, the computational capability of computers has been approximately doubling every two years.
That is a fantastic rate of increase: an exponential, or geometric, progression. To get an idea of what that might mean for future computer technology, let's imagine you start with a penny (one cent) in 1975. That's around the time the first real personal computers began to appear (the Altair 8800 in 1975, followed by the Apple II in 1977).
So, in 1975 you have 1 cent. Two years later, in 1977, you have 2 cents. 1979: 4 cents, and so on.
Here's how it would go:
32 cents in 1985.
$3.60 in 1992.
$82 in 2001.
and $1,300 in 2009.
So we have about $1,300 after starting with only a penny: an increase by a factor of roughly 130,000. That ratio gives a sense of the difference between a machine like the Apple II and the most advanced computers available today.
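If you want to check that arithmetic, here's a minimal Python sketch. It assumes a perfectly clean doubling every two years from one cent in 1975, which is of course an idealization; the function name and the list of years are just for illustration.

# Value of a penny that doubles every two years, starting in 1975.
START_YEAR = 1975
START_VALUE = 0.01  # one cent, in dollars

def value_in(year):
    """Apply one doubling for every two years elapsed since 1975."""
    doublings = (year - START_YEAR) / 2
    return START_VALUE * 2 ** doublings

for year in (1977, 1979, 1985, 1992, 2001, 2009):
    print(f"{year}: ${value_in(year):,.2f}")
# Prints roughly: 1977: $0.02, 1979: $0.04, 1985: $0.32,
# 1992: $3.62, 2001: $81.92, 2009: $1,310.72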
Future Computer Technology - A Look Ahead
What's more interesting is to take a look into the future:
$10,500 in 2015.
$84,000 in 2021.
$336,000 in 2025.
$2.7 million in 2031.
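To put those numbers in perspective, the same rule implies how much more capable a 2031 machine would be than a 2009 machine. A tiny sketch, again assuming a clean doubling every two years:

# Doublings between 2009 and 2031 under Moore's Law, and the implied factor.
doublings = (2031 - 2009) / 2   # 11 doublings
factor = 2 ** doublings         # 2 ** 11 = 2048
print(f"{doublings:.0f} doublings, roughly a {factor:,.0f}x increase")
# Prints: 11 doublings, roughly a 2,048x increase

In other words, roughly two thousand times the power of today's machines.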
So, if Moore's Law continues to hold, we are going to see a fantastic increase in the power of computer technology by 2031. It's possible that the acceleration will slow down before then because, at some point, chip fabrication technologies will hit a fundamental limit. However, there may well be new technologies like quantum or optical computing by then, or the focus may simply shift to parallel processing so that thousands of cheap processors are linked together.
Of course, there is also the problem of creating software that can take advantage of all this power. Historically, software has advanced much more slowly than hardware, so it's really software that is the bottleneck.
Software companies like Microsoft will have to come up with compelling applications to make use of all that power. I think there is a good chance that artificial intelligence is going to be one of the biggest uses of the computer technology of the future.
If that's the case, job automation technology is likely to leap forward dramatically, and computers will be capable of doing many of the routine jobs in the economy--even those held by people with college degrees. That is going to create some serious issues for the job market and for the economy as a whole. That's the main focus of my new book, The Lights in the Tunnel.