In 1965, Gordon Moore of Fairchild Semiconductor, at the age of 36, made a historic observation. While preparing data for a speech, the young scientist noticed that the number of components that could be squeezed onto a chip was doubling roughly every year. This exponential rate of improvement, now known as Moore’s Law (and popularly restated as a doubling every 18 months or so), has held for just about every measure of integrated circuit performance.
So confident was Moore of this trend that he and several others soon left their well-paying jobs at Fairchild to form a new company called Intel. In 1971, after initially concentrating on memory devices, the fledgling startup introduced a new kind of chip it billed as a “micro-programmable computer on a chip.” The Intel 4004, which later came to be known as a microprocessor, was a 4-bit device that ran at 108 kHz.
In the thirty years since then, many fortunes have been made on the staggering rate of technological improvement first noted in Gordon Moore’s memorable speech. It makes me wonder how many people crossed paths with Moore, heard his revelation firsthand, and either made good on it or are kicking themselves today for their indifference.
Consider how Moore’s prediction has been borne out in terms of microprocessor speed. Compared to the 4004, modern computer chips are four orders of magnitude faster, streaking past the mind-boggling mark of one billion clock cycles per second. To put that in perspective, picture a horse-drawn cart running alongside an F-16 at full throttle with afterburners lit. Now send a missile past both of them, 50 times faster than the jet fighter. The gap between the cart and the missile is roughly the gap between a 4004 and the 1-GHz chips we now find in many PCs.
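For the curious, here is a quick back-of-the-envelope check of that analogy, sketched in Python. The clock rates come straight from the numbers above; the vehicle speeds are rough assumptions chosen purely for illustration.

    # Back-of-the-envelope check of the cart/missile analogy.
    # Clock rates are from the column; vehicle speeds are assumed values.
    CLOCK_4004_HZ = 108e3          # Intel 4004: ~108 kHz
    CLOCK_MODERN_HZ = 1e9          # a 1-GHz PC processor

    CART_MPH = 5.0                 # assumed speed of a horse-drawn cart
    F16_MPH = 1500.0               # assumed F-16 speed with afterburners
    MISSILE_MPH = 50 * F16_MPH     # "50 times faster than the jet fighter"

    chip_ratio = CLOCK_MODERN_HZ / CLOCK_4004_HZ   # ~9,300x
    analogy_ratio = MISSILE_MPH / CART_MPH         # ~15,000x

    print(f"1-GHz chip vs. 4004: {chip_ratio:,.0f}x faster")
    print(f"missile vs. cart:    {analogy_ratio:,.0f}x faster")

Both ratios land in the same neighborhood, roughly four orders of magnitude, which is what makes the analogy hold up.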
What’s even more incredible is that the pace of technology is just beginning to heat up. Between the impact of the Internet and the anticipated advances in chip technology, the next 30 years will undoubtedly bring more opportunities, and surprises, than the last 30.
What we do about it now could determine whether we end up like the people who acted wisely on Moore’s insight or like those who chose to ignore it. We could make good, or we could find ourselves lamenting what might have been.
I’m not suggesting that the goal in life should be to go out and make a fortune. On the contrary, such nearsighted thinking is more likely to land one on the poor farm or the island of discontent. What we ought to be doing, following in Gordon Moore’s footsteps, is paying attention to the little things. Indeed, there really is something to all that advice about stopping to smell the roses.
Retrace the history of the “silicon rush” of the ’60s and ’70s, and you’ll see that it didn’t begin in the rarefied air of an executive boardroom or on one of the finely manicured greens at Pebble Beach. Rather, it started in the dusty corners of cluttered labs. It started with borrowed equipment and materials scrounged from here and there. It started in basements and garages, and on a paper-strewn tabletop with a guy named Gordon quietly plotting pencil dots on a sheet of graph paper.
If I had to venture a guess, I’d say Moore was preparing a lecture for the benefit of a group of people from whom he could expect little in return. I imagine he was working late, too, in anonymity and without fanfare, and that he probably didn’t have enough time to get his slides exactly the way he wanted them. That’s how it usually is, and that’s why folks like you and me have as good a shot as anyone at looking back one day and saying we were in on something truly significant.