The Time Gordon Moore Said We Got Moore’s Law Wrong
On the passing of Moore, recalling a 1998 conversation
Back in the 1990s and 2000s, I used to talk to Gordon Moore now and then. In fact, the Intel leadership – Moore, Andy Grove, Craig Barrett – were some of my favorite people when I was a tech journalist and they were running what was then one of the most powerful companies on the planet.
I particularly got a kick out of the exceedingly blunt Grove, who would send me one-line commentary on my published pieces such as: “This is stupid.” But that’s for another time.
Moore, who just died at 94, was a different kind of character – approachable, affable and amazingly humble for someone who had become a legend…and who had turned a $500 investment into a $7 billion personal fortune. In 1957, Moore and seven others co-founded Fairchild Semiconductor, each putting in $500. Eleven years later, Moore and Robert Noyce left and started their own chipmaker, which they called Integrated Electronics Corporation, later shortened to Intel. Grove was the next person aboard. Intel went on to invent and make the microprocessors that drove the personal computer revolution and dot-com boom. It was, for a time, the world’s most valuable company.
Anyway, in 1965, while at Fairchild, Moore published an article in Electronics titled “Cramming More Components Onto Integrated Circuits.” In the article, he proposed that these chips would improve at a steady pace, continually getting faster while also getting cheaper, which became known as Moore’s Law. As I wrote in a 1998 column about Moore’s Law: “Moore's Law is the metronome for the pace of change in technology. It states, in its most quoted form, that the number of components that can be packed on a computer chip doubles every 18 months while the price stays the same. Essentially, that means that computer power per dollar doubles every 18 months. The law, amazingly, has held true for more than 30 years.”
I wrote that column because, at the time, both Intel and IBM had announced scientific advancements that some thought would speed up chip development, going faster than Moore’s Law. So I called Moore to ask him. He was characteristically chill about it. "These things are not blowing by Moore's Law," he told me. "They're helping to keep up with it."
In fact, he went on to say, it seemed unlikely that technology would go faster than Moore’s Law. It was more likely the opposite would happen. As the circuits get exponentially smaller, he said then, there will come a time when it will be nearly impossible to make them any tinier, and the pace of improvement will slow.
Fast-forward to 2022, and Intel and its rival Nvidia got into a bit of a war of words about Moore’s Law. Nvidia CEO Jensen Huang declared Moore’s Law over; Intel responded that it was still alive and well.
But when I talked to Moore in 1998, this was his prediction: "By 2010 or 2020, we'll see a slowing in our ability to make things smaller.” Obviously, we’re past those dates now.
When he told me that, I asked Moore if it meant he’d write another article revising Moore’s Law. He pointed out that he actually did that once before. "In 1965, I said the number of components would double every year. In 1975, I updated it."
Ah! So, I said to Moore, that’s when you updated it to say components would double every 18 months, like everyone says? "No. Every two years, which has held true. I never said 18 months."
Haha! The whole world misquoted Moore’s Law for over 40 years and believed the misquote was true. Somehow that seems even more legendary.
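The gap between the misquote and what Moore actually said is bigger than it sounds, because doubling compounds. As a back-of-envelope illustration (the 30-year span is the figure from my 1998 column, not anything Moore computed), here is how much an 18-month doubling period overstates growth versus Moore's actual two-year figure:

```python
# Back-of-envelope: how much the "18 months" misquote overstates growth.
YEARS = 30  # roughly how long the law had held when the 1998 column ran

doublings_misquote = YEARS / 1.5   # "every 18 months," the popular version
doublings_actual = YEARS / 2.0     # Moore's 1975 revision: every two years

growth_misquote = 2 ** doublings_misquote  # 2^20
growth_actual = 2 ** doublings_actual      # 2^15

print(f"18-month doubling over {YEARS} years: {growth_misquote:,.0f}x")
print(f"2-year doubling over {YEARS} years:  {growth_actual:,.0f}x")
print(f"Overstatement factor: {growth_misquote / growth_actual:,.0f}x")
```

Over 30 years, the popular version implies chips improved about a million-fold; Moore's actual law implies about 33,000-fold. The misquote overstates the improvement by a factor of 32.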
—
Here is the column I wrote for USA Today about my conversation with Moore. I can’t find the physical clip from the newspaper but you can find it online here.
Breakthroughs affirm computer guru's growth theory
Moore's Law is kaput? Nuked? Whacked?
Gordon Moore laughs. Moore is the legendary co-founder of Intel and the man for whom Moore's Law is named. The warm laugh is the kind amused adults use when a little tyke says something outrageous.
I've asked him if innovations unveiled in the past week blow past his theory governing all of computerdom.
"These things are not blowing by Moore's Law," he says. "They're helping to keep up with it."
Moore's Law is the metronome for the pace of change in technology. It states, in its most quoted form, that the number of components that can be packed on a computer chip doubles every 18 months while the price stays the same. Essentially, that means that computer power per dollar doubles every 18 months. The law, amazingly, has held true for more than 30 years. Companies from Bell Atlantic to BackWeb Technologies build their plans around it.
To technology people, saying Moore's Law is obsolete is like telling airplane makers the law of gravity has changed. Yet, in the past week, pundits and news stories have proclaimed that recent developments mean technology will race ahead faster than Moore's Law predicts.
The first development came from Intel. It introduced an advance that would let it pack twice as much data in a memory chip. A few days later, IBM said it had developed a way to use microscopic copper wiring instead of aluminum in computer chips, which would help IBM make faster, cheaper microprocessors.
But neither alters Moore's Law. They only remove barriers that would have hindered technology from keeping up with the law. And that's what always happens. The past 30 years, there have been lots of barriers to Moore's Law. But someone always comes up with a way to move past them. And the law marches on.
It's been so on-target for so long, no one is quite sure whether its pace is inevitable, or whether Moore's Law has become Moore's Goal, and everybody works to try to keep the pace going. Moore leans toward the latter. "Companies realize they have to keep up with Moore's Law or fall behind," he says. "So it's really become kind of a driving force."
Others see it differently. "I used to think (Moore's Law) was a historical curiosity. As I continued to work on it, I thought it was a self-fulfilling prophecy," says Randy Isaac, head of basic science at IBM. "Now I view it more as a self-consistent economic cycle."
A self-what? Isaac explains that, basically, there are a lot of factors - expectations, money and many different pieces of technology - that feed and play off one another to perpetuate Moore's Law. "It just hangs together," he says.
And, somehow, breakthroughs come when they're needed. IBM's was one. Chips long have used aluminum wires because copper doesn't work well with silicon. But aluminum doesn't conduct electricity as well as copper. In a few years, the wires in chips will be so tiny, not enough electricity could move through aluminum wires. That would brake the pace of change.
IBM found a way to make copper wires work with silicon so enough electricity can get through the ever-shrinking wires. "It removes that barrier and allows us to continue with Moore's Law," Isaac says.
Still, Moore's Law probably won't hold true forever. There are serious barriers ahead. Optical lithography, used in making chips, will reach its limits in about 10 years. Insulators on chips are only four or five atoms thick. They can't get much smaller.
"By 2010 or 2020, we'll see a slowing in our ability to make things smaller," Moore says.
He then might have to retool Moore's Law, which, not many people know, he's done before. "In 1965, I said the number of components would double every year. In 1975, I updated it."
To say the number would double every 18 months, like everyone says?
"No. Every two years, which has held true. I never said 18 months."
Whoops.