Intel’s Disruption is Now Complete

James Allworth
Nov 11, 2020


“Look, Clayton, I’m a busy man and I don’t have time to read drivel from academics, but someone told me you had this theory… and I’m wondering if you could come out to present what you’re learning to me and my staff, and tell us how it applies to Intel.”

So begins the story that Clay Christensen would love to tell about how Andy Grove of Intel famously became a convert to the theory of disruption. Christensen shared with Grove his research on how steel minimills had gained a foothold at the low end of the market, used it to expand the addressable market, moved steadily upmarket, and finally disrupted giant incumbents like US Steel.

Grove immediately grokked it.

Grove and Christensen on the cover of Forbes magazine in 1999

A couple of years later, after Grove had retired as CEO of Intel but remained its Executive Chairman, he stood on stage at Comdex. He told the world that the book he now held in his hands — Christensen’s just-published The Innovator’s Dilemma — was “the most important book he’d read in a decade”.

Grove used the lessons of Christensen’s research to guide Intel throughout his tenure. One of the most famous examples was Grove pushing Intel to do something that companies rarely have the appetite to do: launch a low-margin product that cannibalized its high-end products. But Intel did it — it introduced the Celeron processor in 1998. The Celeron did cannibalize the Pentium to an extent, but it also enabled Intel to capture 35% of the market it was competing in. Perhaps more importantly, it staved off threats from the low end.

Under Grove’s steady hand, Intel built the chips that effectively powered the personal computer that soon sat in every home and on every desk. Alongside Microsoft, it became synonymous with the desktop computer — and famously valuable.

It was not until 2005 that Grove retired from Intel. That happened to be the same year that Paul Otellini took over as CEO. Things seemed to be off to an auspicious start for Intel and Otellini: Apple, whose Mac business was on a tear at the time, was basically the only desktop computer manufacturer still holding out from the x86 world that Intel represented.

And Apple converted. Steve Jobs invited Otellini on stage at Macworld to make the announcement.

Intel’s victory seemed complete.

Indeed, that deal between Apple and Intel was more important for Intel than it could possibly have realized. But it wasn’t because Intel had sewn up the last of the desktop computer processor market. Instead, it was because Intel had just developed a relationship with a company that was thinking about what was coming next. And when Apple were figuring out how to power it — and by it, I’m talking about the iPhone — they came to their new partner, Intel, with first right of refusal to design the chips to do so.

How did Intel respond?

Well, we’re fortunate to have a published interview with Otellini from his last month on the job as Intel CEO. Here’s what he decided when presented with the option to power the iPhone:

"We ended up not winning it or passing on it, depending on how you want to view it. And the world would have been a lot different if we'd done it," Otellini told me in a two-hour conversation during his last month at Intel. "The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it. It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought.”

Otellini presented it as a single decision. Of course, the truth inside any big organization is usually more complicated than that — a whole host of decisions led Intel to the point where it passed on such an opportunity. My Exponent co-host, Ben Thompson of Stratechery, characterized it like this:

Intel’s fall from king of the industry to observer of its fate was already in motion by 2005: despite the fact Intel had an ARM license for its XScale business, the company refused to focus on power efficiency and preferred to dictate designs to customers like Apple, contemplating their new iPhone, instead of trying to accommodate them (like TSMC).

Either way, though, the end result was the same. Just like its partner in the PC era, Microsoft, Intel was so ensnared by its success in the PC paradigm that it couldn’t see beyond it. With that success — and the margins associated with it — it saw no need to question its winning formula: the integration of chip design and manufacturing. A promising customer comes knocking with something that doesn’t look to have the same margins as the existing business?

Not interested.

Yesterday, Apple announced the first Macs that will run on silicon that they themselves designed. No longer will Intel be inside. It’s the first change in the architecture of the CPU that the Mac runs on since… well, 2005, when they switched to Intel.

There’s a lot of great coverage of the new chips, but one piece of analysis in particular stood out to me — this chart over at AnandTech:

What’s interesting about this chart? Well, it turns out that it bears a striking resemblance to one drawn before — 25 years ago, in fact. Take a look at this chart, drawn by Clayton Christensen back in 1995 in his very first article on disruptive innovation:

He might not have realized it at the time, but when Grove was reading Christensen’s work, he wasn’t just reading about how Intel would go on to conquer the personal computer market. He was also reading about what would eventually befall the company he co-founded, 25 years before it happened.

The causal mechanism behind disruption that Grove so quickly understood was this: even if a disruptive innovation starts off inferior, by dramatically expanding the market it can improve at a far greater rate than the incumbent. That mechanism was what enabled Intel (and Microsoft) to win the computing market in the first place: even though personal computers were cheaper, selling something that sat in every home and on every desk ended up funding far more R&D than selling a few very expensive servers that only existed in server rooms.
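
To make the mechanism concrete, here is a toy numerical sketch of the two trajectories (in Python; every number in it is an illustrative assumption, not real benchmark data). The entrant starts an order of magnitude behind, but because its much larger market funds a faster rate of improvement, the lines inevitably cross:

```python
# Toy model of Christensen's crossing trajectories.
# All figures below are made-up illustrative assumptions.

incumbent_perf = 100.0    # incumbent starts far ahead...
incumbent_growth = 1.10   # ...but improves ~10% per year

entrant_perf = 10.0       # entrant starts an order of magnitude behind...
entrant_growth = 1.40     # ...but a far larger market funds ~40% yearly gains

year = 0
while entrant_perf < incumbent_perf:
    incumbent_perf *= incumbent_growth
    entrant_perf *= entrant_growth
    year += 1

print(f"Trajectories cross in year {year}: "
      f"entrant at {entrant_perf:.0f} vs incumbent at {incumbent_perf:.0f}")
```

Tweak the growth rates however you like; so long as the entrant’s is higher, the crossover is a matter of when, not if.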

Similarly, Apple’s initial foray into chips didn’t produce anything that special in terms of silicon. But it didn’t need to — people were happy just to have a computer they could keep in their pocket. Apple has gone on to sell a lot of iPhones, and all those sales have funded a lot of R&D. The silicon inside them has kept improving, and improving, and improving. And Apple’s fab partner, TSMC, has come along for the ride.

Finally, today marks the day when, for Intel, those two lines on the graph intersect. Unlike the last time the two lines crossed in the personal computer market, Intel is not the one doing the disrupting. And now it’s just a matter of time before ARM-based chips continue their march upmarket into Intel’s last refuge: the server business.

Things are not going to go well for them from here on out.

If you enjoyed this article, you might like my free newsletter here — I’ll email you (very occasionally!) when there’s a new post. You might also enjoy Exponent, the podcast I co-host with Ben Thompson of Stratechery. We talk about this article on episode 190.

Finally, you can also reach me on Twitter at @jamesallworth.

Written by James Allworth

Co-host @exponentFM, Co-author @MeasureYourLife, Fellow @ClayChristensen's thinktank, writer @HarvardBiz