
How Microsoft lost its swagger—and why it’s back

For years, the software giant was all-powerful. Then it lost its touch. Under CEO Satya Nadella, it’s figured out how to matter in a world that no longer revolves around it.


I don’t recall exactly when my father upbraided me for not having urged him to invest in Microsoft the instant we first heard about the company—but the moment is still vivid in my memory.

Not because he was actually vexed by my failure to help him make a fortune, which he wasn’t. When we became aware of Microsoft, it was about eight years away from its IPO. More to the point, I was a junior high student, and nobody’s idea of a fount of financial wisdom.

Still, my dad’s jokey twinge of regret did reflect the fact that we became acquainted with Microsoft well before it was anywhere near reaching cofounders Bill Gates and Paul Allen’s famous goal of “A computer on every desk, and in every home, running Microsoft software.” Starting in mid-1978, we used Microsoft’s Level II BASIC programming language on our Radio Shack TRS-80, one of the first mainstream microcomputers. The company was just three years old, and its entire staff could fit into one snapshot.

And I bring all this up now because, all these years later, I’m still thinking about Microsoft. It’s the subject of my cover story for the new issue of Fast Company. Based on my conversations with the CEO, Satya Nadella, other executives, and outside observers, it’s a look at Nadella’s big bet on generative AI, which includes Microsoft’s multibillion-dollar partnership with OpenAI but has roots dating back decades.

The fact that a tech company founded back when America’s top-rated TV show was All in the Family has a shot at leading the current AI revolution is remarkable in itself. Skimming through a random 400-page copy of Byte magazine from 1980, I’m awash in ads for companies that no longer exist—and several from Microsoft, which seems to be the only software-centric one that still does.

The issue does contain an ad from another business founded by two guys with more enthusiasm than experience that, improbably, is also one of the planet’s most important companies almost 50 years later: Apple. That company’s story—early success, then existential crisis, then an astounding rebound—is well-documented. But far less has been written about Microsoft’s long history, which—since it’s been a profit machine all along—might seem to lack for suspense.

Still, Microsoft did have a midlife crisis of its own, though not one that threatened its bottom line, let alone called its continued viability into question. In fact, it stemmed from the company being so wrapped up in its own past successes that it felt a bit adrift.

To fast-forward through Microsoft’s boom years, its 1980 decision to pay $25,000 to license an obscure operating system and turn it into MS-DOS helped define the IBM PC standard. Once the company established a foothold on most of the world’s computers, it used every weapon at its disposal against rivals in key areas, such as graphical user interfaces, productivity software, and web browsers. Its ultra-hardball tactics eventually led to a landmark antitrust case, and briefly, the prospect of a government-mandated breakup. All along, however, it also benefited from strategic blunders made by once-imposing players, such as Lotus and Netscape: Microsoft may have been out to crush them, but they did just as much to crush themselves.

In the late ’90s and early aughts, with the competition largely vanquished, Microsoft’s dominance of PC software made it all-powerful in ways no tech giant is today. It was easy to conclude that it might stay that way forever. Working at the Windows-centric PC World magazine, I certainly did. And so, it seemed, did Microsoft, in ways that repeatedly led it astray.

For example, the company was early to the race to build operating systems for pocket-size computers, but it instinctively did so by cramming Windows onto a small screen, complete with a dinky Start button. That proposition was questionable from the start, and looked downright silly in the wake of the iPhone. By the time Microsoft came out with a daringly inventive mobile OS of its own, it was just too late.

Then there’s web browsing. When Internet Explorer thoroughly trounced Netscape and its market share crept close to 100%, Microsoft got so complacent that it allowed IE to devolve into a rickety, outdated security nightmare. Hungry for a browser that didn’t stink, users abandoned it in droves for Firefox and, later, Google Chrome.

A final example: Zune. Microsoft’s ill-fated iPod wannabe turned into a reliable laugh line almost the moment it was released. It’s even a running gag in Marvel’s Guardians of the Galaxy movies. But even though the Zune wasn’t without its virtues, it was a distraction. While Microsoft was scrambling to play catch-up with the iPod, Apple was secretly working on the iPhone. I don’t have to tell you which company made the better bet.

This whole era didn’t end all at once, but there is an undeniable inflection point: Nadella’s appointment as CEO in 2014. To pick up on the examples above: After succeeding Steve Ballmer, Nadella got Microsoft out of the phone OS business, a painful but realistic decision. He killed Internet Explorer in favor of Edge, a web browser that’s both inventive and based on Google technology (!). And he suppressed the reflexive streak of copycat-ism that gave us duds like the Zune.

It’s pretty obvious that the end of the overwhelmingly Microsoft-centric age of computing was a positive development for people who use tech products, not to mention the rest of the industry. Unexpectedly, it’s also been good for Microsoft. Nadella is known for the sense of self-awareness he’s brought to the company, which I wrote about in a 2017 profile. But in areas such as AI, this newer, more humble Microsoft also has some of its old swagger back—not bad for a 48-year-old that once seemed well past its prime.

FOUR STORIES TO READ

My picks this week, from Fast Company and elsewhere:

The man who digitized design. I was sorry to hear about the death of Adobe cofounder John Warnock, whose work with the company’s other founder, Chuck Geschke, had a transformative impact on the world, beginning in an era when simply rendering an attractive typeface on a computer was a technological breakthrough. My colleague Jesus Diaz wrote an appreciation of the man and his work.

L’état, c’est Musk. If you’re tired of obsessing over Elon Musk’s management of Twitter, make time for Ronan Farrow’s New Yorker piece on his creeping influence over the U.S. government, which might be the bigger, more alarming deal.

Crypto dystopia. In an excerpt from his new book, Number Go Up, Bloomberg’s Zeke Faux writes about a nightmarish scam involving faked WhatsApp miscommunications and Tether, a cryptocurrency that’s especially popular among grifters.

When presentation slides really were slides. MIT Technology Review’s Claire L. Evans has a remarkable story on the pre-PowerPoint era of splashy corporate presentations. It’s worth it for the photos alone, but read the whole thing.


ABOUT THE AUTHOR

Harry McCracken is the technology editor for Fast Company, based in San Francisco. In past lives, he was editor at large for Time magazine, founder and editor of Technologizer, and editor of PC World.
