Why AI is still stuck in its dial-up era
Today’s generative AI has a clunky interface. It’s slow. It can be expensive. And it still isn’t pervasive enough to change everything. What’s not to love?
If you were using computers in even the waning days of the previous century, you remember the routine. With a flurry of touch-tone beeps, your PC dials a phone number. An extended screech follows as it negotiates a connection with the server on the other end of the line. Then—and only then—are the wonders of the online world available to you. And when you hang up, they’re gone.
I’m not indulging in random, unprovoked nostalgia. The dial-up era bears a striking resemblance to today’s generative AI moment. The more I muse about their similarities, the clearer it is that AI has far to go before it lives up to its overwhelming hype.
The most obvious kinship between 20th-century dial-up and 21st-century AI is one of look and feel. Ever since my first encounter with ChatGPT, interacting with the service has reminded me of dialing into a BBS, which I began doing well before there was such a thing as the World Wide Web. The user interfaces are borderline identical: You type stuff into an empty window, press RETURN, and get text back in response. You can even watch an AI chatbot’s answers arrive character by character, as if you were using a 1200-baud modem. In an era when most computing experiences provide instantaneous, graphical feedback to our points, clicks, and swipes, that’s a pretty rudimentary (albeit powerful) way to engage with a piece of software.
AI’s often plodding performance has less to do with the speed of your internet connection than with the fact that it’s pulling off astonishing computational magic that taxes even the supercomputers it runs on. But the technology’s delayed gratification (or lack of gratification) reminds me of the days when I sometimes decided against doing something online, such as downloading a multi-megabyte app, simply because the payoff wasn’t worth the wait. For example, when I tried Microsoft’s Copilot inside Word, Excel, and other apps, the results were often a letdown given how long they took to appear—not exactly an incentive to give them even more of my time.
The parallels between AI and early online activity are more than screen-deep. High-speed internet is now an assumed part of everything we do on all our devices: always available, unlimited, pervasive. (Mostly. I’m not counting dead zones and data caps.) Dial-up, by contrast, was something you used in isolation from all your other PC activities, keenly aware that you were paying by the hour. Did I mention that you couldn’t use it at all if someone else was hogging the line?
Generative AI features often have a similar walled-off, metered feel. The very fact that apps frequently shove them all into a single sparkle icon is a sign that they haven’t been deployed in a truly integrated fashion. (When did you last see an app with an “internet” button?) I’m giving OpenAI $20 a month for ChatGPT Plus—the same price that AOL once cost—and yet I don’t get unlimited access to its GPT-4o large language model. Thank goodness I can live without it!
Instead of seeing AI as a default feature, many software companies are using it as an upselling opportunity. For instance, Microsoft 365’s Copilot costs an extra $30 a month, guaranteeing that companies aren’t going to dole it out widely without careful consideration. It’s as if employers had to decide whether specific employees really needed online access or not—which, once upon a time, they did.
Now, people certainly are diving into AI far more quickly than they entered cyberspace. The first BBSes and consumer online services were available by the end of the 1970s, and even in 1994, only 6% of Americans were online. Today, according to a Reuters Institute study, 32% of us have checked out ChatGPT. However, only 18% use it at least weekly, suggesting that the service hasn’t become an internet-like necessity even for a sizable chunk of people who have given it a try.
The industry can’t will AI ubiquity into existence. A recent Bloomberg article by Brody Ford, Ian King, and Evan Gorelick reports that only 3% of PCs sold this year will be “AI PCs” and that major software companies are skittish about tailoring their apps for such a tiny sliver of the market. Even Apple’s thoughtful “Apple Intelligence” approach to adding more AI to its platforms won’t be ready in time for iOS 18 and other operating-system upgrades due this fall, and won’t work on millions of older devices still in service. At this pace, how much longer will it take until every desktop PC, laptop, phone, and tablet is AI-ready?
The internet didn’t truly live up to its potential until dial-up faded away. According to the Pew Research Center, it took until 2007 for broadband to reach 51% of U.S. households, which means that even then, 49% either had dial-up or no online connection at all. So give AI time. Someday, we’ll get it in a form that makes today’s ChatGPT feel as charmingly old-timey as AOL circa 1998 does now. Just don’t count on it happening in the next year or three.
You’ve been reading Plugged In, Fast Company’s weekly tech newsletter from me, global technology editor Harry McCracken. If a friend or colleague forwarded this edition to you—or if you’re reading it on FastCompany.com—you can check out previous issues and sign up to get it yourself every Wednesday morning. I love hearing from you: Ping me at hmccracken@fastcompany.com with your feedback and ideas for future newsletters.