The AI industry is growing up. And the stakes are only getting higher

While 2023 was a big year for advances in the biggest component parts of the AI hardware and software stack—LLMs and GPUs—a whole ecosystem of supporting infrastructure is just beginning to define itself.

ChatGPT was the tech story of 2023 because it put the power of language AI directly into the hands of consumers. And yet ChatGPT was just the outside wrapper, the user interface, sitting atop a stack of foundation models, server chips, and other enabling technologies all working in the background. And it’s from within the realm of these enabling technologies that some of 2023’s most meaningful advancements came.

ChatGPT got a brain upgrade with the addition of OpenAI’s new GPT-4 model in March 2023. GPT-4 immediately outperformed competing models from Google, Meta, and Anthropic on a range of benchmarks, delivering more factual answers with fewer “hallucinations,” better reasoning and problem-solving (it reached human-level scores on standardized tests), and enhanced safety guardrails. Importantly, GPT-4 is multimodal; that is, it can generate responses from both text and image input.

The surprising performance of LLMs like GPT-4 owes a lot to supersizing not just the models but also the amount of computing power used to train them. Training GPT-4 reportedly required between 10,000 and 25,000 of Nvidia’s A100 graphics processing units (GPUs). In fact, the GPUs used to train the biggest LLMs have come almost exclusively from Nvidia, whose chips have become table stakes for researchers trying to build toward state-of-the-art performance.

Even as many enterprises struggled to apply generative AI in meaningful ways in 2023, the tech industry sank billions into the race toward bigger and smarter models. Meta made big waves in 2023 with the release of its family of Llama LLMs on the open-source repository Hugging Face (Llama models have been downloaded more than 30 million times). In July, Anthropic debuted its new Claude 2 LLM, which was immediately recognized for its nuanced text generation and summarization skills and its capacity to remember hundreds of pages of prompt information. Cohere continued pushing on the performance of its language models, which Amazon AWS began distributing through its Bedrock AI platform in July.

While 2023 was a big year for advances in the biggest component parts of the AI hardware and software stack—LLMs and GPUs—a whole ecosystem of supporting infrastructure is just beginning to define itself. It’s quite possible that 2024’s biggest stories will concern smaller models that run on phones, or new governance systems that hold AI systems accountable to company policies, to transparency requirements from laws like the EU’s AI Act, and to state laws on data privacy, bias, and discrimination.

ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.
