Generative AI has developed so quickly in the past two years that massive breakthroughs seemed more a question of "when" than "if." But in recent weeks, Silicon Valley has grown increasingly concerned that advancements are slowing. One early indication is the lack of progress between models released by the biggest players in the space. OpenAI is reportedly seeing a significantly smaller quality improvement in its next model, GPT-5; Anthropic has delayed the release of its most powerful model, Opus, according to wording that was removed from its website; and even at tech giant Google, the upcoming version of Gemini is reportedly not living up to internal expectations.

If progress is plateauing, it would call into question a core assumption that Silicon Valley has treated as religion: scaling laws. The idea is that adding more computing power and more data guarantees better models to an infinite degree. But these recent developments suggest scaling laws may be more theory than law. The key problem could be that AI companies are running out of data to train models on, hitting what experts call the "data wall." Instead, they're turning to synthetic data, meaning data generated by AI itself.

CNBC's Deirdre Bosa explores whether AI progress is slowing and what it means for the industry.

Chapters:
0:00-1:03 Introduction
1:03-6:12 Chapter 1 - AI performance anxiety
6:12-10:28 Chapter 2 - Why progress is slowing
10:28-13:25 Chapter 3 - The search for use cases

Anchor: Deirdre Bosa
Produced by Jasmin |