The progress of artificial intelligence (AI) is hitting a speed bump. OpenAI’s upcoming model, Orion, promises only minor improvements over GPT-4, signaling that rapid advancements in AI might be slowing. Orion, expected to debut in early 2025, is OpenAI’s next-generation language model. Despite the high bar set by the major leaps from GPT-3 to GPT-4, AI might be approaching the limits of its current capabilities.
According to reports from The Information, employees at OpenAI describe Orion’s performance gains as “far smaller” than what was achieved in previous upgrades. In particular, Orion’s coding capabilities appear only slightly better, if at all, than those of GPT-4. This suggests that, for now, AI’s rapid evolution may be losing momentum.
Hitting Barriers in AI Development
The report attributes the slowdown to a “dwindling supply of high-quality text and other data.” OpenAI has reportedly consumed much of the best available data in developing previous models, leaving less material to work with for new ones.
To address this, the company has begun supplementing Orion’s training with synthetic data from earlier models, including GPT-4. However, some employees worry this approach could lead to models that simply mimic their predecessors rather than surpass them.
“For general-knowledge tasks, it seems we’re reaching a performance plateau with large language models,” Ion Stoica, cofounder and executive chair at Databricks, told The Information. He added that “factual data” remains more valuable than synthetic data in training these models.
The slowing improvement also brings financial challenges. Orion requires far more resources to train and operate than previous models. As a result, each new step in AI development is becoming more expensive. This trend could ultimately make it financially difficult to sustain further AI advancements without breakthroughs in efficiency.
Beyond financial implications, Orion’s high operational demands have environmental consequences. Because Orion and similar AI models require extensive data center operations, they contribute to a larger carbon footprint. This has raised concerns among industry experts, who argue that scaling up AI without a matching increase in efficiency could strain global energy resources. With data centers already responsible for significant carbon emissions, the balance between AI progress and environmental stewardship is now in question.
A Shift in Strategy
In response to these hurdles, OpenAI has formed a dedicated foundations team to find new ways of advancing its models. This team is focused on improving Orion’s capabilities after the initial training process and on getting more out of limited data.
OpenAI CEO Sam Altman remains optimistic that the path to artificial general intelligence (AGI) is clear, though it requires creative use of existing models. He points to combining LLMs with reasoning approaches like the o1 model and agentic AI.
But while these efforts may help maintain progress, they indicate a shift from the rapid advancements that have characterized AI’s growth so far.
Orion’s stagnation reflects a broader reality for AI: its incredible progress might be starting to plateau. This could mark a turning point as OpenAI and other companies look for new paths to keep AI advancing in a world with finite resources.