Over the past decade, one of the most striking developments in artificial intelligence has been the rapid growth in the compute used to train models. As the chart shows, training compute per model has been doubling approximately every five months, equivalent to an average annual growth rate of roughly 4.7x since 2010.
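For readers who want to sanity-check those two figures, they are roughly consistent with each other: a 4.7x annual growth rate implies a doubling time of about five and a half months. The short sketch below is illustrative only, using the values quoted above as inputs.

```python
import math

# Illustrative arithmetic: how a doubling time relates to an annual growth factor.
# The ~5-month and ~4.7x figures are the ones quoted from the chart, not new data.
doubling_time_months = 5.4  # assumed doubling time, in months
annual_growth = 2 ** (12 / doubling_time_months)
print(f"Doubling every {doubling_time_months} months ≈ {annual_growth:.1f}x per year")

# Going the other way: a 4.7x annual growth rate implies this doubling time.
growth_per_year = 4.7
implied_doubling = 12 * math.log(2) / math.log(growth_per_year)
print(f"{growth_per_year}x per year ≈ doubling every {implied_doubling:.1f} months")
```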
This surge is not solely due to advances in hardware; it is also being fuelled by substantial corporate investment and an increasingly competitive R&D landscape. The rise of modern AI systems is more than just a technical leap – it marks a strategic shift with wide-ranging implications.
Key Takeaways for Organisations
- A New Investment Frontier: Large language models and multi-modal architectures are no longer just academic pursuits – they are now capital-intensive strategic assets. AI is becoming less of a “future-facing” concept and more of a central pillar of today’s competitive advantage.
- Infrastructure & Energy Implications: As compute requirements grow, organisations must plan not only for scalable infrastructure, but also for energy efficiency and sustainability. These factors will increasingly shape the conversation around AI readiness and implementation.
- The Foundation of the Knowledge Economy: AI-generated intelligence is fast becoming a core driver of economic growth and innovation. In this landscape, intelligence is no longer just an end product; it is itself a value-generating asset within the economic chain.
Looking Ahead
If the current trend continues, the compute required to train a single AI model a few years from now may far surpass that of today's largest systems. This trajectory is likely to intensify ongoing debates around ethics, regulation, environmental impact, and long-term AI governance.
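To make "far surpass" concrete: if the roughly 4.7x annual growth rate quoted above simply persisted, a model trained three years from now would use on the order of 100x the compute of today's largest runs. The extrapolation below is purely illustrative and ignores hardware, energy, and economic constraints.

```python
# Rough extrapolation, assuming the ~4.7x/year trend simply continues.
# This only shows what the headline growth rate implies if taken at face value.
growth_per_year = 4.7
for years in (1, 3, 5):
    multiple = growth_per_year ** years
    print(f"After {years} year(s): ~{multiple:,.0f}x today's largest training runs")
```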