We’re releasing an analysis showing that since 2012, the amount of compute used in the largest AI training runs has been increasing exponentially with a 3.4-month doubling time (by comparison, Moore’s Law had a 2-year doubling period)[^footnote-correction]. Since 2012, this metric has grown by more than 300,000x (a 2-year doubling period would yield only a 7x increase). Improvements in compute have been a key component of AI progress, so as long as this trend continues, it’s worth preparing for the implications of systems far outside today’s capabilities.
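The comparison between the two doubling rates can be sketched with the standard compound-growth formula, `growth = 2^(elapsed / doubling_time)`. The ~5.5-year elapsed span below is an illustrative assumption (roughly 2012 to the time of the analysis); the 300,000x figure in the text comes from measured training runs, not from this formula, so the fast-doubling output here is only a ballpark:

```python
import math

def growth_factor(elapsed_months: float, doubling_months: float) -> float:
    """Multiplicative growth after `elapsed_months` at a fixed doubling time."""
    return 2 ** (elapsed_months / doubling_months)

# Assumed span for illustration: ~5.5 years of the trend.
elapsed = 5.5 * 12  # months

fast = growth_factor(elapsed, 3.4)   # 3.4-month doubling (observed AI-compute trend)
moore = growth_factor(elapsed, 24)   # 2-year doubling (Moore's-Law-style baseline)

print(f"3.4-month doubling: {fast:,.0f}x")   # hundreds of thousands of times
print(f"2-year doubling:    {moore:.1f}x")   # roughly 7x, as the text notes
```

The 2-year baseline reproduces the ~7x figure almost exactly; the 3.4-month curve lands in the same hundreds-of-thousands regime as the measured 300,000x.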
Originally published on OpenAI News.