Thread Easy

Your all-in-one partner for Twitter threads



OpenAI has historically scaled up training compute by around 100x with each new generation of its GPT.

However, GPT-5 appears to be an exception to this trend.

🧵


GPT-4 was trained on 2e25 floating-point operations, and OpenAI said GPT-4.5 was about an order-of-magnitude (10x) scale-up. We don’t have a rigorous estimate yet, but GPT-5’s compute scale may be *between* GPT-4 and GPT-4.5, and it is probably not a large scale-up from 4.5.

Epoch AI
Fri Aug 08 18:18:49