
If we find out in 2024: was o1's Transformer base trained on 10×+ as much compute as GPT-4's?
10 traders · Ṁ1,080 · Jan 2
19% chance
This question is managed and resolved by Manifold.
Related questions
Will $10,000 worth of AI hardware be able to train a GPT-3 equivalent model in under 1 hour, by EOY 2027?
18% chance
Will Inflection AI have a model that is 10× the size of the original GPT-4 at the end of Q1 2025?
14% chance
Will there be an LLM (as good as GPT-4) that was trained with 1/100th the energy consumed to train GPT-4, by 2026?
82% chance
Will there be evidence in 2025 that in April 2023, OpenAI had a GPT-4.5 or higher model?
16% chance
In yottaFLOPs (10^24), how much compute will GPT-4 be trained with?
22
Will a model be trained using at least as much compute as GPT-3 using AMD GPUs before Jan 1 2026?
84% chance
Will we have an open-source model that is equivalent to GPT-4 by end of 2025?
82% chance
How much compute will be used to train GPT-5?
Will an open source model beat GPT-4 in 2024?
76% chance
Will a language model comparable to GPT-4 be trained with ~1/10th the amount of energy it took to train GPT-4, by 2028?
92% chance