
How Will the LLM Hallucination Problem Be Solved?
- Vector Embeddings (as with Pinecone https://www.pinecone.io/): 8%
- Filtering (as with Deepmind AlphaCode https://www.deepmind.com/blog/competitive-programming-with-alphacode): 2%
- Ensemble Combined with Fine Tuning: 0.1%
- Joint Embedding Predictive Architecture (https://arxiv.org/pdf/2301.08243.pdf): 0.5%
- Feed Forward Algorithms (https://www.cs.toronto.edu/~hinton/FFA13.pdf): 1%
- Bigger model trained on more data + RL: 9%
- Vigger models + prompt engineering: 1.4%
- It won't be: 48%
- Giving all LLMs access to the internet and databases of scientific papers: 20%
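To make the "Vector Embeddings" option above concrete: the idea is retrieval-based grounding, where documents are embedded as vectors, the nearest ones to a query are retrieved, and the retrieved text is placed in the prompt so the model answers from facts rather than guessing. A minimal sketch follows; the `toy_embed` bag-of-words function is a stand-in for a real learned embedding model (a service like Pinecone would index such vectors at scale), and all names here are illustrative, not any particular library's API.

```python
import math
from collections import Counter

def toy_embed(text: str) -> Counter:
    # Bag-of-words "embedding"; a real system would use a learned model.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Standard cosine similarity between two sparse vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    # Rank documents by similarity to the query and return the top k.
    q = toy_embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, toy_embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "The Eiffel Tower is 330 metres tall.",
    "Python was created by Guido van Rossum.",
]
context = retrieve("How tall is the Eiffel Tower?", docs)
# The retrieved passage is prepended to the LLM prompt as grounding.
prompt = f"Answer using only this context: {context[0]}\nQ: How tall is the Eiffel Tower?"
print(context[0])
```

Whether this fully "solves" hallucination is exactly what the market disputes: retrieval grounds the answer, but the model can still misread or ignore the retrieved context.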
By the year 2028, how will the Hallucination Problem have been solved for the vast majority of applications out there?
This question is managed and resolved by Manifold.
@LukeFrymire this would imply that either the options in my market are overvalued or your market is overvalued.
@PatrickDelaney I think the resolution criteria are fairly different. Mine requires that a scale-based solution is possible; yours requires it to be the primary method in production.
@VictorLevoso Ugh, I hit v instead of b on the keyboard and didn't look at the question properly before clicking submit, and now I can't edit or erase it.
Related questions
- Will LLMs be banned at the 2026 MIT Mystery Hunt? (24% chance)
- Will LLM hallucinations be a fixed problem by the end of 2025? (8% chance)
- LLM Hallucination: Will an LLM score >90% on SimpleQA before 2026? (60% chance)
- Which LLM will come up with at least one funny idea?
- Will LLM hallucinations be a fixed problem by the end of 2028? (49% chance)
- Will hallucinations (made up facts) created by LLMs go below 1% on specific corpora before 2025? (38% chance)
- Will scaling current methods be enough to eliminate LLM hallucination? (15% chance)
- How will the data shortage for LLMs get solved?
- Will an LLM be able to solve Raven's Progressive Matrices from an image in 2025? (65% chance)
- Will LLMs mostly overcome the Reversal Curse by the end of 2025? (50% chance)