Technical challenges with GenAI
- Reducing hallucinations and improving accuracy
- Making LLMs generate results faster and more cheaply
- Optimizing context length and context construction
- Training LLMs more efficiently
- Improving the quality of data
- Incorporating other data modalities
- Productionizing new model architectures
- Developing GPU alternatives
- Making agents usable
- Improving learning from human preferences
- Improving the UI/UX of GenAI applications
Hallucinations and Confabulations
Several issues related to model accuracy pose challenges for GenAI models. The most prominent is hallucination, or, to use the more linguistically accurate term, confabulation, though the former is now firmly established. Models confabulate, or hallucinate, by making up facts or statements that have no reasonable grounding in reality.
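One rough but common way to surface likely confabulations is to sample the model several times and check whether its answers agree with one another, in the spirit of self-consistency or SelfCheckGPT-style checks. The sketch below is a minimal illustration of that idea; `call_model` is a hypothetical stand-in for whatever LLM client is actually in use, and the agreement threshold is arbitrary.

```python
# Minimal sketch of a sampling-based consistency check for hallucinations.
# Assumption: `call_model` is a placeholder for a real LLM client call.

from collections import Counter


def call_model(prompt: str, temperature: float = 1.0) -> str:
    """Hypothetical LLM call; replace with your provider's client."""
    raise NotImplementedError("Plug in an actual model call here.")


def consistency_score(prompt: str, n_samples: int = 5) -> float:
    """Sample the model several times and measure how often the most
    common answer appears. Low agreement is a (rough) hallucination signal."""
    answers = [
        call_model(prompt, temperature=1.0).strip().lower()
        for _ in range(n_samples)
    ]
    most_common_count = Counter(answers).most_common(1)[0][1]
    return most_common_count / n_samples


# Usage: flag answers whose agreement falls below a chosen threshold.
# if consistency_score("When was the Eiffel Tower built?") < 0.6:
#     print("Low agreement across samples; treat the answer with caution.")
```

Checks like this only catch inconsistency, not confidently repeated errors, which is part of why hallucination remains an open problem rather than a solved one.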