> In my mind, hallucination is when some aspect of the model's response should be consistent with reality

By "reality", do you mean the training corpus? Because otherwise, this seems like a strange standard. Models don't have access to "reality".

> Models don't have access to "reality"

This is an explanation of why models "hallucinate", not a criticism of the provided definition of hallucination.


That's a poor definition, then. It says a model is "hallucinating" when its output doesn't match a reference point it can't possibly have accurate information about. How is that a "hallucination" in any meaningful sense?


