
This is what really scares me about people using AI. It will confidently hallucinate studies and quotes that have no basis in reality, and even in your own field you won't know whether what it's saying is real without following up on every single assertion. But people are happy to completely buy its diagnoses of rare medical conditions based on what, exactly?


Give a single example using GPT-5 Thinking.


GPT-5 Thinking is one of the biggest offenders, and it's quite incredible to me that you think it doesn't hallucinate. It makes me strongly suspect your own judgment is impaired. Also, what do you mean by asking for "reproducible examples"? Is it somehow not a valid example if it only sometimes makes up citations?


Share any example where I can get it to make an obvious error.



