
But you don't submit that rough draft with 110% conviction that it's correct, which is what an LLM does when it hallucinates.

It won't say "I think it should look something like this, but I might be wrong"; it'll say "Simple! Here's how you do it."

Hence hallucination, not error: it thinks it's right.


