
Clearly some LLM app has added logic like this:

```
if (query.IsHallucinated()) {
    notifyHumanOfHallucination();
}
```

This one line will get them that unicorn valuation.



I think LLMs hallucinate by design. I'm not sure we'll ever get to 0% hallucinations, and we should be okay with that (at least for the coming years). So an alert on every hallucination becomes less interesting. What's more interesting is knowing the rate at which it happens, and tracking whether that rate increases or decreases over time or across model changes.
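A minimal sketch of that kind of tracking, assuming you already have eval results labeled as hallucinated or not (the `(model_version, is_hallucinated)` pairs and version names here are hypothetical):

```python
from collections import Counter

def hallucination_rate(evals):
    """Compute the fraction of eval results flagged as hallucinated,
    grouped by model version.

    `evals` is a hypothetical list of (model_version, is_hallucinated) pairs.
    """
    totals = Counter()
    flagged = Counter()
    for version, is_hallucinated in evals:
        totals[version] += 1
        if is_hallucinated:
            flagged[version] += 1
    return {version: flagged[version] / totals[version] for version in totals}

# Toy data: compare the rate across two model versions to see the trend.
results = [
    ("v1", True), ("v1", False), ("v1", False), ("v1", False),
    ("v2", True), ("v2", False), ("v2", False), ("v2", False),
    ("v2", False), ("v2", False), ("v2", False), ("v2", False),
]
print(hallucination_rate(results))  # v1: 0.25, v2: 0.125 -- rate went down
```

The hard part, of course, is producing the labels in the first place; the bookkeeping once you have them is trivial.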



