
I wonder when we'll get an LLM that might be more "stupid" but knows what it doesn't know, rather than hallucinating... Though perhaps by then it won't be an LLM at all but an entirely different technology :)

