
It feels like OP is not using LLM tools correctly. Yes, they hallucinate, but I've found they rarely do on the first run. It's only when you insist that they start hallucinating.

