bigstrat2003 | 7 months ago | on: The Rise of Whatever
On the contrary, those things are quite predictable. Once you know those issues exist, you can reliably avoid them. But with LLMs you *can't* reliably avoid hallucinations. The unreliability is baked into the very nature of the tool.