Hacker News

> The Chinese Room is a good example if you want deeper thought into the distinction between acting and being.

It's not a deeper thought at all. It's an appeal to the emotions of humans who want to believe that what we do in our heads somehow produces "understanding", while the procedural steps inside a computer don't. To my mind, it gives no convincing reason for this assertion. You can't prove that what we're doing in our heads isn't functionally equivalent to what would go on inside some future AGI. There's no such test, just hubris.

As far as the Turing test goes, if you can't tell the difference between a computer and a human, what is the point of arguing that humans still hold some claim to being the only one in the fight that actually "understands"?



The Chinese Room is not saying humans are smart. It is showing the difference between knowing and doing.

Understanding a language is distinct from responding to language prompts.

Unless you are claiming an LLM is AGI, which is a laughable proposition, the fact that people say an LLM isn't intelligent has no bearing on whether an AGI would be intelligent.

There isn't disagreement (outside of extremists) that AGI doesn't exist yet. The only disagreement is "how will we know when it does?"

Not having a test isn't hubris. It shows instead that this is a fundamentally hard question: what does intelligence mean?



