> Am I the only one who feels that Claude Code is what they would have imagined basic AGI to be like 10 years ago?
That wouldn't have occurred to me, to be honest. To me, AGI is Data from Star Trek. Or at the very least, Arnold Schwarzenegger's character from The Terminator.
I'm not sure that I'd make sentience a hard requirement for AGI, but my general mental picture of AGI does include sentience.
Claude Code is amazing, but I would never mistake it for AGI.
I would categorize sentient AGI as artificial consciousness[1], but I don't see an obvious reason AGI inherently must be conscious or sentient. (In terms of near-term economic value, non-sentient AGI seems like a more useful invention.)
For me, AGI is an AI that I could assign an arbitrarily complex project, and given sufficient compute and permissions, it would succeed at the task as reliably as a competent C-suite human executive. For example, it could accept and execute on instructions to acquire real estate that matches certain requirements, request approvals from the purchasing and legal departments as required, handle government communication and filings as required, construct a widget factory on the property using a fleet of robots, and operate the factory on an ongoing basis while ensuring reliable widget deliveries to distribution partners. Current agentic coding certainly feels like magic, but it's still not that.
"Consciousness" and "sentience" are terms mired in philosophical bullshit. We do not have an operational definition of either.
We have no agreement on what either term really means, and we definitely don't have a test that could be administered to conclusively confirm or rule out "consciousness" or "sentience" in something inhuman. We don't even know for sure if all humans are conscious.
What we really have are task-specific performance metrics. This generation of AIs already sits in the valley between "average human" and "human expert" on many tasks, and the performance of frontier systems keeps improving.
"Consciousness" seems pretty obvious. The ability to experience qualia. I do it, you do it, my dog does it. I suspect all mammals do it, and I suspect birds do too. There is no evidence any computer program does anything like it.
The definition of "featherless biped" might have more practical merit, because you can at least check for feathers and count limbs touching the ground in a mostly reliable fashion.
We have no way to "check for qualia" at all. For all we know, the ECU in a 2002 Toyota Hilux has it, but 10% of all humans don't.