Are people going to start throwing around the term "AGI" now that "AI" has become diluted? Eventually we are going to have to start using "RAGI" to indicate that we are talking about real artificial general intelligence.
We've diluted the term AI before. Eventually the hype will wear off and we'll just call them LLMs, the same way previous waves of machine learning and expert systems lost the label.
Vending machines used to be called robots. Then they stopped seeming magical.
A "real" AGI is a dream ML researchers have been chasing their whole careers. Top ML engineers, earning six figures, sometimes claim to have achieved it. People all over the world stand to benefit once a practical AGI reaches them.
It's become increasingly clear that they're two different concepts, so we need two different names.
And nobody involved with currently commercialized projects is going to stop using the term AI, so a new term was needed. AGI seems as good as any other -- do you object?
I see no reason we'd need a third term as you suggest, unless there's some new gigantic breakthrough that is miles beyond our current conception of AI but still falls short of AGI.
> It's become increasingly clear that they're two different concepts so we need two different names.
Here's a suggestion: stop calling LLMs "AI". Yes, I know, the shareholders will hate it. But then, you're not building toward any expectation of intelligent behavior. The fact that we have to qualify the existence of intelligence with a different acronym says it all: people are disappointed with what we have. AI simply isn't enough; we need it to be generalized before we get reliable results!
So... yeah, I do object. Users won't object, because they're hungry for a better experience, and developers won't, because they need every excuse they can get to charge recurring service revenue. Suspicious onlookers like me and the parent are the only ones left questioning the whole thing.
This says the mission is creating AGI, i.e. that's the primary goal/purpose. It doesn't mean it's something they think they've already achieved, just what they're working toward. There are actually some really good blog posts by Altman that dive much deeper into this.
Fair enough, if that's what they believe. Personally, I find AGI a bit unrealistic, and mentioning it serves mainly to create hype. It feels like something Musk would say to give people a vague futurist faith in their tech that won't actually materialize this century, if ever.