> can also lie through its nonexistent teeth about it
Ironically, it seems to me that you are anthropomorphizing ChatGPT a bit too much here. It has no reason to lie, so I think it's more likely that it simply doesn't know such a game exists. It probably came up with it independently, or doesn't have a strong memory of it. In some respects, it would be even more impressive if it were actually "lying through its teeth", because that would imply the AI had some kind of hidden agenda.
Similarly, I don't think it makes sense to say it "knows" anything at all. I would be more comfortable saying Wolfram Alpha knows things than saying an LLM does, but I'm not comfortable with either.
I'm not sure I'm comfortable with "remembers" either. My gut says I'd be more comfortable using that word for a web cache, but given my understanding of human memory as constructive, maybe I should be more comfortable applying it to an LLM than to any other software.
ChatGPT does indeed know nothing at all. Proving this is quite easy: it was trained on text generation and can generate paragraphs quite well, so if you ask it to tell you about Harry Potter's family tree, it will do well.
However, it will fail immediately when you ask it to print an ASCII chart of Harry Potter's family tree, because it does not actually "know" anything, and it will make all sorts of odd connections.
The clearest observation I can make of ChatGPT's success is that the general public is quite ill-informed and easily impressed by theatrics, both lessons we've already learned from politics.
That seems less like a reasoning issue and more like an issue of building up an ASCII chart in a single pass. I doubt most humans would be able to accomplish that.
To demonstrate this a bit, I asked for HP's family tree as JSON, suitable for use in a charting library, and this is what it came up with:
Sorry about HN formatting, but you get the idea. This looks fairly accurate to me. What about this demonstrates less "reasoning" than turning it into an ASCII chart?
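To give a sense of the shape I mean (this is a hand-written illustration, not ChatGPT's literal output; the node names and field names are my own assumptions), the JSON a charting library would accept looks something like this, plus a few lines showing that turning it into an ASCII chart is just a walk over the same data:

    // Hypothetical nested family-tree structure, roughly what a charting
    // library would take. Names and fields are illustrative assumptions,
    // not ChatGPT's actual output.
    interface Person {
      name: string;
      children?: Person[];
    }

    const potterTree: Person = {
      name: "Fleamont Potter",
      children: [
        {
          name: "James Potter",
          children: [
            {
              name: "Harry Potter",
              children: [
                { name: "James Sirius Potter" },
                { name: "Albus Severus Potter" },
                { name: "Lily Luna Potter" },
              ],
            },
          ],
        },
      ],
    };

    // Rendering the same data as an "ASCII chart" is a mechanical
    // indentation pass, not additional reasoning about the facts.
    function render(node: Person, depth = 0): string {
      const line = "  ".repeat(depth) + "- " + node.name;
      const kids = (node.children ?? []).map(c => render(c, depth + 1));
      return [line, ...kids].join("\n");
    }

    console.log(render(potterTree));

If the model can produce the nested structure, the remaining step is layout, which is why I read the ASCII-chart failure as a formatting limitation rather than evidence that it "knows" nothing.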