> Yes, that's my fallback as well. If it receives zero instructions, will it take any action?
By design, no.
But, importantly, that's because the closest it has to an experience of time is an ongoing input of tokens. Humans constantly get new input, so for this to be a fair comparison, the LLM would also have to get constant new input.
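To make "constant new input" concrete, here's a rough sketch of what I have in mind: an outer loop that keeps appending fresh observations (here, just timestamps) to the model's context, so it's never sitting on zero tokens. The `generate` function is purely a placeholder, not any real library's API.

```python
import time
from datetime import datetime, timezone

def generate(context: list[str]) -> str:
    """Placeholder for a real LLM call; just reports how much input it has seen."""
    return f"(model output after {len(context)} inputs)"

def run_with_constant_input(ticks: int = 3, interval_s: float = 1.0) -> None:
    context: list[str] = []
    for _ in range(ticks):
        # The only "passage of time" the model ever experiences is this
        # steadily growing stream of new tokens.
        context.append(f"tick at {datetime.now(timezone.utc).isoformat()}")
        print(generate(context))
        time.sleep(interval_s)

if __name__ == "__main__":
    run_with_constant_input()
```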
Humans in solitary confinement become mentally ill (both immediately and over the long term), and hallucinate (at least in the short term; I don't know about the long term).