The argument is that LLMs don't have any motivations or goals, unless a human prompts them. They're not trying to stay alive or reproduce. They don't get hungry, feel pain or loneliness. They're just complex tools.
I know. The implication of that argument is that the motivations or goals of humans differ from those prompts in some way that means GPT is not "really" intelligent.
That's my point. It elevates some arbitrary human trait to special status when it doesn't deserve it. Human goals are set by an external process too: evolution. And they aren't even intrinsically fixed; they can be modified with drugs.
I think the strongest demonstration of that is post-orgasm clarity (if you're a man, anyway). Your whole motivation changes in an instant.
A Tamagotchi can have these things programmed in; the only reason LLMs don't is that we didn't code them to, or train them in an environment where they maximize those things.
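To make the Tamagotchi point concrete, here's a minimal, entirely hypothetical sketch: a "drive" like hunger is just a programmed state variable plus a rule that picks actions to reduce it. Nothing here is mysterious or special; the class name and thresholds are made up for illustration.

```python
# Toy Tamagotchi-style agent: its "motivation" is a coded variable,
# not anything intrinsic. The threshold of 3 is an arbitrary choice.
class ToyPet:
    def __init__(self):
        self.hunger = 0  # the built-in "drive" state

    def tick(self):
        self.hunger += 1  # hunger grows over time, like a biological drive

    def choose_action(self):
        # The agent "wants" food only because this rule says so.
        return "seek_food" if self.hunger > 3 else "idle"

    def eat(self):
        self.hunger = 0  # satisfying the drive resets it

pet = ToyPet()
for _ in range(5):
    pet.tick()
print(pet.choose_action())  # prints "seek_food" once hunger exceeds 3
```

The same idea scales up: an RL training loop that rewards keeping `hunger` low would produce a policy that behaves as if it "cares" about eating, with no human prompting it at inference time.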