Could absolutely be that. Or it's so smart that it realizes the author believes they have given enough information and that it shouldn't have to settle for a low-probability guess, so that pattern is the only one that makes sense in that case.
What the author provided is not necessarily the same as what the software forwarded to the model, especially if some sort of "recall" feature is being used.
Probably unlikely it's that smart, though.