Unless the MCP server itself has an LLM call inside it (rare), the MCP server is pretty deterministic. It's the AI that invokes it that's actually nondeterministic, and the user is already dealing with that anyway.
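To make that distinction concrete, here's a minimal sketch, assuming the Python MCP SDK's FastMCP helper (the server name and the `add` tool are just illustrative): the tool handler is a pure function of its inputs, and the only nondeterminism sits in the model that decides whether and how to call it.

```python
# Minimal sketch, assuming the Python MCP SDK's FastMCP helper.
# Server name and tool are illustrative, not from any real project.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers.

    Pure function of its arguments: same inputs, same output, every time.
    The nondeterministic part is the LLM deciding to call this tool and
    with which arguments, not the tool itself.
    """
    return a + b

if __name__ == "__main__":
    # Serve the tool; the handler itself stays deterministic.
    mcp.run()
```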
In the real world, where it is intractable (given the current state of programming language tooling, and physics itself) to prove all eventualities and the absence of side effects in executed code, determinism is indeed a spectrum.
If we want to be specific here, I would say "pretty deterministic" amounts to "as deterministic as your typical non-LLM REST API call", which still covers a wide range of the spectrum.