Incorrect. LLMs are deterministic when you run them that way: at temperature=0, sampling collapses to picking the highest-probability token, so the same input always yields the same output. They are non-deterministic only if you choose to make them so, which is no different from GCC. You could add all kinds of random conditionals if you had some reason to want a non-deterministic compiler. You never would, but you could.
There are known flaws in real-world GPUs that can break that assumption in practice, but in theory, given working, deterministic hardware, LLMs are absolutely deterministic. GCC likewise stops being deterministic when the hardware breaks down: a single cosmic-ray bit flip is all it takes to defy your assertion.
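To make the temperature=0 point concrete, here is a minimal sketch of token sampling (names and logits are illustrative, not any real model's API): at temperature=0 the draw degenerates to argmax over the logits, so the RNG never enters the picture.

```python
import numpy as np

def sample_token(logits, temperature, rng):
    """Pick the next token index from raw logits.

    temperature == 0 means greedy decoding: pure argmax,
    deterministic regardless of RNG state.
    """
    if temperature == 0:
        return int(np.argmax(logits))
    # temperature > 0: softmax with scaling, then a random draw.
    scaled = logits / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return int(rng.choice(len(logits), p=probs))

logits = np.array([1.0, 3.5, 2.0, 0.5])

# Greedy decoding: every seed produces the same token.
greedy = {sample_token(logits, 0, np.random.default_rng(s)) for s in range(10)}
print(greedy)  # {1}

# Sampling: different seeds can produce different tokens.
sampled = {sample_token(logits, 1.5, np.random.default_rng(s)) for s in range(10)}
print(len(sampled) >= 1)
```

The non-determinism lives entirely in the `temperature > 0` branch, which is a configuration choice, exactly like compiling randomness into GCC would be.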