Swift is correct that for any process governed by quantum mechanics (including radioactive decay), we are only able to give a statistical description of what will happen. That's not just a limit on how precise our measuring devices can be made, or an acknowledgment that our initial information is imperfect; it's a fundamental limitation on how much can be known even in principle about a system. From what we can tell, such processes really are fundamentally random. As far as quantum theory is concerned, if two systems are prepared in the same state, they are identical, and there is no reason why one of them gives one result when measured and the other gives a different result, rather than the other way around.
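To make that concrete, here's a toy Monte Carlo sketch of my own (not something from the original question): every "system" below is prepared in the exact same qubit state, and quantum mechanics still predicts only the probability of each outcome, not which outcome any individual system will give.

```python
import math
import random

# Every system is prepared in the SAME state
#   |psi> = cos(theta)|0> + sin(theta)|1>,
# yet individual measurements in the {|0>, |1>} basis come out randomly,
# with Born-rule probability P(0) = cos^2(theta). The theory predicts the
# distribution; it says nothing about any single trial.
theta = math.pi / 3          # arbitrary preparation angle (my choice)
p0 = math.cos(theta) ** 2    # Born-rule probability of outcome 0 (= 0.25 here)

random.seed(42)
outcomes = [0 if random.random() < p0 else 1 for _ in range(100_000)]

freq0 = outcomes.count(0) / len(outcomes)
print(f"P(0) predicted: {p0:.3f}, observed frequency: {freq0:.3f}")
print("first ten outcomes:", outcomes[:10])  # identical preparations, varying results
```

Of course, the random number generator here is just a stand-in for the Born rule; the point is only that identically prepared systems yield different individual outcomes while the overall frequencies match the prediction.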
However, it is also true that someone might someday come up with a replacement for quantum mechanics that is deterministic. Such a model is said to have "hidden variables": the idea is that even if two systems are in the same state, there are some additional values (which we do not know how to measure, and which are probably not measurable even in principle) that further describe each system. In such a model, the reason two systems in the same state evolve differently is that they aren't actually identical: there are extra parameters that further define them, and if you knew what those parameters were, you could predict which one would go which way. It's worth noting that, in order to agree with observations, any such mechanism underlying quantum theory has to be nonlocal (this is the content of Bell's theorem). That is, the hidden variables describing the state of a system have to be able to change in response to events or measurements that take place arbitrarily far away, without regard for the speed-of-light limit on communication. Models like this have been constructed, mostly as an interesting exercise, since they don't give us any additional ability to predict the results of measurements.
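The nonlocality requirement can be illustrated with the CHSH form of Bell's argument. The sketch below is my own illustration, with arbitrary measurement settings: in any *local* hidden-variable model, each side's outcome is a deterministic ±1 function of its own setting and the shared hidden value alone, which forces the CHSH quantity S to satisfy |S| ≤ 2; the quantum singlet-state correlation E(a, b) = −cos(a − b), by contrast, gives |S| = 2√2, and experiments agree with the quantum value.

```python
import math
import random

def A(setting, lam):
    """Alice's deterministic +/-1 outcome: depends only on HER setting and lam."""
    return 1 if math.cos(setting - lam) >= 0 else -1

def B(setting, lam):
    """Bob's deterministic +/-1 outcome, perfectly anticorrelated at equal settings."""
    return -A(setting, lam)

a, ap = 0.0, math.pi / 2            # Alice's two settings (arbitrary choices)
b, bp = math.pi / 4, -math.pi / 4   # Bob's two settings

random.seed(0)
hidden = [random.uniform(0, 2 * math.pi) for _ in range(50_000)]

# For each hidden value the CHSH combination is exactly +2 or -2, so its
# average -- the measurable S -- can never leave the interval [-2, 2].
chsh_vals = [A(a, l) * B(b, l) + A(a, l) * B(bp, l)
             + A(ap, l) * B(b, l) - A(ap, l) * B(bp, l)
             for l in hidden]
S_lhv = sum(chsh_vals) / len(chsh_vals)

E = lambda x, y: -math.cos(x - y)   # quantum singlet correlation
S_qm = E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp)

print(f"local hidden-variable S = {S_lhv:+.3f}  (|S| <= 2 for any such model)")
print(f"quantum prediction    S = {S_qm:+.3f}  (= -2*sqrt(2))")
```

The particular outcome functions above are just one arbitrary local model; the bound |S| ≤ 2 holds for every local model, which is why any hidden-variable theory that matches the quantum predictions must let one side's outcome depend on the other side's distant setting.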
Conserve energy. Commute with the Hamiltonian.