How to change one's mind
Suppose beliefs locate us in centered logical space: to believe something is to rule out not only ways a universe might be, but ways things might be for an individual at a time. Then there will be two kinds of rational belief change: we can learn something new about our present situation, and we can change our situation and adjust our beliefs to this change. The rule for changes of the first kind is conditionalization. The rule for changes of the second kind doesn't have an official name yet, as far as I know. (In the AGM/KM framework, it is called "update", but we Bayesians often use "update" for conditioning.) In practice, the two rules always go hand in hand: you never learn something new without changing your situation, and you hardly ever change your situation without learning anything new.
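For the first rule at least, the formal core is familiar, and a toy sketch may help fix ideas: conditionalization just renormalizes the prior on the worlds compatible with the evidence. The dictionary representation of (centered) worlds below is my own illustrative assumption, not anything from the paper.

```python
def conditionalize(prior, evidence):
    """Renormalize a prior on the worlds compatible with the evidence.

    prior: dict mapping world labels to probabilities.
    evidence: the set of worlds the evidence leaves open.
    """
    total = sum(p for w, p in prior.items() if w in evidence)
    if total == 0:
        raise ValueError("evidence has prior probability zero")
    return {w: p / total for w, p in prior.items() if w in evidence}

# Four centered worlds, uniform prior.
prior = {"w1": 0.25, "w2": 0.25, "w3": 0.25, "w4": 0.25}

# Learning E = {w1, w2} rules out w3 and w4 and renormalizes the rest.
posterior = conditionalize(prior, {"w1", "w2"})
# posterior == {"w1": 0.5, "w2": 0.5}
```

The second rule, for changes of situation, is exactly what this sketch leaves out: it would have to shift probability between centered worlds rather than merely rule some out.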
In this paper, I try to spell out the two rules, and their combination: Believing in afterlife: conditionalization in a changing world (PDF).
I'm a bit unhappy with some parts of the story, and I should probably say more about alternative accounts in the literature and why I don't like them. So hopefully there will be an update soon. In the meantime, comments are, as always, very welcome!
Hey wo,
According to you, T(v>w) = 1/|{u: vRu}| if vRw; and 0 otherwise. Is this meant to hold in general? Suppose we add a twist to your flies example. Suppose you know that God slightly prefers a world with an even number of flies to one with an odd number of flies. Then it seems, for example, that T(w3>w2) should be slightly greater than T(w3>w1). How would you deal with such a case?
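For concreteness, here is a toy sketch of the uniform rule you state, alongside a weighted variant that would seem to capture the twist. The accessibility relation `R` and the `weight` function are purely illustrative assumptions on my part.

```python
def T_uniform(R, v, w):
    """Your rule: T(v>w) = 1/|{u: vRu}| if vRw, else 0.

    R maps each world to the set of worlds accessible from it.
    """
    return 1 / len(R[v]) if w in R[v] else 0.0

def T_weighted(R, weight, v, w):
    """A variant where T(v>w) is proportional to weight(w) among accessible worlds."""
    if w not in R[v]:
        return 0.0
    return weight(w) / sum(weight(u) for u in R[v])

# Three fly-worlds: wn has n flies; suppose each is accessible from each.
R = {f"w{n}": {"w1", "w2", "w3"} for n in (1, 2, 3)}

# Under the uniform rule, T(w3>w2) == T(w3>w1) == 1/3.
assert T_uniform(R, "w3", "w2") == T_uniform(R, "w3", "w1")

# If God slightly prefers even fly-counts, give even worlds slightly more weight.
def weight(w):
    return 1.1 if int(w[1:]) % 2 == 0 else 1.0

# Now T(w3>w2) comes out slightly greater than T(w3>w1), as in the twist.
```

The question, then, is whether something like the weighted variant is what your account would recommend, or whether the uniform rule is meant to hold even here.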