Exploding desks and indistinguishable situations
I've thought a bit about belief updating recently. One thing I noticed is that it is often assumed in the literature (usually without argument) that if you know there are two situations in your world that are evidentially indistinguishable from your current situation, then you should give them roughly the same credence. Although I agree with some of the applications, the principle in general strikes me as very implausible. Here is a somewhat roundabout counterexample that has a few other interesting features as well.
I think my desk probably won't explode in the next few minutes. Now suppose the Experimental Philosophy Front tells me the following, and I have reason to believe them:
*) It's true that your desk won't explode, but the desk of your twin brother will. Until his desk explodes, we will make sure that your brother's credence that his desk will explode equals your credence that your desk will explode.
Should that make me change my mind on whether my desk will explode?
Well, there's "should" and "should". As far as epistemic rationality is concerned, it shouldn't: I was confident before that my desk won't explode, and what I've learned in the meantime only confirmed that belief. On the other hand, it might be practically beneficial if I could make myself have the irrational and false belief that my desk will explode, as that might save my brother's life. So I have a practical reason to believe that my desk will explode, but no epistemic reason.
Whether I have a practical reason to believe that my desk will explode perhaps depends on how exactly the Philosophy Front will see to it that my brother's credences are aligned with mine. Will they constantly monitor my credences and implant them in my brother's brain? In that case, my credences would causally influence his. Suppose that's not how they do it. They use a simpler method: they have figured out that my brother and I are deterministic reasoners of the same type, so that, starting with the same relevant assumptions and presented with the same evidence, we will reach the same conclusion. Hence all they need to do to align our credences with respect to the desk is to tell my brother (*) as well, and make sure he trusts them as much as I do. If that is what they do, it is no longer clear that I have a practical reason to make myself believe that my desk will explode: whatever I do has no influence on whether my brother will die at his exploding desk. Tricking myself into irrationality would provide evidence that my brother will survive, but it would have no causal influence on his fate. (This is of course Newcomb's problem.)
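The contrast between the two readings can be made vivid with a toy expected-utility table. This is only an illustrative sketch: the utilities and probabilities below are invented for the purpose, not anything stated in the case itself.

```python
# Toy decision table for the Newcomb-like structure above. The acts are
# "trick" (trick myself into believing my desk will explode) and "dont".
# If my brother comes to believe his desk will explode, he flees and
# survives; by stipulation his credence mirrors mine.
# Utilities (invented): my brother surviving is worth 100, and holding
# the irrational belief costs me 1.
U = {("trick", "survives"): 99, ("trick", "dies"): -1,
     ("dont", "survives"): 100, ("dont", "dies"): 0}

# Evidential reading: my brother reasons exactly as I do, so my act is
# perfect evidence about his belief and hence about his fate.
P_evidential = {"trick": {"survives": 1.0, "dies": 0.0},
                "dont":  {"survives": 0.0, "dies": 1.0}}

# Causal reading: my act has no causal influence on my brother, so the
# outcome probabilities are the same whichever act I choose (here I am
# fairly confident he won't be saved either way).
p_dies = 0.9
P_causal = {"trick": {"survives": 1 - p_dies, "dies": p_dies},
            "dont":  {"survives": 1 - p_dies, "dies": p_dies}}

def expected_utility(P):
    """Expected utility of each act under outcome distribution P."""
    return {act: sum(P[act][o] * U[(act, o)] for o in ("survives", "dies"))
            for act in ("trick", "dont")}

print(expected_utility(P_evidential))  # conditioning on the act favours "trick"
print(expected_utility(P_causal))      # holding outcomes fixed favours "dont"
```

As in Newcomb's problem, evidential reasoning recommends the act that is good news, causal reasoning the act that makes a difference.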
Does the precise way in which the Front makes our credences align also matter for my belief's epistemic rationality? It seems not. Whatever the Front tells me about how they make my brother's credences mirror mine, that is surely irrelevant to whether my desk will explode. In particular, suppose they decided that, just to be safe, they would not only present my brother with the same announcement (*), but also make his entire surroundings indistinguishable from mine: they replaced the flower pots in his office, the papers on his desk, etc., and they replaced his memories with mine. All this just to make sure that he will reach the same conclusions as I do with respect to the desk. Still, I should remain confident that my desk won't explode.
If that doesn't seem obvious, note that I still haven't learned anything that is evidence that my desk will explode. On the contrary, what I learned confirmed that it will not explode. Moreover, why should it matter that my brother and I agree about flower pots: what does this have to do with my desk? Suppose I know that we're in exactly the same evidential situation except perhaps with respect to the position of the flower pots. Should I then say: if the flower pot in my brother's office is exactly where mine is, then there's a good chance that my desk will explode? That seems crazy. It doesn't matter where our flower pots are, or whether we have different evidence about their position. Finally, the view that I should now be uncertain about my desk is inconsistent with my believing what the Front said, namely that my brother's desk will explode and mine won't. So on this view, I can't rationally believe what they said. But that's odd: why can't I? After all, they didn't say anything incoherent. (There's also the problem that if I don't believe them, I lose my reason for thinking that my desk might explode.)
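The first point here is just a one-line Bayes mixture. A minimal sketch with invented numbers: to whatever extent I trust the Front, hearing (*) can only lower my credence that my desk explodes.

```python
# Toy calculation: updating on the Front's announcement (*), which entails
# that my desk won't explode. Both numbers are invented for illustration.
p_explode_prior = 0.001   # prior credence that my desk explodes
p_trust = 0.95            # credence that the announcement (*) is true

# If (*) is true, my desk is safe; otherwise I fall back on my prior.
p_explode_post = p_trust * 0 + (1 - p_trust) * p_explode_prior

# The announcement is (weak) evidence against the explosion, not for it.
assert p_explode_post < p_explode_prior
```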
So here we have a case in which my situation is evidentially indistinguishable from another situation in the same world, and yet I should be certain that I am not in that other situation. And I take it that it makes no difference whether the other situation involves my twin brother or my future self. That is, if I learned that on some future day I will sit at my desk shortly before it explodes, that my credence in the desk exploding will then be aligned with my current credence, and that I will be made to have the same evidence I have now, I should likewise remain certain that my desk right now will not explode.
Is your or your brother's desk anywhere near where the picture below was taken? (And will you ever explain why you posted it in your blog?)
Couldn't you and your twin brother just go out shopping until one of the desks has exploded?