Norms of rationality
My officemate Jens-Christian, my flatmate Weng Hong and his officemate Aidan have started a blog on bunnies, probability, possibility and rationality. There are already a couple of good posts by Weng Hong.
We had a little chat about the normativity of rationality today. Unlike with moral norms, I cannot imagine people who vastly disagree with me on the norms of rationality and who actually act upon their different norms. Can you imagine people who usually infer "~P" from "P and Q", update their beliefs by counter-conditionalizing P'(H) = 1 - P(H|E), and always try to minimize their expected utility? I can't. By contrast, I find it easy to imagine people who value torturing innocent people and do so. This indicates that so-called norms of rationality are to a large part not real norms at all, but conceptual necessities.

So is the "ought" or "must" in "if you believe P and Q, you must/ought to also believe P" like the "must" in "if it is true that P and Q, then it must also be true that P"? I think it's more like the "ought" in "if you go 'File' -> 'Save', the program ought to save the current document". Software can be buggy and fail to do what it's supposed to do according to its design specification. It is inconceivable that a word processor generally doesn't do any of the things that characterize a word processor. But it is conceivable that it fails occasionally and under specific conditions.

(Then perhaps what Dutch book arguments try to show is that if you don't obey the probability axioms, you do something -- viz. give different evaluations to the same states of affairs -- which, if you did the same thing on a large scale, would rob you of your status as an agent with beliefs and desires.)
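To make the Dutch book point concrete, here is a toy illustration (the numbers and propositions are made up for the example): an agent whose credences violate additivity -- say P(H) = 0.6 and P(~H) = 0.6 -- regards a one-unit bet on each proposition, priced at its credence, as fair. A bookie who sells both bets guarantees the agent a loss in every possible world.

```python
# Hypothetical credences violating additivity: P(H) + P(~H) = 1.2 > 1.
credences = {"H": 0.6, "~H": 0.6}

# The agent treats a bet paying 1 unit if X is true, priced at its
# credence in X, as fair, so it buys one such bet on each proposition.
total_price = sum(credences.values())  # 1.2

# Exactly one of H, ~H holds in any world, so total payout is 1 unit.
payout = 1.0

# Net result for the agent in each world: a sure loss of 0.2.
net = {world: round(payout - total_price, 10) for world in credences}
print(net)  # the agent loses 0.2 units whichever world obtains
```

The guaranteed loss doesn't depend on which world turns out actual, which is what makes the book "Dutch": the incoherent agent evaluates the same state of affairs differently under different descriptions.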