Evidentialism, conservatism, skepticism
An evil scientist might have built a brain in a vat that has all the experiences you currently have. On the basis of your experiences, you cannot rule out being that brain in a vat. But you can rule out being that scientist. In fact, being that scientist is not a skeptical scenario at all: if the scientist in question suspects that she is a scientist building a brain in a vat, her suspicion is correct, so it hardly constitutes a skeptical attitude.
So skeptical scenarios are centred possibilities. It should therefore be easy to construct cases in which an agent transitions from a non-skeptical scenario into a skeptical scenario. Such cases could serve as interesting test cases for evidentialism and epistemic conservatism, but I find it surprisingly hard to come up with a good example.
Here is the basic idea. Consider a possible agent who at present leads an ordinary life in a fairly ordinary world. Her senses are reliable, what she thinks are her friends are genuine friends, what she has not yet observed resembles the observed, and so on. But all that is about to change. Her friends will be replaced by actors, her senses will become unreliable, the samples she draws will be unrepresentative, and so on. By 2025, she ends up a brain in a vat. And of course she never gets any clue that these changes take place.
There are two relevant scenarios here: being the agent in 2015 (for short, H15) and being the agent in 2025 (H25). The latter is a radical skeptical scenario; the former is not.
Now suppose the evidence you will have in 2025 is compatible with H25, in the way in which our evidence is always compatible with radical skeptical scenarios. If you proportion your beliefs to your evidence -- and if skeptical scenarios deserve little credence in the absence of supporting evidence -- you should then assign very low probability to H25. For example, since you have no evidence to the contrary, you should be confident that what you take to be your friends in 2025 are real friends rather than actors or not real people at all.
Suppose further that your present evidence, in 2015, is compatible with H15, the hypothesis that you will become a brain in a vat by 2025. This is a less radical skeptical scenario than H25. It does not imply that your friends are not friends, or that your senses are unreliable. It is not all that different from the scientist's scenario above -- a scenario in which somebody else's senses are unreliable, except that here the "somebody else" is your (distant) future self. So the principle that skeptical scenarios deserve little credence does not apply to H15, or at any rate not as strongly as to H25.
For concreteness, let's say your present evidence makes it rational to give credence 0.1 to H15, while your evidence in 2025 only warrants credence 0.01 in H25. If you proportion your belief to the evidence, your credence in the hypothesis that you will be envatted by 2025 therefore goes down between now and 2025. On the other hand, it may well be that at no point do you learn anything that, by the lights of your previous credences, would render the hypothesis that you will be envatted by 2025 less probable.
To illustrate, suppose that tomorrow morning you have experiences as of rain outside your window. The scenario in which you are on the way to becoming a brain in a vat divides into many more specific possibilities, varying (among other things) in what you experience along the way. The same is true for the scenario in which you are not on the way to becoming a brain in a vat. Before going to sleep tonight, your credence that upon awakening you would have experiences as of rain conditional on the first type of scenario should arguably be the same as your credence conditional on the second type of scenario. So if you follow the conservative policy of changing your credence in a hypothesis only if you learn something that is relevant to that hypothesis by the lights of your previous credences, then your rain experience should not affect your credence in the hypothesis that you will be envatted by 2025. The same goes for every other experience you will have. For remember that the envatting scenario in question is one in which your experience never gives you any clues about what is happening.
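The conservative point can be checked with a toy Bayesian model. All numbers below are illustrative assumptions on my part: what matters is only that each experience is stipulated to be equally likely under the envatting and the non-envatting scenario, so repeated conditionalization never moves the credence.

```python
# Toy model: H = "you will be envatted by 2025". Each morning you have
# some experience E (e.g. as of rain). By stipulation of the case,
# P(E | H) = P(E | not-H) for every such E.

def conditionalize(prior_h, likelihood_h, likelihood_not_h):
    """Return P(H | E) from P(H) and the likelihoods P(E | H), P(E | not-H)."""
    joint_h = prior_h * likelihood_h
    joint_not_h = (1 - prior_h) * likelihood_not_h
    return joint_h / (joint_h + joint_not_h)

credence = 0.1  # assumed 2015 credence in H, as in the text
for day in range(3650):  # ten years of daily experiences
    # equal likelihoods: the experience gives no clue either way
    credence = conditionalize(credence, likelihood_h=0.4, likelihood_not_h=0.4)

print(credence)  # stays at (essentially) 0.1
```

The particular likelihood value (0.4) is irrelevant; whenever the two likelihoods are equal, the posterior equals the prior, which is why the conservative policy leaves the 0.1 untouched all the way to 2025.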
Assuming your present credence of 0.1 in the "envatted by 2025" hypothesis is rational, it follows that conservatism and evidentialism make different recommendations for your credence in 2025: conservatism says it should still be 0.1; evidentialism ("proportion your belief to your evidence") says it should be 0.01.
That's the basic idea. We could now explore which of these recommendations is more plausible. Unfortunately, it is hard to make sense of the case as presented so far.
The main problem should be obvious: in the absence of relevant evidence you should not give credence 0.1 to H15, the hypothesis that you will become envatted by 2025. Even if that hypothesis doesn't count as a radical skeptical scenario, it is sufficiently bizarre and specific to deserve negligible prior credence. So we're really talking about a difference between something like 1 x 10^-25 as the conservative recommendation and 1 x 10^-26 as the evidentialist recommendation, and it's hard to get any intuitive grasp of those quantities. Another problem is that your evidence arguably underdetermines the credence you should give to H15 (and H25 in 2025), so that we end up with overlapping set-valued recommendations.
We need to adjust the case so that "you" (no longer actual you) have some evidence in favour of H15 that raises its probability into non-negligible regions and ideally also makes the probability more determinate.
So let's say that in 2015 you know that a fair coin has been tossed; if it landed tails, you will slowly be turned into a brain in a vat (as in H15); if it landed heads, nothing special will happen. Now your credence in H15 should plausibly be 0.5. Conservatism says it should remain at 0.5 until at least 2025. But what does evidentialism say?
In 2025, you still remember what you once learned; you remember that ten years ago a fair coin was tossed and that if it landed tails then you are now a brain in a vat. If your evidence includes this information, then it arguably warrants giving credence 0.5 to the hypothesis that you are now a brain in a vat. Think about simpler cases: you have taken a drug that, with 50 percent probability, causes very realistic hallucinations of rain; now it seems to you as if it's raining. Knowing about the drug's power, how confident should you be that your visual experience is reliable? Not much more than 50 percent, I would say.
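For what it's worth, a quick Bayes calculation bears this out. The 30 percent base rate for genuine rain experiences below is my own illustrative assumption, as is the stipulation that the hallucination, if it occurs, is certain to be of rain:

```python
# A rough Bayes check on the drug case. The 30% base rate of genuine
# rain experiences is an illustrative assumption, not from the text.

def posterior_veridical(prior_no_halluc, p_rain_if_veridical, p_rain_if_halluc):
    """P(experience is veridical | rain experience), by Bayes' theorem."""
    num = prior_no_halluc * p_rain_if_veridical
    denom = num + (1 - prior_no_halluc) * p_rain_if_halluc
    return num / denom

# 50% chance the drug took effect; if it did, a rain experience was certain.
p = posterior_veridical(0.5, 0.3, 1.0)
print(round(p, 3))  # 0.231: at or below 0.5, as suggested above
```

More generally, with base rate r and hallucination likelihood h >= r, the posterior r/(r + h) is at most 0.5, which fits the verdict that you should be no more than about 50 percent confident in your senses.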
The Principal Principle here seems to trump the norm that skeptical scenarios should get low probability -- that is, that one should trust one's senses, that one should expect the unobserved to resemble the observed, and so on. (A strange and interesting fact, if true. But let's move on.)
We could say that in the chancy vat scenario your evidence includes not the information about the coin itself but only your apparent memories of that information, but I don't think this would help much. We could also change the case so that you lose your memories of the coin. But I'm not sure how that would work, and anyway I want a case without memory loss, since otherwise the application of conservatism becomes unclear.
The best I can come up with is to say that in 2015 you hear rumours of H15 -- rumours that do not strike you as very credible but that do raise the probability of H15 from around 10^-25 to around 0.1. As above, conservatism then says your credence in H25 should still be 0.1 in 2025. And I think evidentialism should recommend a significantly lower credence. After all, the weak and defeasible norm to take rumours seriously shouldn't trump the norm to trust one's senses!
However, for some reason (that I don't understand) this verdict doesn't look all that convincing when applied to your situation in 2025. At that point you still remember the somewhat credible rumours from 2015; are you allowed to dismiss them merely because they would entail bad things about your present epistemic situation?