We will not be motivated to assemble an alternate simulation until there is too much to be explained away. The strategy makes sense. The problem is that we lose track of how much contrary evidence we have already explained away, so the usual alarms do not go off. This has also been called the garden path fallacy: you take one step that seems very straightforward, then another, and each step makes so much sense that you do not notice how far you have strayed from the main road.
I think this is what the knowledge shields mentioned in Accelerated Expertise refer to: the tendency to stick to a theory and explain away conflicting evidence. One or two coincidences may be feasible to accommodate, but it is easy to keep piling them on until the entire theory becomes very unlikely.