I worry that people don’t adequately separate two kinds of caution. Call them local caution and global caution. Suppose some new spacecraft is about to be launched. A hundred experts have evaluated it and determined that it’s safe. But some low-ranking engineer at NASA, who happens to have some personal familiarity with the components involved, looks at the schematics and just has a really bad feeling. It’s not that there’s any specific glaring flaw. It’s not any of the known problems that have ever led to spacecraft failure before. It’s just that a lot of the parts weren’t quite designed to go together in exactly that way, and that, without being able to entirely explain his reasoning, he would not be the least bit surprised if that spacecraft exploded.
What is the cautious thing to do? The locally cautious response is for the engineer to accept that a hundred experts probably know better than he does. To cautiously remind himself that it’s unlikely he would discover a new spacecraft failure mode unlike any seen before. To cautiously admit that grounding a spacecraft on an intuition would be crazy. But the globally cautious response is to run screaming into the NASA director’s office, demanding that the launch be stopped immediately until there can be a full review of everything. There’s a sense in which this is rash and ignores all sorts of generally wise and time-tested heuristics like the ones above. But if by “caution” you mean you want as few astronauts as possible to end up as smithereens, it’s the way to go.