Effective altruism and existential risk reduction share a single point of failure: both depend on a functioning civilization. Risks to civilization therefore endanger effective altruism, existential risk reduction, and every other significant humanitarian cause.
When you take the long view, civilizational collapse happens all the time. In contrast, many existential risks are speculative or rare: either they have never happened before, like nanotechnology weapons, or they are extremely uncommon, like large asteroid strikes.
You invest in the things you value, but you also need to invest in the thing that lets you pursue those values in the first place: civilization.