A Few Rules | Collaborative Fund

Source: A Few Rules | Collaborative Fund, by Morgan Housel

A list of possible wisdom.

The person who tells the most compelling story wins. Not the best idea. Just the story that catches people’s attention and gets them to nod their heads.

Something can be factually true but contextually nonsense. Bad ideas often have at least some seed of truth that gives their followers confidence.

Tell people what they want to hear and you can be wrong indefinitely without penalty.

Woodrow Wilson said government “is accountable to Darwin, not to Newton.” It’s a useful idea. Everything is accountable to one of the two, and you have to know whether something adapts and changes over time or perpetually stays the same.

Behavior is hard to fix. When people say they’ve learned their lesson they underestimate how much of their previous mistake was caused by emotions that will return when faced with the same circumstances.

“Logic is an invention of man and may be ignored by the universe,” historian Will Durant says. That’s why forecasting is hard.

Being good at something doesn’t promise rewards. It doesn’t even promise a compliment. What’s rewarded in the world is scarcity, so what matters is what you can do that other people are bad at.

The world is governed by probability, but people think in black and white, right or wrong – did it happen or did it not? – because it’s easier.

Henry Luce said, “Show me a man who thinks he’s objective and I’ll show you a man who’s deceiving himself.” People see what they want to see, hear what they want to hear, and view the world through the lens of their own unique life experiences.

People learn when they’re surprised. Not when they read the right answer, or are told they’re doing it wrong, but when their jaw hits the floor.

Most fields have only a few laws. Lots of theories, hunches, observations, ideas, trends, and rules. But laws – things that are always true, all the time – are rare.

The only thing worse than thinking everyone who disagrees with you is wrong is the opposite: being persuaded by the advice of those who need or want something you don’t.

Simple explanations are appealing even when they’re wrong. “It’s complicated” isn’t persuasive even when it’s right.

Self-interest is the most powerful force in the world. Which can be great, because situations where everyone’s interests align are unstoppable; bad because people’s willingness to benefit themselves at the expense of others is so seductive.

History is deep. Almost everything has been done before. The characters and scenes change, but the behaviors and outcomes rarely do. “Everything feels unprecedented when you haven’t engaged with history.”

Don’t expect balance from very talented people. People who are exceptionally good at one thing tend to be exceptionally bad at another, due to overconfidence and mental bandwidth taken up by the exceptional skill. Skills also have two sides: No one should be shocked when people who think about the world in unique ways you like also think about the world in unique ways you don’t like.

Progress happens too slowly to notice, setbacks happen too fast to ignore. There are lots of overnight tragedies, but no overnight miracles. Growth is driven by compounding, which always takes time. Destruction is driven by single points of failure, which can happen in seconds, and loss of confidence, which can happen in an instant.

It is way easier to spot other people’s mistakes than your own. We judge others based solely on their actions, but when judging ourselves we have an internal dialogue that justifies our mistakes and bad decisions.

Reputations have momentum in both directions, because people want to associate with winners and avoid losers.

History is driven by surprising events, forecasting is driven by predictable ones. It’s not an easy problem to solve.

Orthodox Privilege, by Paul Graham

Source: Orthodox Privilege, by Paul Graham

There has been a lot of talk about privilege lately. Although the concept is overused, there is something to it, and in particular to the idea that privilege makes you blind — that you can’t see things that are visible to someone whose life is very different from yours.

But one of the most pervasive examples of this kind of blindness is one that I haven’t seen mentioned explicitly. I’m going to call it orthodox privilege: The more conventional-minded someone is, the more it seems to them that it’s safe for everyone to express their opinions.

It’s safe for them to express their opinions, because the source of their opinions is whatever it’s currently acceptable to believe. So it seems to them that it must be safe for everyone. They literally can’t imagine a true statement that would get them in trouble.

And yet at every point in history, there were true things that would get you in terrible trouble to say. Is ours the first where this isn’t so? What an amazing coincidence that would be.

It doesn’t seem to conventional-minded people that they’re conventional-minded. It just seems to them that they’re right. Indeed, they tend to be particularly sure of it.

If you believe there’s nothing true that you can’t say, then anyone who gets in trouble for something they say must deserve it.

How To Understand Things, by Nabeel Qureshi

Source: How To Understand Things, by Nabeel Qureshi

What we call ‘intelligence’ is as much about virtues such as honesty, integrity, and bravery, as it is about ‘raw intellect.’

Intelligent people simply aren’t willing to accept answers that they don’t understand — no matter how many other people try to convince them of it, or how many other people believe it, if they aren’t able to convince themselves of it, they won’t accept it.

One component of it is energy: thinking hard takes effort, and it’s much easier to just stop at an answer that seems to make sense, than to pursue everything that you don’t quite get down an endless, and rapidly proliferating, series of rabbit holes. … But it’s not just energy. You have to be able to motivate yourself to spend large quantities of energy on a problem, which means on some level that not understanding something — or having a bug in your thinking — bothers you a lot. You have the drive, the will to know.

Related to this is honesty, or integrity: a sort of compulsive unwillingness, or inability, to lie to yourself.

Another quality I have noticed in very intelligent people is being unafraid to look stupid. … Most people are not willing to do this — looking stupid takes courage, and sometimes it’s easier to just let things slide.

The best thing I have read on really understanding things is the Sequences, especially the section on Noticing Confusion.

understanding is not a binary “yes/no”. It has layers of depth.

Conflict vs. mistake in non-zero-sum games | LessWrong 2.0

Source: Conflict vs. mistake in non-zero-sum games | LessWrong 2.0, by Nisan

Summary: Whether you behave like a mistake theorist or a conflict theorist may depend more on your negotiating position in a non-zero-sum game than on your worldview.

Plot the payoffs in a non-zero-sum two-player game, and you’ll get a set with the Pareto frontier on the top and right. You can describe this set with two parameters: The surplus is how close the outcome is to the Pareto frontier, and the allocation tells you how much the outcome favors player 1 versus player 2.
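To make the Pareto frontier concrete, here is a minimal sketch in Python (my own toy example with made-up payoffs, not from the post): it enumerates the joint outcomes of a small non-zero-sum two-player game and keeps only the ones no other outcome dominates.

```python
def pareto_frontier(outcomes):
    """Return the outcomes not dominated by any other outcome.

    An outcome p = (payoff to player 1, payoff to player 2) is dominated
    if some other outcome q is at least as good for both players and
    strictly better for at least one.
    """
    frontier = []
    for p in outcomes:
        dominated = any(
            q != p
            and q[0] >= p[0]
            and q[1] >= p[1]
            and (q[0] > p[0] or q[1] > p[1])
            for q in outcomes
        )
        if not dominated:
            frontier.append(p)
    return frontier

# Payoff pairs (player 1, player 2) for four joint strategies in a toy game:
outcomes = [(3, 3), (5, 0), (0, 5), (1, 1)]
print(pareto_frontier(outcomes))  # (1, 1) drops out: (3, 3) dominates it
```

In this example the frontier is the set {(3, 3), (5, 0), (0, 5)}: moving toward it raises the surplus, while the choice among points on it is the allocation question.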

It’s tempting to decompose the game into two phases: A cooperative phase, where the players coordinate to maximize surplus; and a competitive phase, where the players negotiate how the surplus is allocated.

Of course, in the usual formulation, both phases occur simultaneously. But this suggests a couple of negotiation strategies where you try to make one phase happen before the other:

  1. “Let’s agree to maximize surplus. Once we agree to that, we can talk about allocation.”
  2. “Let’s agree on an allocation. Once we do that, we can talk about maximizing surplus.”

I’m going to provocatively call the first strategy mistake theory, and the second conflict theory.

Now I don’t have a good model of negotiation. But intuitively, it seems that mistake theory is a good strategy if you think you’ll be in a better negotiating position once you move to the Pareto frontier. And conflict theory is a good strategy if you think you’ll be in a worse negotiating position at the Pareto frontier.

If you’re naturally a mistake theorist, this might make conflict theory seem more appealing. Imagine negotiating with a paperclip maximizer over the fate of billions of lives. Mutual cooperation is Pareto efficient, but unappealing. It’s more sensible to threaten defection in order to save a few more human lives, if you can get away with it.

It also makes mistake theory seem unsavory: Apparently mistake theory is about postponing the allocation negotiation until you’re in a comfortable negotiating position. (Or, somewhat better: It’s about tricking the other players into cooperating before they can extract concessions from you.)

This is kind of unfair to mistake theory, which is supposed to be about educating decision-makers on efficient policies and building institutions to enable cooperation. None of that is present in this model.

But I think it describes something important about mistake theory which is usually rounded off to something like “[mistake theorists have] become part of a class that’s more interested in protecting its own privileges than in helping the poor or working for the good of all”.

It’s Not Enough to Be Right. You Also Have to Be Kind. | Forge (Medium)

Source: It’s Not Enough to Be Right. You Also Have to Be Kind. | Forge (Medium), by Ryan Holiday

the central conceit of a dangerous assumption we seem to have made as a culture these days: that being right is a license to be a total, unrepentant asshole.

I thought if I was just overwhelmingly right enough, I could get people to listen. … Yet, no amount of yelling or condescension or trolling is going to fix any of this. It never has and never will.

it’s so much easier to be certain and clever than it is to be nuanced and nice. … But putting yourself in their shoes, kindly nudging them to where they need to be, understanding that they have emotional and irrational beliefs just like you have emotional and irrational beliefs—that’s all much harder. So is not writing off other people.