Believing without evidence is always morally wrong | Aeon

Source: Believing without evidence is always morally wrong | Aeon, by Francisco Mejia Uribe

[William Kingdon Clifford’s] once seemingly exaggerated claim that ‘it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence’ is no longer hyperbole but a technical reality.

In ‘The Ethics of Belief’ (1877), Clifford gives three arguments as to why we have a moral obligation to believe responsibly, that is, to believe only what we have sufficient evidence for, and what we have diligently investigated. His first argument starts with the simple observation that our beliefs influence our actions. … The most natural objection to this first argument is that while it might be true that some of our beliefs do lead to actions that can be devastating for others, in reality most of what we believe is probably inconsequential for our fellow humans. … I think critics had a point – had – but that is no longer so. In a world in which just about everyone’s beliefs are instantly shareable, at minimal cost, to a global audience, every single belief has the capacity to be truly consequential in the way Clifford imagined.

The second argument Clifford provides to back his claim that it is always wrong to believe on insufficient evidence is that poor practices of belief-formation turn us into careless, credulous believers. Clifford puts it nicely: ‘No real belief, however trifling and fragmentary it may seem, is ever truly insignificant; it prepares us to receive more of its like, confirms those which resembled it before, and weakens others; and so gradually it lays a stealthy train in our inmost thoughts, which may someday explode into overt action, and leave its stamp upon our character.’

Clifford’s third and final argument as to why believing without evidence is morally wrong is that, in our capacity as communicators of belief, we have the moral responsibility not to pollute the well of collective knowledge. … While Clifford’s final argument rings true, it again seems exaggerated to claim that every little false belief we harbour is a moral affront to common knowledge. Yet reality, once more, is aligning with Clifford, and his words seem prophetic. Today, we truly have a global reservoir of belief into which all of our commitments are being painstakingly added: it’s called Big Data. You don’t even need to be an active netizen posting on Twitter or ranting on Facebook: more and more of what we do in the real world is being recorded and digitised, and from there algorithms can easily infer what we believe before we even express a view. In turn, this enormous pool of stored belief is used by algorithms to make decisions for and about us. And it’s the same reservoir that search engines tap into when we seek answers to our questions and acquire new beliefs. Add the wrong ingredients into the Big Data recipe, and what you’ll get is a potentially toxic output. If there was ever a time when critical thinking was a moral imperative, and credulity a calamitous sin, it is now.

The Best Way to Predict the Future is to Create It. But Is It Already Too Late? | NIH, by Alan Kay

Source: The Best Way to Predict the Future is to Create It. But Is It Already Too Late?, by Alan Curtis Kay

To me, the future is not five years or ten years out. I think about the future as extending in front of us at least as far as the era that we live in, which I date back to roughly the invention of science in the early 17th century, so it is always worthwhile thinking a hundred or a few hundred years ahead.

If “the best way to predict the future is to invent it”, is it too late to invent a healthy future?

Children are the future we send to the future.

The best way to predict the future is to invent the children who will invent it.

The Political Instability Task Force estimated that, between 1956 and 2016, a total of 43 genocides took place, causing the deaths of about 50 million people.

When you start looking at this stuff you start thinking, “Oh, this is starting to look like normal behavior. Kind of like war.” This is not off-to-the-side behavior. But then the other thought that is interesting: if you look at world population, the United States has about 340 million people in it, and if you go back in time to when the entire world had only 340 million, which was about 1000 AD or so, what you see is that they had wars everywhere. The United States has not had an internal war for 150 years now, so a population the size of the entire medieval world has somehow stayed stable. And in fact the European Union has not had an internal war for 75 years, and that is about 600 million people.

The larger societies have damped down these impulses. But on the other hand, these are really unstable, I think.

World Population
Lists of wars by date | Wikipedia

If you look at what the National Institutes of Health’s National Institute of Mental Health classifies as mental illness, it is roughly whatever falls outside of the normal. There are things that are outside the normal. For instance, schizophrenia affects less than 1 percent of the population. Suicidal thoughts are more common. I went looking for other things outside the normal. For instance, conscientious objectors. Going back through all of the wars we ever had in the 20th century, World War One and World War Two and so forth, just about a quarter of one percent of the population has registered as a conscientious objector when drafted. So what is interesting is that 99.77 percent are not conscientious objectors. That doesn’t mean they aren’t going to shoot their gun in the air, because some people who didn’t register still did not want to kill anybody. But as far as expressing an opinion goes, it is completely normal to be willing to kill others if your culture says it is okay.

If we do not normalize, people here know very well that humans are basically delusional. We believe things, and we project those beliefs out onto the world. If we looked at the mapping between what we think is reality in here and what we think is out there, we would see a complete mismatch. I would call that a definition of not being sane. … But instead we don’t define [sanity] that way. We define insanity or mental illness simply as whatever lies outside of what most people actually do, no matter how crazy it is.

What if the normal human species is mentally ill or worse? I think this is provable.

List of cognitive biases | Wikipedia

[heavily paraphrased:] NIMH (the National Institute of Mental Health), from its website, has a purpose, vision, mission, topics… but it is not in the charge of NIMH to start looking at the “normal” human mental disorders of thinking and behavior that are so disastrous to civilization and life.

We sometimes think we’re not very imaginative. The problem is that we are, just about some odd ideas. 77 percent of Americans believe in angels. 94 percent believe in supreme beings. A nonzero percentage of Americans believe in sacrificing children to the gods. 50 percent believe in demons and ghosts. 21 percent still believe in witches. … A great book to read if you like this kind of thing is The Great Cat Massacre. … There was a genocide of cats when people in France decided that cats were the witches causing all manner of bad problems, and they wound up killing a million cats. The book explores how humans get these crazy ideas and then act as though they are real.

We have a really hard time with other kinds of things which our imaginations are not set up [for], like even things that are going to happen to us. We find it difficult to vividly imagine disasters ahead of time to take action to prevent them. We can be heroes in a real disaster. Why can’t we be a hero ahead of time?

The flood that we are ignoring, the thing we’re not building the dam for, is preparing the children for the next generation. We just complain about the state of the children when they come out of college.

The children basically, genetically, are aiming to learn the most important things that they will ever know from the culture around them. It is the form of that culture that is going to provide, for most children, the most lasting impressions.

If you look at the total number of people who could help teachers (retired STEM degree holders age 60-75), it is way more than the number of classrooms that need to be helped and yet almost none of these people are getting into the classrooms to help.

If you want to do something right now to make an enormous difference and perhaps create more children that can create the bigger changes, get out there and help the existing elementary school teachers. … I got these figures from NIH. Looks like there are over 9000 people in NIH that could be out there helping raise better children to think better than their parents do today.

There is no middle ground for deep disagreements about facts | Aeon

Source: There is no middle ground for deep disagreements about facts | Aeon, by Klemens Kappel

One particularly pernicious form of disagreement arises when we not only disagree about individual facts but also disagree about how best to form beliefs about those facts, that is, about how to gather and assess evidence in proper ways. This is deep disagreement, and it’s the form that most societal disagreements take.

Deep disagreements are, in a sense, irresolvable. It is not that Amy is incapable of following Ben’s arguments or is generally insensitive to evidence. Rather, Amy has a set of beliefs that insulates her from the very sort of evidence that would be crucial for showing her to be mistaken. No line of argument or reasoning that Ben could sincerely present to Amy would rationally convince her.

We are used to the idea that respectfully accommodating the views of fellow citizens, whose intelligence and sincerity is not in doubt, requires some degree of moderation on our part. We cannot, it seems, both fully respect others, regard them as intelligent and sincere, and still be fully convinced that we are right and they are completely wrong, unless we simply agree to disagree. But on a societal level we cannot do that, since ultimately some decision must be made.

What is particularly troubling about some societal disagreements is that they concern factual matters that tend to be almost impossible to resolve since there is no agreed-upon method to do so, all while relating to important policy decisions. Generally, theorising about liberal democracy has focused largely on moral and political disagreements, while tacitly assuming that there would be no important factual disagreements to consider. It has been taken for granted that we would eventually agree about the facts, and the democratic processes would concern how we should adjudicate our differences in values and preferences. But this assumption is no longer adequate, if it ever was.

The Universe Is Always Looking | The Atlantic

Source: The Universe Is Always Looking | The Atlantic, by Philip Ball

[Schrödinger’s] cat is still hauled out today as if to imply that we’re as puzzled as ever by the mere fact that the quantum world at small scales turns into the world of classical physics at human scales. The fact is, however, that this so-called quantum-classical transition is now largely understood. … quantum physics is not replaced by another sort of physics at large scales. It actually gives rise to classical physics.

At the root of the distinction, though, lies the fact that quantum objects have a wave nature—which is to say, the equation Schrödinger devised in 1926 to quantify their behavior tells us that they should be described as if they were waves, albeit waves of a peculiar, abstract sort that are indicative only of probabilities. It is this waviness that gives rise to distinctly quantum phenomena like interference, superposition, and entanglement. These behaviors become possible when there is a well-defined relationship between the quantum “waves”: in effect, when they are in step. This coordination is called “coherence.”

Macroscopic, classical objects don’t display quantum interference or exist in superpositions of states because their wave functions are not coherent. … Every real system in the universe sits somewhere, surrounded by other stuff and interacting with it. … Quantum superpositions of states … are highly contagious and apt to spread out rapidly. And that is what seems to destroy them. … As time passes, the initial quantum system becomes more and more entangled with its environment. In effect, we then no longer have a well-defined quantum system embedded in an environment. Rather, system and environment have merged into a single superposition. … This spreading is the very thing that destroys the manifestation of a superposition in the original quantum system. Because the superposition is now a shared property of the system and its environment, we can no longer “see” the superposition just by looking at the little part of it. What we understand to be decoherence is not actually a loss of superposition but a loss of our ability to detect it in the original system.
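The entanglement-driven loss of coherence Ball describes can be sketched numerically. The toy model below is my own illustration, not from the article; the qubit count and coupling angle are arbitrary assumptions. One system qubit starts in the superposition (|0⟩+|1⟩)/√2, and each environment qubit picks up a partial “which-path” record of the system. Tracing out the environment then shows the off-diagonal element of the system’s reduced density matrix, its detectable coherence, shrinking even though the global superposition is never destroyed:

```python
import numpy as np

def decohere(n_env=12, theta=0.5):
    """Entangle one system qubit, prepared in (|0>+|1>)/sqrt(2), with
    n_env environment qubits; return the magnitude of the system's
    off-diagonal density-matrix element (its remaining coherence)."""
    # Full state: axis 0 is the system qubit, axes 1..n_env the environment.
    state = np.zeros((2,) * (n_env + 1))
    state[(0,) + (0,) * n_env] = 1 / np.sqrt(2)
    state[(1,) + (0,) * n_env] = 1 / np.sqrt(2)

    c, s = np.cos(theta / 2), np.sin(theta / 2)
    for k in range(1, n_env + 1):
        # Conditioned on the system being |1>, rotate env qubit k by theta:
        # a partial which-path record leaks into the environment.
        branch = np.moveaxis(state[1], k - 1, 0)  # view into `state`
        b0, b1 = branch[0].copy(), branch[1].copy()
        branch[0] = c * b0 - s * b1
        branch[1] = s * b0 + c * b1

    # Reduced density matrix of the system alone: trace out the environment.
    psi = state.reshape(2, -1)
    rho = psi @ psi.conj().T
    return abs(rho[0, 1])
```

With no coupling (`theta=0`) the coherence stays at its maximal value 0.5; with coupling it decays as 0.5·cos(θ/2)^n_env (about 0.34 for the defaults), and grows smaller the more environment qubits share the record, which is exactly the sense in which the superposition becomes undetectable in the system alone rather than being lost.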

And this has nothing to do with observation in the normal sense: We don’t need a conscious mind to “look” in order to “collapse the wave function.” All we need is for the environment to disperse the quantum coherence. We obtain classical uniqueness from quantum multiplicity when decoherence has taken its toll. … All of the photons of sunlight that bounce off the moon are agents of decoherence, and are more than adequate to fix its position in space and give it a sharp outline. The universe is always looking.