Science Is Getting Less Bang for Its Buck | The Atlantic

Source: Science Is Getting Less Bang for Its Buck | The Atlantic, by Patrick Collison and Michael Nielsen

[Scientific progress is] requiring larger teams and far more extensive scientific training, and its overall economic impact is getting smaller.

In the early days of the Nobel Prize, future Nobel scientists were 37 years old, on average, when they made their prizewinning discovery. But in recent times that has risen to an average of 47 years, an increase of about a quarter of a scientist’s working career.
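The "quarter of a working career" figure can be checked with back-of-the-envelope arithmetic; the roughly 40-year career length is an assumption used for illustration, not a number from the article.

```python
# Sanity check of the "quarter of a working career" claim.
early_age = 37    # average age at prizewinning discovery, early Nobel era
recent_age = 47   # average age in recent times
career_years = 40 # assumed length of a scientist's working career (illustrative)

delay = recent_age - early_age    # 10 years lost
fraction = delay / career_years   # 0.25 of a career
print(f"Discovery delayed by {delay} years, about {fraction:.0%} of a career")
```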

When Ernest Rutherford discovered the nucleus of the atom in 1911, he published it in a paper with just a single author: himself. By contrast, the two 2012 papers announcing the discovery of the Higgs particle had roughly a thousand authors each. On average, research teams nearly quadrupled in size over the 20th century, and that increase continues today. Many research questions now require far more skills, more expensive equipment, and a large team to make progress.

U.S. productivity growth is way down. It’s been dropping since the 1950s, when it was roughly 6 times higher than today. That means we see about as much change over a decade today as we saw in 18 months in the 1950s.
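The "18 months" figure can be sanity-checked with a simple compounding model. The baseline growth rate below is an illustrative assumption, not a number from the article, and the article itself says only "roughly 6 times"; at exactly 6x the calculation lands near 20 months, consistent with the article's rough figure.

```python
import math

# If productivity grows at rate g today and roughly 6*g in the 1950s,
# how long did it take the 1950s to produce a decade's worth of today's change?
g = 0.01        # assumed annual productivity growth today (illustrative)
multiple = 6    # the article's "roughly 6 times higher" 1950s rate

decade_growth = (1 + g) ** 10 - 1  # cumulative change over 10 years today
# Solve (1 + multiple*g)^t = 1 + decade_growth for t (in years):
t_years = math.log(1 + decade_growth) / math.log(1 + multiple * g)
print(f"A decade of change today took about {t_years * 12:.0f} months in the 1950s")
```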

Believing without evidence is always morally wrong | Aeon

Source: Believing without evidence is always morally wrong | Aeon, by Francisco Mejia Uribe

[William Kingdon Clifford’s] once seemingly exaggerated claim that ‘it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence’ is no longer hyperbole but a technical reality.

In ‘The Ethics of Belief’ (1877), Clifford gives three arguments as to why we have a moral obligation to believe responsibly, that is, to believe only what we have sufficient evidence for, and what we have diligently investigated. His first argument starts with the simple observation that our beliefs influence our actions. … The most natural objection to this first argument is that while it might be true that some of our beliefs do lead to actions that can be devastating for others, in reality most of what we believe is probably inconsequential for our fellow humans. … I think critics had a point – had – but that is no longer so. In a world in which just about everyone’s beliefs are instantly shareable, at minimal cost, to a global audience, every single belief has the capacity to be truly consequential in the way Clifford imagined.

The second argument Clifford provides to back his claim that it is always wrong to believe on insufficient evidence is that poor practices of belief-formation turn us into careless, credulous believers. Clifford puts it nicely: ‘No real belief, however trifling and fragmentary it may seem, is ever truly insignificant; it prepares us to receive more of its like, confirms those which resembled it before, and weakens others; and so gradually it lays a stealthy train in our inmost thoughts, which may someday explode into overt action, and leave its stamp upon our character.’

Clifford’s third and final argument as to why believing without evidence is morally wrong is that, in our capacity as communicators of belief, we have the moral responsibility not to pollute the well of collective knowledge. … While Clifford’s final argument rings true, it again seems exaggerated to claim that every little false belief we harbour is a moral affront to common knowledge. Yet reality, once more, is aligning with Clifford, and his words seem prophetic. Today, we truly have a global reservoir of belief into which all of our commitments are being painstakingly added: it’s called Big Data. You don’t even need to be an active netizen posting on Twitter or ranting on Facebook: more and more of what we do in the real world is being recorded and digitised, and from there algorithms can easily infer what we believe before we even express a view. In turn, this enormous pool of stored belief is used by algorithms to make decisions for and about us. And it’s the same reservoir that search engines tap into when we seek answers to our questions and acquire new beliefs. Add the wrong ingredients into the Big Data recipe, and what you’ll get is a potentially toxic output. If there was ever a time when critical thinking was a moral imperative, and credulity a calamitous sin, it is now.

The Best Way to Predict the Future is to Create It. But Is It Already Too Late? | NIH, by Alan Kay

Source: The Best Way to Predict the Future is to Create It. But Is It Already Too Late?, by Alan Curtis Kay

To me, the future is not five years or ten years out. I think about the future as extending in front of us at least as far as the era we live in reaches back, which I date to roughly the invention of science in the early 17th century, so it is always worthwhile thinking a hundred or a few hundred years ahead.

If “the best way to predict the future is to invent it”, is it too late to invent a healthy future?

Children are the future we send to the future.

The best way to predict the future is to invent the children who will invent it.

The Political Instability Task Force estimated that, between 1956 and 2016, a total of forty-three genocides took place, causing the death of about 50 million people.

When you start looking at this stuff, you start thinking, “Oh, this is starting to look like normal behavior, kind of like war.” This is not off-to-the-side behavior. But then the other interesting thought: if you look at world population, the United States has about 340 million people in it, and if you go back in time to when the entire world had only 340 million, which was about A.D. 1000, what you see is that they had wars everywhere. The United States has not had an internal war for 150 years now, so a population the size of the entire medieval world is somehow staying stable. And in fact the European Union, at about 600 million people, has not had an internal war for 75 years.

The larger societies have damped down these impulses. But on the other hand, these are really unstable, I think.

World Population
Lists of wars by date | Wikipedia

If you look at the National Institutes of Health’s National Institute of Mental Health, what they classify as mental illnesses are roughly things outside of the normal. For instance, schizophrenia is less than 1 percent. Suicidal thoughts are more common. I went looking for other things outside the normal, for instance, conscientious objectors. Going back through all of the wars we had in the 20th century, World War I and World War II and so forth, just about a quarter of one percent of the population has registered as conscientious objectors when drafted. So what is interesting is that roughly 99.75 percent are not conscientious objectors. That doesn’t mean they aren’t going to shoot their gun in the air, because some people who didn’t register still did not want to kill anybody. But as far as expressing an opinion goes, it is completely normal to be willing to kill others if your culture says it is okay.

If we did not normalize [to what most people do], we would see that humans are basically delusional; people here know that very well. We believe things, and we project those beliefs out onto the world. If we looked at the mapping between what we think is reality in here and what is actually out there, we would see a complete mismatch. I would call that a definition of not being sane. … But instead we don’t define [sanity] that way. We define insanity or mental illness simply as whatever falls outside of what most people actually do, no matter how crazy it is.

What if the normal human species is mentally ill or worse? I think this is provable.

List of cognitive biases | Wikipedia

[heavily paraphrased:] NIMH (the National Institute of Mental Health), from its website, has a purpose, vision, mission, topics… but it is not in the charge of NIMH to start looking at the “normal” human mental disorders of thinking and behavior that are so disastrous to civilization and life.

We sometimes think we’re not very imaginative. The problem is we are, just about some odd ideas. 77 percent of Americans believe in angels. 94 percent believe in supreme beings. A nonzero percentage of Americans believe in sacrificing children to the gods. 50 percent believe in demons and ghosts. 21 percent still believe in witches. … A great book to read if you like this kind of thing is The Great Cat Massacre. … There was a genocide of cats in France when people decided that cats were the witches causing all manner of bad problems, so they wound up killing a million cats. The book explores how humans get these crazy ideas and then act as though they are real.

We have a really hard time with the kinds of things our imaginations are not set up for, even things that are going to happen to us. We find it difficult to vividly imagine disasters ahead of time and take action to prevent them. We can be heroes in a real disaster. Why can’t we be heroes ahead of time?

The flood that we are ignoring, the thing we’re not building the dam for, is preparing the children for the next generation. We just complain about the state of the children when they come out of college.

The children basically, genetically, are aiming to learn the most important things that they will ever know from the culture around them. It is the form of that culture that is going to provide, for most children, the most lasting impressions.

If you look at the total number of people who could help teachers (retired STEM degree holders, ages 60 to 75), it is far more than the number of classrooms that need help, and yet almost none of these people are getting into classrooms.

If you want to do something right now to make an enormous difference, and perhaps create more children who can create the bigger changes, get out there and help the existing elementary school teachers. … I got these figures from NIH. It looks like there are over 9,000 people in NIH alone who could be out there helping raise children to think better than their parents do today.