If people could understand what computing was about

If people could understand what computing was about, the iPhone would not be a bad thing. But because people don’t understand what computing is about, they think they have it in the iPhone, and that illusion is as bad as the illusion that Guitar Hero is the same as a real guitar.

Source: Reconstruction of a Train Wreck: How Priming Research Went off the Rails | Replicability-Index
Comment in response by Daniel Kahneman:

What the blog gets absolutely right is that I placed too much faith in underpowered studies. As pointed out in the blog, and earlier by Andrew Gelman, there is a special irony in my mistake because the first paper that Amos Tversky and I published was about the belief in the “law of small numbers,” which allows researchers to trust the results of underpowered studies with unreasonably small samples. We also cited Overall (1969) for showing “that the prevalence of studies deficient in statistical power is not only wasteful but actually pernicious: it results in a large proportion of invalid rejections of the null hypothesis among published results.” Our article was written in 1969 and published in 1971, but I failed to internalize its message.

My position when I wrote “Thinking, Fast and Slow” was that if a large body of evidence published in reputable journals supports an initially implausible conclusion, then scientific norms require us to believe that conclusion. Implausibility is not sufficient to justify disbelief, and belief in well-supported scientific conclusions is not optional. This position still seems reasonable to me – it is why I think people should believe in climate change. But the argument only holds when all relevant results are published.
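The statistical point is easy to make concrete. Here is a small simulation (my illustration, not from Kahneman’s comment; the effect size and sample size are assumed): with a modest true effect and the small samples typical of underpowered studies, most studies miss the effect, and the ones that reach significance – the ones most likely to be published – report an inflated estimate of it.

```python
# Illustrative simulation (not from Kahneman's comment): a modest true effect,
# small samples, and a significance filter. The effect size and n are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
d_true, n, n_studies = 0.3, 20, 10_000   # true effect of 0.3 SD, 20 subjects per group

significant_effects = []
for _ in range(n_studies):
    control   = rng.normal(0.0, 1.0, n)
    treatment = rng.normal(d_true, 1.0, n)
    _, p = stats.ttest_ind(treatment, control)
    if p < 0.05:
        significant_effects.append(treatment.mean() - control.mean())

print(f"power ≈ {len(significant_effects) / n_studies:.2f}")   # roughly 0.15
print(f"mean effect among significant studies ≈ {np.mean(significant_effects):.2f}")
# The significant (publishable) studies report an effect well above the true 0.3,
# which is why a literature built only from published results can mislead.
```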

Source: An electromagnetic shock

ON MARCH 13th 1989 a surge of energy from the sun, from a “coronal mass ejection”, had a startling impact on Canada. Within 92 seconds, the resulting geomagnetic storm took down Quebec’s electricity grid for nine hours. It could have been worse. On July 23rd 2012 particles from a much larger solar ejection blew across the orbital path of Earth, missing it by days. Had it hit America, the resulting geomagnetic storm would have destroyed perhaps a quarter of high-voltage transformers, according to Storm Analysis Consultants in Duluth, Minnesota.

America runs on roughly 2,500 large transformers, most with unique designs. But only 500 or so can be built per year around the world. It typically takes a year or more to receive an ordered transformer, and that is when cranes work and lorries and locomotives can be fuelled up. Some transformers exceed 400 tonnes.

Kit that protects transformers from EMP also saves them from geomagnetic storms, though the reverse is not true. … The expense of installing surge-blockers and other EMP-proofing kit on America’s big transformers is debated. The EMP Commission’s report in 2008 reckoned $3.95bn or less would do it. Others advance higher figures. But a complete collapse of the grid could probably be prevented by protecting several hundred critical transformers for perhaps $1m each.
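The article’s figures invite some back-of-envelope arithmetic (mine, not the article’s): replacing a quarter of the fleet would consume more than a year of the entire world’s transformer output, while protecting the critical few hundred units would cost a fraction of the 2008 estimate.

```python
# Back-of-envelope arithmetic on the article's figures (my calculation, not the article's).
large_transformers_us = 2_500    # large transformers in the US grid
destroyed_fraction    = 0.25     # "perhaps a quarter" destroyed in a severe storm
global_build_rate     = 500      # units the world can build per year

destroyed = large_transformers_us * destroyed_fraction
years_to_rebuild = destroyed / global_build_rate
print(f"~{destroyed:.0f} transformers lost; ~{years_to_rebuild:.1f} years of global output to replace")

critical_transformers   = 500        # "several hundred" critical units
cost_per_unit           = 1e6        # "perhaps $1m each"
emp_commission_estimate = 3.95e9     # the 2008 EMP Commission figure
print(f"protecting the critical few hundred: ~${critical_transformers * cost_per_unit / 1e9:.1f}bn "
      f"vs the commission's ${emp_commission_estimate / 1e9:.2f}bn")
```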

Source: Book Review: Surfing Uncertainty | Slate Star Codex

Surfing Uncertainty isn’t pop science and isn’t easy reading. Sometimes it’s on the border of possible-at-all reading. … It’s your book if you want to learn about predictive processing at all, since as far as I know this is the only existing book-length treatment of the subject. And it’s comprehensive, scholarly, and very good at giving a good introduction to the theory and why it’s so important. So let’s be grateful for what we’ve got and take a look.

The key insight: the brain is a multi-layer prediction machine. All neural processing consists of two streams: a bottom-up stream of sense data, and a top-down stream of predictions. These streams interface at each level of processing, comparing themselves to each other and adjusting themselves as necessary. … both streams contain not only data but estimates of the precision of that data. … Each level receives the predictions from the level above it and the sense data from the level below it. Then each level uses Bayes’ Theorem to integrate these two sources of probabilistic evidence as best it can.
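A minimal sketch of the integration step described here, under a Gaussian simplification that is mine rather than the book’s: treat the top-down prediction and the bottom-up sense data as Gaussian estimates, each carrying a precision (inverse variance), and combine them by precision weighting.

```python
# Minimal sketch of one level's precision-weighted integration, under a
# Gaussian simplification (a toy model, not the book's formalism).
# prediction: what the level above expects; sense: what the level below reports.
def integrate(pred_mean: float, pred_precision: float,
              sense_mean: float, sense_precision: float) -> tuple[float, float]:
    """Bayes-optimal combination of two Gaussian estimates."""
    posterior_precision = pred_precision + sense_precision
    posterior_mean = (pred_precision * pred_mean +
                      sense_precision * sense_mean) / posterior_precision
    return posterior_mean, posterior_precision

# High-precision sense data pulls the estimate toward what was actually sensed;
# vague sense data leaves the top-down prediction nearly untouched.
print(integrate(pred_mean=0.0, pred_precision=1.0, sense_mean=4.0, sense_precision=9.0))  # ≈ (3.6, 10.0)
print(integrate(pred_mean=0.0, pred_precision=1.0, sense_mean=4.0, sense_precision=0.1))  # ≈ (0.36, 1.1)
```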

there might be some unresolvable conflict between high-precision sense-data and predictions. The Bayesian math will indicate that the predictions are probably wrong. The neurons involved will fire, indicating “surprisal” – a gratuitously-technical neuroscience term for surprise. The higher the degree of mismatch, and the higher the supposed precision of the data that led to the mismatch, the more surprisal – and the louder the alarm sent to the higher levels.

When the higher levels receive the alarms from the lower levels, this is their equivalent of bottom-up sense-data. … All the levels really hate hearing alarms. Their goal is to minimize surprisal – to become so good at predicting the world (conditional on the predictions sent by higher levels) that nothing ever surprises them.
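Continuing the toy Gaussian sketch above (again a simplification, not the book’s formalism): surprisal can be read as the negative log-probability of the incoming data under the current prediction, so it grows with both the size of the mismatch and the precision assigned to the data, and “minimizing surprisal” amounts to nudging the prediction until the alarm quiets down.

```python
# Toy continuation of the Gaussian sketch: surprisal as negative log-probability
# of the sense data under the prediction, and a level adjusting its prediction
# by gradient descent to quiet the alarm. All numbers are illustrative.
import math

def surprisal(sense: float, pred: float, precision: float) -> float:
    """-log N(sense | pred, 1/precision): large when the mismatch is big AND the data are trusted."""
    return 0.5 * (precision * (sense - pred) ** 2
                  - math.log(precision) + math.log(2 * math.pi))

sense, precision = 4.0, 9.0   # high-precision, surprising input
pred = 0.0                    # confident top-down prediction
for step in range(5):
    print(f"step {step}: prediction {pred:5.2f}, surprisal {surprisal(sense, pred, precision):6.2f}")
    pred -= 0.05 * precision * (pred - sense)   # gradient step that reduces the surprisal
```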

Source: Locus Online Perspectives » Cory Doctorow: Demon-Haunted World

Software – whose basic underlying mechanism is ‘‘If this happens, then do this, otherwise do that’’ – allows cheaters to be a lot more subtle, and thus harder to catch. Software can say, ‘‘If there’s a chance I’m undergoing inspection, then be totally honest – but cheat the rest of the time.’’
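That conditional fits in a few lines. A deliberately cartoonish sketch (hypothetical code, not taken from any real device or from the column):

```python
# Cartoonish sketch of the conditional Doctorow describes -- hypothetical code,
# not from any real device. The trick is that the honest branch only runs when
# inspection seems likely.
def looks_like_an_emissions_test(sensors: dict) -> bool:
    # Test cycles are recognizable: the steering wheel never moves and the
    # speed follows a standard lab profile.
    return sensors["steering_angle"] == 0 and sensors["speed_profile"] == "standard_cycle"

def engine_control(sensors: dict) -> str:
    if looks_like_an_emissions_test(sensors):
        return "full_emissions_controls"   # be totally honest under inspection
    return "performance_mode"              # cheat the rest of the time

print(engine_control({"steering_angle": 0, "speed_profile": "standard_cycle"}))  # honest
print(engine_control({"steering_angle": 12, "speed_profile": "highway"}))        # cheating
```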

what happens when the things you own start to cheat you?

Wannacry was a precursor to a new kind of cheating: cheating the independent investigator, rather than the government. Imagine that the next Dieselgate doesn’t attempt to trick the almighty pollution regulator (who has the power to visit billions in fines upon the cheater): instead, it tries to trick the reviewers

these forms of cheating treat the owner of the device as an enemy of the company that made or sold it, to be thwarted, tricked, or forced into conducting their affairs in the best interest of the company’s shareholders. To do this, they run programs and processes that attempt to hide themselves and their nature from their owners, and proxies for their owners (like reviewers and researchers).

The software in gadgets makes it very tempting indeed to fill them with pernicious demons, but [the Computer Fraud and Abuse Act (1986) and section 1201 of the Digital Millennium Copyright Act (1998)] criminalize trying to exorcise those demons.