Source: Surveillance Kills Freedom By Killing Experimentation | WIRED, by Bruce Schneier
Excerpted from The End of Trust (McSweeney’s issue 54)
When we’re being watched, we conform. We don’t speak freely or try new things. But social progress happens in the gap between what’s legal and what’s moral.
It’s easy to imagine the more conservative among us getting enough power to make illegal what they would otherwise be forced to witness.
For social norms to change, people need to deviate from these inherited norms. People need the space to try alternate ways of living without risking arrest or social ostracization. People need to be able to read critiques of those norms without anyone’s knowledge, discuss them without their opinions being recorded, and write about their experiences without their names attached to their words. People need to be able to do things that others find distasteful, or even immoral. The minority needs protection from the tyranny of the majority.
Privacy makes all of this possible. Privacy encourages social progress by giving the few room to experiment free from the watchful eye of the many.
Source: Cory Doctorow: Zuck’s Empire of Oily Rags – Locus Online
while the acknowledgment of the problem of Big Tech is most welcome, I am worried that the diagnosis is wrong. … we’re confusing automated persuasion with automated targeting. … Facebook isn’t a mind-control ray. It’s a tool for finding people who possess uncommon, hard-to-locate traits, whether that’s “person thinking of buying a new refrigerator,” “person with the same rare disease as you,” or “person who might participate in a genocidal pogrom,”
It’s fashionable to treat the dysfunctions of social media as the result of the naivete of early technologists, who failed to foresee these outcomes. The truth is that the ability to build Facebook-like services is relatively common. What was rare was the moral recklessness necessary to go through with it.
dossiers on billions of people hold the power to wreak almost unimaginable harm, and yet, each dossier brings in just a few dollars a year. For commercial surveillance to be cost effective, it has to socialize all the risks associated with mass surveillance and privatize all the gains.
There’s an old-fashioned word for this: corruption. In corrupt systems, … the costs are widely diffused while the gains are tightly concentrated, so the beneficiaries of corruption can always outspend their victims to stay clear.
Facebook doesn’t have a mind-control problem, it has a corruption problem. Cambridge Analytica didn’t convince decent people to become racists; they convinced racists to become voters.
Source: Warrant Protections against Police Searches of Our Data – Schneier on Security
The cell phones we carry with us constantly are the most perfect surveillance device ever invented, and our laws haven’t caught up to that reality.
Traditionally, information that was most precious to us was physically close to us. It was on our bodies, in our homes and offices, in our cars. Because of that, the courts gave that information extra protections. Information that we stored far away from us, or gave to other people, was afforded fewer protections. … The Internet has turned that thinking upside-down. … all our data is literally stored on computers belonging to other people. It’s our e-mail, text messages, photos, Google docs, and more all in the cloud. We store it there not because it’s unimportant, but precisely because it is important.
The issue here is not whether the police should be allowed to use that data to help solve crimes. Of course they should. The issue is whether that information should be protected by the warrant process that requires the police to have probable cause to investigate you and get approval by a court.
Source: The Trouble with Politicians Sharing Passwords
the premise of justifying a bad practice purely on the basis of it being common is extremely worrying. It’s normalising a behaviour that we should be actively working towards turning around.
What’s the Problem Credential Sharing is Solving?
Let’s start here because it’s important to acknowledge that there’s a reason Nadine (and others) are deliberately sharing their passwords with other people. … sourcing help from staffers … delegation … collaboration … there are indeed technology solutions available to solve this problem
One of the constant themes that came back to me via Twitter was “plausible deniability” … The assertion here is that someone in her position could potentially say “something bad happened under my account but because multiple people use it, maybe it was someone else”. The thing is, this is precisely the antithesis of identity and accountability, and if this is actually a desirable state, then frankly there are much bigger problems at hand.
there are plenty of people who unwittingly put an organisation at risk due to having rights to things they simply don’t need … We call the antidote for this the principle of least privilege … social engineering is especially concerning in an environment where the sharing of credentials is the norm. When you condition people to treating secrets as no longer being secret but rather something you share with someone else that can establish sufficient trust, you open up a Pandora’s box of possible problems because creating a veneer of authenticity in order to gain trust is precisely what phishers are so good at!
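The delegation and least-privilege points above can be sketched in code. This is a minimal, hypothetical illustration (the role names and permissions are invented for the example, not taken from any real system): staffers get their own accounts with only the rights their role needs, and everything not explicitly granted is denied by default, so there is never a reason to hand over a password.

```python
# Hypothetical sketch of the principle of least privilege:
# each role is granted only the permissions it needs, and
# every action not explicitly granted is denied by default.
ROLE_PERMISSIONS = {
    "mp": {"read_casework", "send_constituent_email"},
    "staffer": {"read_casework"},  # delegated access via their OWN account
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default; allow only actions explicitly granted to the role."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

Because each person acts under their own identity, every action remains attributable, which is exactly the accountability that shared credentials destroy.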
The great irony of the debates justifying credential sharing is that they were sparked by someone attempting to claim innocence, with those supporting him saying “well, it could have been someone else using his credentials”! This is precisely why this is a problem! Fortunately, this whole thing was sparked by something as benign as looking at porn, and before anyone jumps up and down and says that’s actually a serious violation, when you consider the sorts of activities we task those in parliament with, you can see how behaviour under someone’s identity that we can’t attribute back to them could be far, far more serious.
Source: Here’s What I’m Telling US Congress about Data Breaches
Increasingly, the assumption has to be that everything we digitise may one day end up in unauthorised hands and the way we authenticate ourselves must adapt to be resilient to this.
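One common way to make authentication resilient to the breach scenario described above is to store only slow, salted password hashes, so a leaked database does not directly expose anyone’s password. The sketch below uses PBKDF2 from Python’s standard library; the iteration count and salt size are illustrative choices, not a prescription from the source.

```python
import hashlib
import hmac
import os

ITERATIONS = 600_000  # illustrative work factor; tune for your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a slow, salted hash so a stolen database yields no plaintext."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Recompute the hash and compare in constant time (avoids timing leaks)."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, expected)
```

Even with hashing, the larger point stands: treat every stored credential as potentially exposed, and layer in defenses (unique passwords, second factors) that survive the disclosure of any single secret.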