We’re losing the war against surveillance capitalism | Salon

Source: We’re losing the war against surveillance capitalism | Salon, by Michael Corn

Many headlines call out the demise of privacy, but what they really mean is that some of your personal information is being sold, or stolen, or simply misused. The two concepts are not quite the same. It is reasonable to consider the loss of personal information under the general heading of privacy, but separating the two concepts opens the door to a more effective conversation about how to protect them both.

When we read about Facebook or Google (or our own government) wanting to listen in on your phone calls, read your emails, or review your Facebook feed, we’re talking about privacy, pure and simple. Privacy in this case means freedom to engage in conversation or thought without unwanted or unknown surveillance.

Protecting one’s personal information takes us into a different realm with more everyday practical implications. When I give Google my phone number in exchange for a Gmail or Google Voice account, I’m exchanging my data for a service. And I suspect most of us are fine with this type of value-based trade-off. Google needs to know where to route my Google Voice phone calls or how to text me an alert related to my account. It’s Google’s subsequent reuse of this information where things start to go awry.

What does it mean to see privacy as a civil rights struggle? The collapse of our privacy is exposing each of us to palpable risks: the erosion of the right to pray, to study, to congregate, or to participate in our democracy. In a digital world, privacy is the barrier between civil society and racial, political, or religious profiling writ large. … Privacy violations are a gateway to identity-based targeting, which singles out individuals by race, religion, or gender identity.

Oppression originates whenever one group marks another group as “other,” then uses that “otherness” to isolate, discriminate, and disempower. Often the markers for discriminatory behavior are obvious: darker skin, for example, or observable gender. But what if every marker of your individuality were known and sold, accessible to advertisers without your knowledge? The potential for manipulation or oppression is palpable, because personal information can easily be used to mark people in exactly these ways.

All of the regulations imposed on the major data brokers suffer from one fatal flaw: they reflect a belief that a statutory, regulatory response to this problem can succeed. … none of these proposals fundamentally address when it is permissible to collect personal information and what can be done with it.

Imagine you come to work one day, and find someone has put a nude picture of you on the wall. You quickly have it removed but the embarrassment and anger linger. Eventually, even that fades — maybe you even move to a new job where no one knows you as “the naked person.” Embarrassing, but you recover. For those who have had their personal information stolen, there is no “but you recover.” It is impossible to fully remove published information from the web. Even if you could miraculously convince every legitimate web service to remove your data, you can never convince those who illicitly deal in personal data to erase it. This immutability of stolen personal data is part of the horror of modern crimes such as revenge porn.

This is again why an incremental, regulatory approach to protecting personal information will always fail: because a wound to our digital privacy never heals. We can’t wait for a loss, then regulate the circumstances that led to it. By then it is too late.

Imagine a world where we didn’t have to figure out how to rein in Facebook. Where creating a set of regulations wasn’t something we had to do, but rather Facebook (or Google, or the furniture store down the street) had to figure out how to operate with the principle that personal information may not be bought or sold. That is, we preserve our privacy by simply forbidding our personal information from being used as a commodity. Would this eliminate the need for statutes protecting our personal information? No, we’d still want to regulate how and when a service provider could ask for your personal data and how they must secure it. But we’d have a principled floor — a bright line not to be crossed — eliminating some of the worst abuses.

Preventing the sale of our personal information is the only effective tool left to preserve our civil rights as they are assailed by both commercial and governmental bodies. People are not a commodity, and we need to legislate that it is wrong to imbue humans with attributes we reserve for property. It is deeply saddening that we need to call for laws to say: I am not for sale.

We’re Banning Facial Recognition. We’re Missing the Point. | The New York Times | Opinion

Source: We’re Banning Facial Recognition. We’re Missing the Point. | The New York Times | Opinion, by Bruce Schneier

The whole point of modern surveillance is to treat people differently, and facial recognition technologies are only a small part of that.

In all cases, modern mass surveillance has three broad components: identification, correlation and discrimination.

Facial recognition is a technology that can be used to identify people without their knowledge or consent. … But that’s just one identification technology among many. People can be identified at a distance by their heartbeat or by their gait, using a laser-based system. Cameras are so good that they can read fingerprints and iris patterns from meters away. And even without any of these technologies, we can always be identified because our smartphones broadcast unique numbers called MAC addresses. Other things identify us as well: our phone numbers, our credit card numbers, the license plates on our cars.

Once we are identified, the data about who we are and what we are doing can be correlated with other data collected at other times. … It can be purchasing data, internet browsing data, or data about who we talk to via email or text. It might be data about our income, ethnicity, lifestyle, profession and interests. There is an entire industry of data brokers who make a living analyzing and augmenting data about who we are — using surveillance data collected by all sorts of companies and then sold without our knowledge or consent.

The whole purpose of this process is for companies — and governments — to treat individuals differently.

Regulating this system means addressing all three steps of the process… The problem is that we are being identified without our knowledge or consent, and society needs rules about when that is permissible. Similarly, we need rules about how our data can be combined with other data, and then bought and sold without our knowledge or consent. … Finally, we need better rules about when and how it is permissible for companies to discriminate.

Today, facial recognition technologies are receiving the brunt of the tech backlash, but focusing on them misses the point. We need to have a serious conversation about all the technologies of identification, correlation and discrimination, and decide how much we as a society want to be spied on by governments and corporations — and what sorts of influence we want them to have over our lives.

The case for … cities that aren’t dystopian surveillance states | The Guardian

Source: The case for … cities that aren’t dystopian surveillance states | The Guardian, by Cory Doctorow

Imagine your smartphone knew everything about the city – but the city didn’t know anything about you.

Why isn’t it creepy for you to know when the next bus is due, but it is creepy for the bus company to know that you’re waiting for a bus? It all comes down to whether you are a sensor – or a thing to be sensed.

homes were sensing and actuating long before the “internet of things” emerged. Thermostats, light switches, humidifiers, combi boilers … our homes are stuffed full of automated tools that no one thinks to call “smart,” largely because they aren’t terrible enough to earn the appellation.

Instead, these were oriented around serving us, rather than observing or controlling us… In your home, you are not a thing, you are a person, and the things around you exist for your comfort and benefit, not the other way around.

Shouldn’t it be that way in our cities?

As is so often the case with technology, the most important consideration isn’t what the technology does: it’s who the technology does it to, and who it does it for. The sizzle reel for a smart city always involves a cut to the control room, where the wise, cool-headed technocrats get a god’s-eye view over the city they’ve instrumented and kitted out with electronic ways of reaching into the world and rearranging its furniture.

It’s a safe bet that the people who make those videos imagine themselves as one of the controllers watching the monitors – not as one of the plebs whose movements are being fed to the cameras that feed the monitors. It’s a safe bet that most of us would like that kind of god’s-eye view into our cities, and with a little tweaking, we could have it.

This is an example of how a smart city could work: a place through which you move in relative anonymity, identified only when needed, and under conditions that allow for significant controls over what can be done with your data.

If it sounds utopian, it’s only because of how far we have come from the idea of a city being designed to serve its demos, rather than its lordly masters. We must recover that idea. As a professional cyberpunk dystopian writer, I’m here to tell you that our ideas were intended as warnings, not suggestions.

All Activities Monitored | The New Atlantis

Source: All Activities Monitored | The New Atlantis, by Jon Askonas
RE: Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All, by Arthur Holland Michel

How military drone technology is quietly creeping into policing, business, and everyday life

The main theme is straightforward: Wide-area persistent surveillance, combined with machine learning and massive storage, is a novel technology that threatens civil liberties, even while it offers a number of new capabilities for serving the common good. But another theme runs obliquely through the book: What capacity do we, as individuals or as a society, have to shape — or prevent — a dangerous technological development?

Gorgon Stare was first developed to disrupt IED (improvised explosive device) attacks in Iraq, which had become the main cause of death among U.S.-led coalition forces. … Like so many other technologies created for war, this type of surveillance has come home, and early adopters have found many inventive uses. Security companies have used it to protect events like NASCAR races — in one case, the surveillance system allowed a security team to quickly track a hostile fan back to his trailer and eject him from the event. The Forest Service deploys wide-area surveillance to monitor potential forest fire zones. And of course, a number of law enforcement agencies, ranging from the FBI and the Department of Homeland Security to local police departments, have experimented successfully, if controversially, with using the technology to fight crime.

Michel’s story thus displays the ethical problem of technological development in high relief. A small group of engineers came together to build a powerful weapon to meet the needs of war. In so doing, they have shifted, for everyone, the balance of power between citizen and state, between individual and corporation, and have eroded to the point of extinction what little remained of the natural rights of privacy, all around the world.

My Family Story of Love, the Mob, and Government Surveillance | The Atlantic

Source: My Family Story of Love, the Mob, and Government Surveillance | The Atlantic, by Jack Goldsmith
adapted from Jack Goldsmith’s new book, In Hoffa’s Shadow: A Stepfather, a Disappearance in Detroit, and My Search for the Truth

the government’s surveillance power has grown unfathomably since the 1960s. The “frightening paraphernalia” from six decades ago are toys compared with the redoubtable tools that allow the government to watch and record our movements and communications, and that enable it to store almost limitless amounts of data on its own or to piggyback on the masses of data that we volunteer to private firms. … Congress has ratified and legitimated what were once legally tenuous surveillance techniques. It did so after the executive branch convinced legislators that the techniques were necessary for law enforcement and national security, but it imposed various legal constraints on their use.

The result of these developments is yet another “new normal” in which the government is constrained in certain respects but citizens are far more exposed to lawful government surveillance than before. This latest new normal, like earlier ones, will not prove stable. … If history is a guide, the government will perceive a security advantage in using these and other tools in new ways to watch us and to predict and preempt our behavior. … Congress will legalize the surveillance practice on the condition, mainly, of new procedural restraints. And we will adjust to our more naked selves.

This is a depressing conclusion for many, but it is an inevitable one. The executive branch does what it thinks it must, including conduct robust surveillance, to meet our demands for safety.