All Activities Monitored | The New Atlantis

Source: All Activities Monitored | The New Atlantis, by Jon Askonas
RE: Eyes in the Sky: The Secret Rise of Gorgon Stare and How It Will Watch Us All,
by Arthur Holland Michel

How military drone technology is quietly creeping into policing, business, and everyday life

The main theme is straightforward: Wide-area persistent surveillance, combined with machine learning and massive storage, is a novel technology that threatens civil liberties, even while it offers a number of new capabilities for serving the common good. But another theme runs obliquely through the book: What capacity do we, as individuals or as a society, have to shape — or prevent — a dangerous technological development?

Gorgon Stare was first developed to disrupt attacks in Iraq by IEDs (improvised explosive devices), which had become the main cause of death among U.S.-led coalition forces. … Like so many other technologies created for war, this type of surveillance has come home, and early adopters have found many inventive uses. Security companies have used it to protect events like NASCAR races — in one case, the surveillance system allowed a security team to quickly track back a hostile fan to his trailer to eject him from the event. The Forest Service deploys wide-area surveillance to monitor potential forest fire zones. And of course, a number of law enforcement agencies, ranging from the FBI and the Department of Homeland Security to local police departments, have experimented successfully, if controversially, with using the technology to fight crime.

Michel’s story thus displays the ethical problem of technological development in high relief. A small group of engineers came together to build a powerful weapon to meet the needs of war. In so doing, they have shifted, for everyone, the balance of power between citizen and state, between individual and corporation, and have eroded to the point of extinction what little remained of the natural rights of privacy, all around the world.

My Family Story of Love, the Mob, and Government Surveillance | The Atlantic

Source: My Family Story of Love, the Mob, and Government Surveillance | The Atlantic, by Jack Goldsmith
Adapted from Jack Goldsmith’s new book, In Hoffa’s Shadow: A Stepfather, a Disappearance in Detroit, and My Search for the Truth

the government’s surveillance power has grown unfathomably since the 1960s. The “frightening paraphernalia” from six decades ago are toys compared with the redoubtable tools that allow the government to watch and record our movements and communications, and that enable it to store almost limitless amounts of data on its own or to piggyback on the masses of data that we volunteer to private firms. … Congress has ratified and legitimated what were once legally tenuous surveillance techniques. It did so after the executive branch convinced legislators that the techniques were necessary for law enforcement and national security, but it imposed various legal constraints on their use.

The result of these developments is yet another “new normal” in which the government is constrained in certain respects but citizens are far more exposed to lawful government surveillance than before. This latest new normal, like earlier ones, will not prove stable. … If history is a guide, the government will perceive a security advantage in using these and other tools in new ways to watch us and to predict and preempt our behavior. … Congress will legalize the surveillance practice on the condition, mainly, of new procedural restraints. And we will adjust to our more naked selves.

This is a depressing conclusion for many, but it is an inevitable one. The executive branch does what it thinks it must, including conduct robust surveillance, to meet our demands for safety.

Shoshana Zuboff Explains the Age of Surveillance Capitalism | The Intercept

Source: “A Fundamentally Illegitimate Choice”: Shoshana Zuboff on the Age of Surveillance Capitalism | The Intercept, by Sam Biddle

RE: The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, by Shoshana Zuboff

The cliched refrain that if you’re “not paying for a product, you are the product”? Too weak, says Zuboff. You’re not technically the product, she explains over the course of several hundred tense pages, because you’re something even more degrading: an input for the real product, predictions about your future sold to the highest bidder so that this future can be altered.

Now we have markets of business customers that are selling and buying predictions of human futures. I believe in the values of human freedom and human autonomy as the necessary elements of a democratic society. As the competition of these prediction products heats up, it’s clear that surveillance capitalists have discovered that the most predictive sources of data are when they come in and intervene in our lives, in our real-time actions, to shape our action in a certain direction that aligns with the kind of outcomes they want to guarantee to their customers. That’s where they’re making their money. These are bald-faced interventions in the exercise of human autonomy, what I call the “right to the future tense.” The very idea that I can decide what I want my future to be and design the actions that get me from here to there, that’s the very material essence of the idea of free will.

to the extent that we do need help and we do look to the internet, it is a fundamentally illegitimate choice that we are now forced to make as 21st century citizens. In order to get the help I need, I’ve got to march through surveillance capitalism supply chains. Because Alexa and Google Home and every other gewgaw that has the word “smart” in front of it, every service that has “personalized” in front of it is nothing but supply chain interfaces for the flow of raw material to be translated into data, to be fashioned into prediction products, to be sold in behavioral futures markets so that we end up funding our own domination. If we’re gonna fix this, no matter how much we feel like we need this stuff, we’ve got to get to a place where we are willing to say no.

Privacy Rights and Data Collection in a Digital Economy

Source: U.S. Senate Testimony | Idle Words, by Maciej Cegłowski

RE: U.S. Senate hearing on “Privacy Rights and Data Collection in a Digital Economy,” held by the Committee on Banking, Housing, and Urban Affairs

The sudden ubiquity of this architecture of mass surveillance, and its enshrinement as the default business model of the online economy, mean that we can no longer put off hard conversations about the threats it poses to liberty.

Adding to this urgency is the empirical fact that, while our online economy depends on the collection and permanent storage of highly personal data, we do not have the capacity to keep such large collections of user data safe over time.

While many individual data breaches are due to negligence or poor practices, their overall number reflects an uncomfortable truth well known to computer professionals—that our ability to attack computer systems far exceeds our ability to defend them, and will for the foreseeable future.

In the regulatory context, discussion of privacy invariably means data privacy—the idea of protecting designated sensitive material from unauthorized access.

It is true that, when it comes to protecting specific collections of data, the companies that profit most from the surveillance economy are the ones working hardest to defend them against unauthorized access.

But there is a second, more fundamental sense of the word privacy, one which until recently was so common and unremarkable that it would have made no sense to try to describe it.

That is the idea that there exists a sphere of life that should remain outside public scrutiny, in which we can be sure that our words, actions, thoughts and feelings are not being indelibly recorded. This includes not only intimate spaces like the home, but also the many semi-private places where people gather and engage with one another in the common activities of daily life—the workplace, church, club or union hall.

The tension between these interpretations of what privacy entails, and who is trying to defend it, complicates attempts to discuss regulation.

Tech companies will correctly point out that their customers have willingly traded their private data for an almost miraculous collection of useful services, services that have unquestionably made their lives better, and that the business model that allows them to offer these services for free creates far more value than harm for their customers.

Consumers will just as rightly point out that they never consented to be the subjects in an uncontrolled social experiment, that the companies engaged in reshaping our world have consistently refused to honestly discuss their business models or data collection practices, and that in a democratic society, profound social change requires consensus and accountability.

While it is too soon to draw definitive conclusions about the GDPR, there is a tension between its concept of user consent and the reality of a surveillance economy that is worth examining in more detail.

A key assumption of the consent model is that any user can choose to withhold consent from online services. But not all services are created equal—there are some that you really can’t say no to.

The latent potential of the surveillance economy as a toolkit for despotism cannot be exaggerated. The monitoring tools we see in repressive regimes are not ‘dual use’ technologies—they are single use technologies, working as designed, except for a different master.


Also: Think You’re Discreet Online? Think Again | Opinion | The New York Times, by Zeynep Tufekci

Because of technological advances and the sheer amount of data now available about billions of other people, discretion no longer suffices to protect your privacy. Computer algorithms and network analyses can now infer, with a sufficiently high degree of accuracy, a wide range of things about you that you may have never disclosed, including your moods, your political beliefs, your sexual orientation and your health. There is no longer such a thing as individually “opting out” of our privacy-compromised world.

Such tools are already being marketed for use in hiring employees, for detecting shoppers’ moods and predicting criminal behavior. Unless they are properly regulated, in the near future we could be hired, fired, granted or denied insurance, accepted to or rejected from college, rented housing and extended or denied credit based on facts that are inferred about us.

This is worrisome enough when it involves correct inferences. But because computational inference is a statistical technique, it also often gets things wrong — and it is hard, and perhaps impossible, to pinpoint the source of the error, for these algorithms offer little to no insights into how they operate.

Surveillance Kills Freedom By Killing Experimentation | WIRED

Source: Surveillance Kills Freedom By Killing Experimentation | WIRED, by Bruce Schneier

Excerpted from The End of Trust (McSweeney’s issue 54)

When we’re being watched, we conform. We don’t speak freely or try new things. But social progress happens in the gap between what’s legal and what’s moral.

It’s easy to imagine the more conservative among us getting enough power to make illegal what they would otherwise be forced to witness.

For social norms to change, people need to deviate from these inherited norms. People need the space to try alternate ways of living without risking arrest or social ostracization. People need to be able to read critiques of those norms without anyone’s knowledge, discuss them without their opinions being recorded, and write about their experiences without their names attached to their words. People need to be able to do things that others find distasteful, or even immoral. The minority needs protection from the tyranny of the majority.

Privacy makes all of this possible. Privacy encourages social progress by giving the few room to experiment free from the watchful eye of the many.