Source: Dude, you broke the future! – Charlie’s Diary, by Charlie Stross
This is the text of my keynote speech at the 34th Chaos Communication Congress in Leipzig, December 2017.
(You can also watch it on YouTube, but it runs to about 45 minutes.)
My recipe for fiction set ten years in the future used to be 90% already-here, 9% not-here-yet but predictable, and 1% who-ordered-that. But unfortunately the ratios have changed. I think we’re now down to maybe 80% already-here—climate change takes a huge toll on infrastructure—then 15% not-here-yet but predictable, and a whopping 5% of utterly unpredictable deep craziness.
Old, slow AI … Corporations
…
The problem with corporations is that despite their overt goals—whether they make electric vehicles or beer or sell life insurance policies—they are all subject to instrumental convergence insofar as they all have a common implicit paperclip-maximizer goal: to generate revenue. If they don’t make money, they are eaten by a bigger predator or they go bust. Making money is an instrumental goal—it’s as vital to them as breathing is for us mammals, and without pursuing it they will fail to achieve their final goal, whatever it may be.
…
It seems to me that our current political upheavals are best understood as arising from the capture of post-1917 democratic institutions by large-scale AIs. … Our major political parties are led by people who are compatible with the system as it exists—a system that has been shaped over decades by corporations distorting our government and regulatory environments. We humans are living in a world shaped by the desires and needs of AIs, forced to live on their terms, and we are taught that we are valuable only insofar as we contribute to the rule of the machines.
…
If we look at our historical very slow AIs, what lessons can we learn from them about modern AI—the flash flood of unprecedented deep learning and big data technologies that have overtaken us in the past decade?
plenty of technologies have, historically, been heavily regulated or even criminalized for good reason … Let me give you four examples—of new types of AI applications—that are going to warp our societies even worse than the old slow AIs of yore have done. This isn’t an exhaustive list: these are just examples. We need to work out a general strategy for getting on top of this sort of AI before they get on top of us.
…
Political hacking tools: social graph-directed propaganda … They identified individuals vulnerable to persuasion who lived in electorally sensitive districts, and canvassed them with propaganda that targeted their personal hot-button issues.
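To make the mechanism concrete, here is a minimal sketch of what social graph-directed targeting boils down to in code. Every detail in it (the field names, the thresholds, the toy records) is invented for illustration; it is not any campaign's actual pipeline.

```python
# Hypothetical illustration of social-graph-directed targeting: score
# individuals by modelled persuadability, keep only those in marginal
# districts, and bucket them by their inferred hot-button issue so each
# bucket can be served different propaganda. All field names and
# thresholds are invented for this sketch.

from collections import defaultdict

voters = [
    {"id": 1, "district_margin": 0.02, "persuadability": 0.81, "issue": "immigration"},
    {"id": 2, "district_margin": 0.18, "persuadability": 0.90, "issue": "healthcare"},
    {"id": 3, "district_margin": 0.03, "persuadability": 0.65, "issue": "gun control"},
    {"id": 4, "district_margin": 0.01, "persuadability": 0.20, "issue": "immigration"},
]

MARGINAL = 0.05      # district decided by less than 5 points
PERSUADABLE = 0.6    # modelled probability the message shifts this person

def target_buckets(voters):
    """Group persuadable voters in marginal districts by hot-button issue."""
    buckets = defaultdict(list)
    for v in voters:
        if v["district_margin"] < MARGINAL and v["persuadability"] > PERSUADABLE:
            buckets[v["issue"]].append(v["id"])
    return dict(buckets)

print(target_buckets(voters))
# {'immigration': [1], 'gun control': [3]}  (each bucket gets its own creative)
```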
…
the use of neural network generated false video media … This stuff is still geek-intensive and requires relatively expensive GPUs. But in less than a decade it’ll be out in the wild, and just about anyone will be able to fake up a realistic-looking video of someone they don’t like doing something horrible. … The smart money says that by 2027 you won’t be able to believe anything you see in video unless there are cryptographic signatures on it, linking it back to the device that shot the raw feed—and you know how good most people are at using encryption? The dumb money is on total chaos.
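What might those cryptographic signatures look like? Here is a minimal sketch, assuming an Ed25519 key embedded in the camera (simulated in software here) and the third-party Python `cryptography` package. A real scheme would also need tamper-resistant key storage, per-segment signing, and an infrastructure for distributing and trusting device public keys.

```python
# Minimal sketch of device-level video signing: hash the raw feed as it
# comes off the sensor and sign the digest with a per-device key, so any
# later edit breaks verification. Key handling is simplified for the sketch.

import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

device_key = Ed25519PrivateKey.generate()   # stand-in for a hardware-backed key
device_pub = device_key.public_key()

def sign_clip(raw_bytes: bytes) -> bytes:
    """Sign the SHA-256 digest of the raw feed."""
    digest = hashlib.sha256(raw_bytes).digest()
    return device_key.sign(digest)

def verify_clip(raw_bytes: bytes, signature: bytes) -> bool:
    """Check the clip is byte-identical to what the device signed."""
    digest = hashlib.sha256(raw_bytes).digest()
    try:
        device_pub.verify(signature, digest)
        return True
    except InvalidSignature:
        return False

clip = b"...raw sensor frames..."
sig = sign_clip(clip)
print(verify_clip(clip, sig))                 # True
print(verify_clip(clip + b"tampered", sig))   # False: any edit breaks the signature
```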
…
Thanks to deep learning, neuroscientists have mechanised the process of making apps more addictive. … true deep learning driven addictiveness maximizers can optimize for multiple attractors simultaneously. Now, Dopamine Labs seem, going by their public face, to have ethical qualms about the misuse of addiction maximizers in software. But neuroscience isn’t a secret, and sooner or later some really unscrupulous people will try to see how far they can push it.
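For a sense of what "optimizing for multiple attractors simultaneously" might mean in practice, here is a toy sketch: an epsilon-greedy bandit that picks a notification strategy to maximize a weighted blend of several engagement signals. The strategies, weights, and simulated user responses are all invented; this is not Dopamine Labs' system or anyone else's.

```python
# Toy multi-objective engagement maximizer: pick an action, observe several
# engagement signals, and learn from a weighted blend of them. Everything
# here (actions, weights, simulated responses) is fabricated for illustration.

import random

ACTIONS = ["nudge_now", "nudge_tonight", "streak_reminder"]
WEIGHTS = {"session_minutes": 0.5, "shares": 0.3, "return_next_day": 0.2}

estimates = {a: 0.0 for a in ACTIONS}   # running mean reward per action
counts = {a: 0 for a in ACTIONS}
EPSILON = 0.1

def simulated_user(action):
    """Fake engagement signals; a real system would log these per user."""
    base = {"nudge_now": 0.4, "nudge_tonight": 0.6, "streak_reminder": 0.8}[action]
    return {
        "session_minutes": random.gauss(base * 10, 2) / 10,  # normalised
        "shares": random.random() * base,
        "return_next_day": 1.0 if random.random() < base else 0.0,
    }

def blended_reward(signals):
    """Scalarize several 'attractors' into one number to optimize."""
    return sum(WEIGHTS[k] * signals[k] for k in WEIGHTS)

for _ in range(5000):
    if random.random() < EPSILON:
        action = random.choice(ACTIONS)                    # explore
    else:
        action = max(ACTIONS, key=lambda a: estimates[a])  # exploit
    r = blended_reward(simulated_user(action))
    counts[action] += 1
    estimates[action] += (r - estimates[action]) / counts[action]

print(max(ACTIONS, key=lambda a: estimates[a]))  # typically "streak_reminder"
```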
…
Unfortunately there are even nastier uses than scraping social media to find potential victims for serial rapists. Does your social media profile indicate your political or religious affiliation? Nope? Don't worry, Cambridge Analytica can work them out with 99.9% precision just by scanning the tweets and Facebook comments you liked. Add a service that can identify people's affiliation and location, and you have the beginning of a flash mob app: one that will show you people like Us and people like Them on a hyper-local map.
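For illustration only, this is the kind of classifier that sentence gestures at: a bag-of-words logistic regression over the text of posts a user has liked, trained on a fabricated four-example dataset. It is not Cambridge Analytica's model, and nothing in it supports a 99.9% precision figure.

```python
# Toy sketch of inferring affiliation from liked content. The training set
# and labels are fabricated; real systems train on millions of users and
# join the output against location data.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

liked_posts = [
    "lower taxes and strong borders now",
    "defend traditional family values",
    "universal healthcare is a human right",
    "climate justice and a green new deal",
]
labels = ["right", "right", "left", "left"]   # fabricated affiliations

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(liked_posts, labels)

new_user_likes = ["we need healthcare for everyone"]
print(model.predict(new_user_likes))          # e.g. ['left']
```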