This episode features an in-depth (and occasionally contentious) interview with Bart Gellman about his new book, Dark Mirror: Edward Snowden and the American Surveillance State, which can be found on his website and on Amazon. I’m tagged in the book as having been sharply critical of Gellman’s Snowden stories, and I live up to that billing in the interview.
Peter Singer continues his excursion into what he calls “useful fiction” – thrillers that explore real-world implications of emerging technologies – in Burn-In: A Novel of the Real Robotic Revolution, to be released May 26, 2020. This interview explores a host of thoroughly researched (and footnoted!) new technologies, many already in production or on the horizon, all packed inside a plot-driven novel. The book is a painless way to understand what these technologies make possible and their impact on actual human beings. And the interview ranges widely over the policy implications, plus a few plot spoilers.
J.P. Morgan once responded to President Teddy Roosevelt’s charge that he’d violated federal antitrust law by saying, “If we have done anything wrong, send your man to see my man, and we’ll fix it up.” That used to be the gold standard for monopolist arrogance in dealing with government, but Google and Apple have put J.P. Morgan in the shade with their latest instruction to the governments of the world: You can’t use our app to trace COVID-19 infections unless you promise not to use it for quarantine or law enforcement purposes. They are only able to do this because the two companies have more or less 99% of the phone OS market. That’s more control than Morgan had of US railways, and their dominance apparently allows them to say, “If you think we’ve done something wrong, don’t bother to send your man; ours is too busy to meet.” Nate Jones and I discuss the question of Silicon Valley overreach in this episode. (In that vein, I apologize unreservedly to John D. Rockefeller, to whom I mistakenly attributed the quote.) The sad result is that a promising technological adjunct to contact tracing has been delayed and muddled by ideological engineers to the point where it isn’t likely to be deployed and used in a timely way.
In this episode, I interview Thomas Rid about his illuminating study of Russian disinformation, Active Measures: The Secret History of Disinformation and Political Warfare. It lays out a century of Soviet, East European, and Russian disinformation, beginning with an elaborate and successful operation against the White Russian expatriate resistance to Bolshevik rule in the 1920s. Rid has dug into recently declassified material using digital tools that enable him to tell previously untold tales – the Soviets’ remarkable success in turning opposition to US nuclear missiles in Europe into a mass movement (and the potential shadow it casts on the legendary Adm. Hyman Rickover, father of the US nuclear navy), the unimpressive record of US disinformation compared to the ruthless Soviet version, and the fake American lobbyist (and real German agent) who persuaded a German conservative legislator to save Willy Brandt’s leftist government. We close with two very different predictions about the kind of disinformation we’ll see in the 2020 campaign.
David Kris, Paul Rosenzweig, and I dive deep on the big tech issue of the COVID-19 contagion: Whether (but mostly how) to use mobile phone location services to fight the virus. We cover the Israeli approach, as well as a host of solutions adopted in Singapore, Taiwan, South Korea, and elsewhere. I’m a big fan of Singapore, which produced in a week an app that Nick Weaver thought would take a year.
In our interview, evelyn douek, currently at the Berkman Klein Center and an SJD candidate at Harvard, takes us deep into content moderation. Displaying a talent for complexifying an issue we all want to simplify, she explains why we can’t live with social platform censorship and why we can’t live without it. She walks us through the growth of content moderation, from spam, through child porn, and on to terrorism and “coordinated inauthentic behavior” – the identification of which, evelyn assures me, does not require an existentialist dance instructor. Instead, it’s the latest and least easily defined category of speech to be suppressed by Big Tech. It’s a mare’s nest, but I, for one, intend to aggravate our new Tech Overlords for as long as possible.
That’s the question I debate with David Kris and Nick Weaver as we explore the ways in which governments are using location data to fight the spread of COVID-19. Phone location data is being used to enforce quarantines and to track contacts with infected people. It’s useful for both, but Nick thinks the second application may not really be ready for a year – too late for this outbreak.
Our interview subject is Jason Healey, who has a long history with Cyber Command and a deep recent oeuvre of academic commentary on cyber conflict. Jay explains Cyber Command’s doctrine of “persistent engagement” and “defending forward” in words that I finally understand. It makes sense in terms of Cyber Command’s aspirations as well as the limitations it labored under in the Obama Administration, but I end up wondering whether it’s going to be different from “deterrence through having the best offense.” Nothing wrong with that, in my view – as long as you have the best offense by a long shot, something that is by no means proven.
If your podcast feed has suddenly become a steady diet of more or less the same COVID-19 stories, here’s a chance to listen to cyber experts talk about what they know about – cyberlaw. Our interview is with Elsa Kania, adjunct senior fellow at the Center for a New American Security and one of the most prolific students of China, technology, and national security. We talk about the relative strengths and weaknesses of the artificial intelligence ecosystems in the two countries.
Our interview in this episode is with Glenn Gerstell, freed at last from some of the constraints that come with government service. We cover the Snowden leaks, how private and public legal work differs (hint: it’s the turf battles), Cyber Command, Russian election interference, reauthorization of FISA, and the daunting challenges the US (and its Intelligence Community) will face as China’s economy begins to reinforce its global security ambitions.
This episode features a lively (and – fair warning – long) interview with Daphne Keller, Director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center. We explore themes from her recent paper on regulation of online speech. It turns out that more or less everyone has an ability to restrict users’ speech online, and pretty much no one has both authority and an interest in fostering free-speech values. The ironies abound: Conservatives may be discriminated against, but so are Black Lives Matter activists. In fact, it looks to me as though any group that doesn’t think it’s the victim of biased content moderation would be well advised to scream as loudly about censorship as the others, for fear of losing the victimization sweepstakes. Feeling a little like a carny at the sideshow, I serve up one solution for biased moderation after another, and Daphne methodically shoots them down. Transparency? None of the companies is willing, and the government may have a constitutional problem forcing them to disclose how they make their moderation decisions. Competition law? A long haul, and besides, most users like a moderated Internet experience. Regulation? Only if we take the First Amendment back to the heyday of broadcast regulation. As a particularly egregious example of foreign governments and platforms ganging up to censor Americans, we touch on the CJEU’s insufferable decision encouraging the export of European defamation law to the US – with an extra margin of censorship to keep the platform from any risk of liability. I offer to risk my Facebook account to see if that’s already happening.
This episode features an interview on the Bezos phone flap with David Kaye and Alex Stamos. David is a UN Special Rapporteur and clinical professor of law at UC Irvine who first drew attention to an FTI Consulting report concluding that the Saudis did hack Bezos’ phone. Alex is director of the Stanford Internet Observatory and was the CSO at Facebook; he thinks the technical case against the Saudis needs work, and he calls for a supplemental forensic review of the phone.