Our interview is with Mara Hvistendahl, investigative journalist at The Intercept and author of a new book, The Scientist and the Spy: A True Story of China, the FBI, and Industrial Espionage, as well as a deep WIRED article on the least-known Chinese AI champion, iFlytek.
Peter Singer continues his excursion into what he calls “useful fiction” – thrillers that explore real-world implications of emerging technologies – in Burn-In: A Novel of the Real Robotic Revolution, to be released May 26, 2020. This interview explores a thoroughly researched (and footnoted!) host of new technologies, many already in production or on the horizon, all packed inside a plot-driven novel. The book is a painless way to understand what these technologies make possible and their impact on actual human beings. And the interview ranges widely over the policy implications, plus a few plot spoilers.
J.P. Morgan once responded to President Teddy Roosevelt’s charge that he’d violated federal antitrust law by saying, “If we have done anything wrong, send your man to see my man, and we’ll fix it up.” That used to be the gold standard for monopolist arrogance in dealing with government, but Google and Apple have put J.P. Morgan in the shade with their latest instruction to the governments of the world: You can’t use our app to trace COVID-19 infections unless you promise not to use it for quarantine or law enforcement purposes. They are only able to do this because the two companies have more or less 99% of the phone OS market. That’s more control than Morgan had of US railways, and their dominance apparently allows them to say, “If you think we’ve done something wrong, don’t bother to send your man; ours is too busy to meet.” Nate Jones and I discuss the question of Silicon Valley overreach in this episode. (In that vein, I apologize unreservedly to John D. Rockefeller, to whom I mistakenly attributed the quote.) The sad result is that a promising technological adjunct to contact tracing has been delayed and muddled by ideological engineers to the point where it isn’t likely to be deployed and used in a timely way.
David Kris, Paul Rosenzweig, and I dive deep on the big tech issue of the COVID-19 contagion: Whether (but mostly how) to use mobile phone location services to fight the virus. We cover the Israeli approach, as well as a host of solutions adopted in Singapore, Taiwan, South Korea, and elsewhere. I’m a big fan of Singapore, which produced in a week an app that Nick Weaver thought would take a year.
In our interview, evelyn douek, currently at the Berkman Klein Center and an SJD candidate at Harvard, takes us deep into content moderation. Displaying a talent for complexifying an issue we all want to simplify, she explains why we can’t live with social platform censorship and why we can’t live without it. She walks us through the growth of content moderation, from spam, through child porn, and on to terrorism and “coordinated inauthentic behavior” – the identification of which, evelyn assures me, does not require an existentialist dance instructor. Instead, it’s the latest and least easily defined category of speech to be suppressed by Big Tech. It’s a mare’s nest, but I, for one, intend to aggravate our new Tech Overlords for as long as possible.
If your podcast feed has suddenly become a steady diet of more or less the same COVID-19 stories, here’s a chance to listen to cyber experts talk about what they know about – cyberlaw. Our interview is with Elsa Kania, adjunct senior fellow at the Center for a New American Security and one of the most prolific students of China, technology, and national security. We talk about the relative strengths and weaknesses of the artificial intelligence ecosystems in the two countries.
This episode features a lively (and – fair warning – long) interview with Daphne Keller, Director of the Program on Platform Regulation at Stanford University’s Cyber Policy Center. We explore themes from her recent paper on regulation of online speech. It turns out that more or less everyone has an ability to restrict users’ speech online, and pretty much no one has both authority and an interest in fostering free-speech values. The ironies abound: Conservatives may be discriminated against, but so are Black Lives Matter activists. In fact, it looks to me as though any group that doesn’t think it’s the victim of biased content moderation would be well advised to scream as loudly about censorship as the others for fear of losing the victimization sweepstakes. Feeling a little like a carny at the sideshow, I serve up one solution for biased moderation after another, and Daphne methodically shoots them down. Transparency? None of the companies is willing, and the government may have a constitutional problem forcing them to disclose how they make their moderation decisions. Competition law? A long haul, and besides, most users like a moderated Internet experience. Regulation? Only if we take the First Amendment back to the heyday of broadcast regulation. As a particularly egregious example of foreign governments and platforms ganging up to censor Americans, we touch on the CJEU’s insufferable decision encouraging the export of European defamation law to the US – with an extra margin of censorship to keep the platform from any risk of liability. I offer to risk my Facebook account to see if that’s already happening.
In breaking news from 1995, the Washington Post takes advantage of a leaked CIA history paper to retell the remarkable tale of Crypto AG, a purveyor of encryption products to dozens of governments – and allegedly a wholly controlled subsidiary of US and German intelligence. Nick Weaver, Paul Rosenzweig, and I are astonished at the derring-do and unapologetic enthusiasm for intelligence collection. I mean, really: The Pope?
This week’s interview is with Jonathan Reiber, a writer and strategist in Oakland, California, and former Chief Strategy Officer for Cyber Policy and Speechwriter at the Department of Defense; he is currently a senior advisor at Technology for Global Security and a visiting scholar at the UC Berkeley Center for Long-Term Cybersecurity. His recent report offers a candid view of strained relations between Silicon Valley and the Pentagon. The interview explores the reasons for that strain, the importance of bridging the gap, and how that can best be done.
This episode features an interview on the Bezos phone flap with David Kaye and Alex Stamos. David is a UN Special Rapporteur and clinical professor of law at UC Irvine who first drew attention to an FTI Consulting report concluding that the Saudis did hack Bezos’ phone. Alex is director of the Stanford Internet Observatory and was the CSO at Facebook; he thinks the technical case against the Saudis needs work, and he calls for a supplemental forensic review of the phone.
This week Maury Shenk guest hosts the podcast.
Even with a “phase one” trade deal with China apparently agreed, there’s of course plenty still at stake between China and the US in the tech space. Nate Jones reports on the Chinese government order for government offices to purge foreign software and equipment within three years and the plans of Arm China to develop chips using “state-approved” cryptography. Nick Weaver and I agree that, while there are some technical challenges on this road, there’s a clear Chinese agenda to end its dependency on US suppliers.
The apparent terror attack at Naval Air Station Pensacola spurs a debate among our panelists about whether the FISA Section 215 metadata program deserves to be killed, as Congress has increasingly signaled it intends to do. If the Pensacola attack involved multiple parties acting across US borders, still a live possibility as we talked, then it would be just about the first such attack since 9/11 – and exactly the kind of attack the metadata program was designed to identify in advance.