Our news roundup for this episode is heavy on China and tech policy. And most of the news is bad for tech companies. Jordan Schneider tells us that China is telling certain agencies not to purchase Teslas or allow them on the premises, for fear that Elon Musk's famously intrusive record-keeping systems will give…
Our interview is with Alex Stamos, who lays out a complex debate over child sexual abuse that's now roiling Brussels. The application of European privacy standards and AI hostility to internet communications providers has called into question the one tool that has reduced online child sex predation. Scanning for sex abuse images works…
In our 326th episode of the Cyberlaw Podcast, Stewart Baker interviews Lauren Willard, who serves as Counselor to the Attorney General. Stewart is also joined by Nick Weaver (@ncweaver), David Kris (@DavidKris), and Paul Rosenzweig (@RosenzweigP).
Our interview this week focuses on section 230 of the Communications Decency Act and features Lauren Willard,…
This is the week when the movement to reform Section 230 of the Communications Decency Act got serious. The Justice Department released a substantive report suggesting multiple reforms. I was positive about many of them (my views here). Meanwhile, Sen. Josh Hawley (R-MO) has proposed a somewhat similar set of changes in his…
Our interview is with Mara Hvistendahl, investigative journalist at The Intercept and author of a new book, The Scientist and the Spy: A True Story of China, the FBI, and Industrial Espionage, as well as a deep WIRED article on the least known Chinese AI champion, iFlytek. Mara’s book raises…
J.P. Morgan once responded to President Teddy Roosevelt’s charge that he’d violated federal antitrust law by saying, “If we have done anything wrong, send your man to see my man, and we’ll fix it up.” That used to be the gold standard for monopolist arrogance in dealing with government, but Google and Apple have put J.P. Morgan in the shade with their latest instruction to the governments of the world: You can’t use our app to trace COVID-19 infections unless you promise not to use it for quarantine or law enforcement purposes. They are only able to do this because the two companies have more or less 99% of the phone OS market. That’s more control than Morgan had of US railways, and their dominance apparently allows them to say, “If you think we’ve done something wrong, don’t bother to send your man; ours is too busy to meet.” Nate Jones and I discuss the question of Silicon Valley overreach in this episode. (In that vein, I apologize unreservedly to John D. Rockefeller, to whom I mistakenly attributed the quote.) The sad result is that a promising technological adjunct to contact tracing has been delayed and muddled by ideological engineers to the point where it isn’t likely to be deployed and used in a timely way.
We begin with a new US measure to secure the supply chain for a piece of critical infrastructure – the bulk power grid. David Kris unpacks a new Executive Order restricting purchases of foreign equipment for the grid.
Nick Weaver, meanwhile, explains the remarkable extent of surveillance built into Xiaomi phones and questions the company’s claim that it was merely acquiring pseudonymous ad-related data like others in the industry.
It wouldn't be the Cyberlaw Podcast if we didn't wrangle over mobile phones and the coronavirus. Mark MacCarthy says that several countries – Australia, the UK, and perhaps France – are deviating from the Gapple model for using phones for infection tracing, even as others have bought in. India, meanwhile, is planning a much more government-driven approach to using phone apps to combat the pandemic.
In this episode, I interview Thomas Rid about his illuminating study of Russian disinformation, Active Measures: The Secret History of Disinformation and Political Warfare. It lays out a century of Soviet, East European, and Russian disinformation, beginning with an elaborate and successful operation against the White Russian expatriate resistance to Bolshevik rule in the 1920s. Rid has dug into recently declassified material using digital tools that enable him to tell previously untold tales – the Soviets’ remarkable success in turning opposition to US nuclear missiles in Europe into a mass movement (and the potential shadow it casts on the legendary Adm. Hyman Rickover, father of the US nuclear navy), the unimpressive record of US disinformation compared to the ruthless Soviet version, and the fake American lobbyist (and real German agent) who persuaded a German conservative legislator to save Willy Brandt’s leftist government. We close with two very different predictions about the kind of disinformation we’ll see in the 2020 campaign.
The Cyberspace Solarium Commission’s report was released into the teeth of the COVID-19 crisis and hasn’t attracted the press it probably deserved. But the commissioners included four sitting Congressmen who plan to push for adoption of its recommendations. And the Commission is going to be producing more material – and probably more press attention – over the coming weeks. In this episode, I interview Sen. Angus King, co-chair of the Commission, and Dr. Samantha Ravich, one of the commissioners.
We focus almost exclusively on what the Commission's recommendations mean for the private sector. The Commission has proposed a remarkably broad range of cybersecurity measures for business. It recommends a new products liability regime for assemblers of final goods (including software) who don't promptly patch vulnerabilities. It proposes two new laws requiring notice not only of personal data breaches but also of other significant cyber incidents. It calls for a federal privacy and security law – without preemption. It updates Sarbanes-Oxley to include cybersecurity principles. And lest you think the Commission is in love with liability, it also proposes liability immunities for critical infrastructure owners operating under government supervision during a crisis. We cover all these proposals, plus the Commission's recommendation of a new role for the Intelligence Community in providing support to critical US companies.
There's a fine line between legislation addressing deepfakes and legislation that is itself a deepfake. Nate Jones reports on the only federal legislation addressing the problem so far. I claim that it falls well short of a serious regulatory effort – and comes pretty close to being a fake law.
In contrast, India seems serious about imposing liability on companies whose unbreakable end-to-end crypto causes harm, at least to judge from the howls of the usual defenders of such crypto. David Kris explains how the law will work. I ask why Silicon Valley gets to impose the externalities of encryption-facilitated crime on society without consequence, when we'd never allow tech companies to say that society should pick up the tab for their pollution because their products are so cool. In related news, the FBI may be turning the terrorist attack at the Pensacola naval air station into a slow-motion replay of the San Bernardino fight with Apple, this time with more top cover.