This is the week when the movement to reform Section 230 of the Communications Decency Act got serious. The Justice Department released a substantive report suggesting multiple reforms. I was positive about many of them (my views here). Meanwhile, Sen. Josh Hawley (R-MO) has proposed a somewhat similar set of changes in his bill, introduced this week. Nate Jones and I dig into the provisions, and both of us expect interest from Democrats as well as Republicans.

The National Security Agency has launched a pilot program to provide secure DNS resolver services for US defense contractors. If that’s such a good idea, I ask, why doesn’t everybody do it? Nick Weaver tells us they can: Phil Reitinger’s Global Cyber Alliance offers Quad9 for this purpose.
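(For listeners who want to try this themselves, pointing a machine at Quad9 is a one-line change. The sketch below uses Quad9’s published public resolver addresses on a Linux system; your OS or router will have an equivalent DNS setting.)

```
# /etc/resolv.conf – send DNS queries to Quad9's threat-blocking resolvers
nameserver 9.9.9.9
nameserver 149.112.112.112
```

The same addresses can be entered in the DNS fields of most home routers and operating-system network settings.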

Gus Hurwitz brings us up to date on a host of European cyberlaw developments, from terror takedowns (Reuters, TechCrunch) to competition law to the rise of a disturbingly unaccountable and self-confident judiciary. Microsoft’s Brad Smith, meanwhile, wins the prize for best marriage of business self-interest and Zeitgeist in the twenty-first century.

Hackers used LinkedIn’s private messaging feature to send defense contractor employees documents laced with malicious code, which the employees were tricked into opening. Nick points out just what a boon LinkedIn is for cyberespionage (including his own), and I caution listeners not to display their tats on LinkedIn.

Speaking of fools who kind of have it coming, Nick tells the story of the now-former eBay executives who have been charged with sustained and imaginatively over-the-top harassment of the owners of a newsletter that had not been deferential to eBay. (Wired, DOJ)

It’s hard to like the defendants in that case, I argue, but the law they’ve been charged under is remarkably sweeping. Apparently it’s a felony to intentionally use the internet to cause substantial emotional distress. Who knew? Most of us who use Twitter thought that was its main purpose. I also discover that the law extends special protections against internet threats and harassment not only to service animals but also to horses of any kind. Other livestock are apparently left unprotected. PETA, call your office.

Child abusers cheered when Zoom buckled to criticism of its limits on end-to-end encryption, but Nick insists that the new policy offers safeguards for policing misuse of the platform. (Ars Technica, Zoom)

I take a minute to roast Republicans in Congress who have announced that no FISA reauthorization will be adopted until John Durham’s investigation of FISA abuses is done, which makes sense until you realize that the FISA provisions up for reauthorization have nothing to do with the abuses Durham is investigating. So we’re giving international terrorists a break from scrutiny simply because the President can’t keep the difference straight.

Nate notes that a story previewed in April has now been confirmed: Team Telecom is recommending the blocking of a Hong Kong-US undersea cable over national security concerns.

Gus reminds us that a bitter trade fight between the US and Europe over taxes on Silicon Valley services is coming. (Politico, Ars Technica)

Nick and I mourn the complete meltdown of mobile phone contact tracing. I argue that from here on out, some portion of coronavirus deaths should be classified as mechanogenic (caused by engineering malpractice). Nick proposes instead a naming convention built around the Therac-25.

And we close with a quick look at the latest data dump from Distributed Denial of Secrets. Nick thinks it’s strikingly contemporaneous but also surprisingly unscandalizing.


Download the 321st Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Our interview this week is with Chris Bing, a cybersecurity reporter with Reuters, and John Scott-Railton, Senior Researcher at Citizen Lab and PhD student at UCLA. John coauthored Citizen Lab’s report last week on BellTroX and Indian hackers for hire, and Chris reported for Reuters on the same organization’s activities – and criminal exposure – in the United States. The most remarkable aspect of the story is how thoroughly normalized hacking legal and lobbying opponents seems to have become, at least in parts of the US legal and investigative ecosystem. I suggest that instead of a long extradition battle, the US give the head of BellTroX a ticket to the US and a guaranteed income for the next few years as a witness against his customers.

In the news roundup, Nick Weaver tells the remarkable story of how Facebook funded an exploit aimed at taking down a particularly vile online abuser of young girls who was nearly invulnerable because he was using Tails, the secure, thumb-drive-based operating system (Vice, Gizmodo). This is a great story because it really doesn’t fit into any of the stilted narratives into which most internet security stories are usually jammed.

Nick also notes Big Tech’s pledge to do more to stop child abuse online. I suggest that only Dr. Evil would be impressed by the amounts of money being invested in the campaign.

Well, another week, another Zoom bomb. Now the company is taking heat because it terminated several Tiananmen Square commemorative Zoom sessions after China complained (NYT, Zoom). David Kris and I don’t think Zoom had much choice about cutting off the Chinese customers. Terminating the US account holder who organized a session, however, was a bad move – and one that’s since been corrected by the company.

Nate Jones and I square off again for Round 545 on content moderation, spurred this time by reports that Sen. Josh Hawley is drafting legislation inspired by the Trump Administration’s Section 230 EO. Meanwhile several Republican senators are pushing the FCC to act on the order. Nate and I find rare bipartisan common ground on the idea that Congress should require social media companies to take down foreign government online messaging – and maybe work with the US government to stop it at the source.

David reports on a fairly (and deservedly) obscure EU cloud independence project. It seems to have been embraced by Microsoft, which I accuse of going full AT&T – embracing government regulation as a competitive differentiator. As if to prove my point, Microsoft announces that it’s getting out of the business of doing facial recognition for the police – until it can persuade Congress to regulate its competitors.

Why are spies targeting vaccine research? Nate highlights the excellent Risky Biz newsletter analysis of what drives COVID-19 cyberespionage.

Nick flags the potential significance of ARM wrestling, as the UK chip designer ARM fights its JV partner for control of its Chinese joint venture. Nick also assigns a “moderate” threat label to the latest Universal Plug n Pwn exploit. It’s only moderate because there are so many pwned IoT devices already in a position to DDoS targets of opportunity.

In quick hits, I note that Israel has halted its controversial use of intelligence capabilities to monitor the spread of the coronavirus, but the government reserves the right to revive monitoring if a second wave shows up (JPost, Yahoo). Poor Brewster Kahle is looking like an internet hippie who fell asleep at Woodstock and woke up at Altamont. The Internet Archive is ending its program of offering free, unrestricted copies of e-books, but the publishers who sued over that program may decide to keep suing until they’ve broken his entire “digital library” model, and maybe the Internet Archive as well (NYT, Ars Technica). That would be a shame. Finally, you can have a thousand talents, but honesty may not be one of them. Charles Lieber, the Harvard University professor arrested for lying about his lucrative China contracts, has now been indicted on false statement charges.



Download the 320th Episode (mp3).




Our interview with Ben Buchanan begins with his report on how artificial intelligence may influence national security and cybersecurity. Ben’s quick takes: better for defense than offense, and probably even better for propaganda. The best part, in my view, is Ben’s explanation of how to poison the AI that’s trying to hack you – and the scary possibility that China is already poisoning Silicon Valley’s AI.

By popular request, we’ve revisited a story we skipped last week to do a pretty deep dive on the decision (for now) that Capital One can’t claim work-product protection for a Mandiant intrusion-response report prepared after its breach. Steptoe litigator Charles Michael and I talk about how IR firms and CISOs should respond to the decision, assuming it stands up on appeal.

Maury Shenk notes the latest of about a hundred warnings – this time from Christopher Krebs, the director of DHS’s cybersecurity agency, and the head of Britain’s GCHQ – that China’s intelligence service, and every other intelligence service on the planet, seems to be targeting COVID-19 research. I ask whether sauce for the Western goose should be sauce for the Chinese gander.

Maury takes us through the week in internet copyright fights. Ideological copyright enforcement meets the world’s dumbest takedown bots as Twitter removes a Trump campaign video tribute to George Floyd due to a copyright claim. The video is still available on Trump’s YouTube channel.

We puzzle over Instagram’s failure to provide a license to users of its embedding API. The announcement could come as an unwelcome surprise to users who believed that embedding images, rather than hosting them directly, provides insulation against copyright claims.

Finally, much as I love Brewster Kahle, I’m afraid that Kahle’s latest move marks his transition from internet hippie to “holy fool” – and maybe a broke one. His Internet Archive, the online library best known for maintaining the Internet Wayback Machine, makes scanned copies of books available to the public on terms that resemble a library’s. The setup was arguably legal – and no one was suing – until Kahle decided to let people download more books than his organization had paid for. Now he faces an ugly copyright lawsuit.

Speaking of ugly lawsuits, Mark MacCarthy and Paul Rosenzweig comment on the Center for Democracy and Technology’s complaint that Trump violated tech companies’ right to free speech with his executive order on Section 230. (Reuters, NYT) I question whether this lawsuit will get far.

This Week in Working the Ref: Facebook and Mark Zuckerberg are facing criticism from users, competitors, and civil rights organizations for failing to censor the people those groups hate. (Ars Technica, Politico) Meanwhile, Snap scores points by ending promotion of Trump’s account after concluding his tweets incited violence. I can’t help wondering what Snap would have done with FDR’s “a date which will live in infamy” speech.

Where is Nate Jones when you need him? He would love this story: A Twitter user sacrificed a Twitter account to show that Trump is treated differently than others by the platform. Of course, the panel notes, that’s pretty much what Twitter says it does.

In quick hits, I serve notice that no one should be surprised if Justice brings an adtech antitrust suit against Google. The Israeli government announces an attack on its infrastructure so late that the press has already identified and attributed its retaliatory cyberattack on Iran’s ports. And somebody pretty good – probably not the Russians, I argue – is targeting industrial firms.


Download the 319th Episode (mp3).


On July 1, 2020, the California attorney general is expected to begin enforcing the California Consumer Privacy Act (CCPA), California’s groundbreaking new privacy law, which has been in effect since January 1, 2020. The attorney general is also finalizing regulations that interpret and build upon the CCPA. To minimize the risk of potentially substantial penalties, businesses should familiarize themselves with the CCPA regulations and ensure that they are prepared to comply with the new requirements.

Join members of Steptoe’s Privacy & Cybersecurity practice for a webinar that will provide an overview of the CCPA and the attorney general’s regulations and will discuss key areas of CCPA compliance.

Date: June 11, 2020

Time: 2:00 p.m. – 3:00 p.m. EDT

Click here to register. 


This episode features an in-depth (and occasionally contentious) interview with Bart Gellman about his new book, Dark Mirror: Edward Snowden and the American Surveillance State, which can be found on his website and on Amazon. I’m tagged in the book as having been sharply critical of Gellman’s Snowden stories, and I live up to the billing in this interview. He responds to my critique in good part. Gellman offers detailed insights into Edward Snowden’s motives and relationships to foreign governments, as well as how journalism – and journalistic lawyering – is done in the Big Leagues.

Our news roundup focuses heavily on the Trump Administration’s executive order on Section 230 of the Communications Decency Act (Wall Street Journal, Washington Post). I end up debating all three of my co-panelists – Nate Jones, Nick Weaver, and Evelyn Douek, rejoining us on a particularly good day, given her expertise. We agree to disagree on whether Silicon Valley applies its rules in a fashion that discriminates against conservatives. More interesting is the rough consensus that Silicon Valley’s heavy influence over our speech is worth worrying about and that transparency is one of the better ways to discipline that influence. No one but me is willing to consider the possibility that the executive order represents a good step toward transparency.

Nate and I find much room to agree, though, on the tragicomedy emerging from the reauthorization of three relatively straightforward FISA provisions. Stay tuned for a House-Senate conference, plus heavy lobbying of the President.

Nick explains NSA’s outing of Russian military hackers targeting mail relay software (CyberScoop, NSA).

Nate and I cover the latest in US-China decoupling – the FCC and Justice Department enthusiasm for kicking Chinese telecom firms out of the country and, in a possible new front, heavy scrutiny being given to Chinese-built transformers.

Evelyn tells us that, as a visa holder, she’s definitely hoping that the courts overturn US rules forcing visa applicants to disclose their social media handles. I predict that her hopes will be dashed.

Finally, Nick explains who needs a “quantum holographic catalyzer” to protect against 5G telecom emissions. Quick answer: No one. It’s a fake cure for a fake malady.


Download the 318th Episode (mp3).



Our interview is with Mara Hvistendahl, investigative journalist at The Intercept and author of a new book, The Scientist and the Spy: A True Story of China, the FBI, and Industrial Espionage, as well as a deep WIRED article on the least known Chinese AI champion, iFlytek. Mara’s book raises questions about the expense and motivations of the FBI’s pursuit of commercial spying from China.

In the News Roundup, Gus Hurwitz, Nick Weaver, and I wrestle with whether Apple’s lawsuit against Corellium is really aimed at the FBI. The answer looks to be affirmative, since an Apple victory would make it harder for contractors to find hackable flaws in the iPhone.

Germany’s top court ruled that German intelligence can no longer freely spy on foreigners – or share intelligence with other western countries. The court seems to be trying to leave the door open to something that looks like intelligence collection, but the hurdles are many. Which reminds me that I somehow missed the 100th anniversary of the Weimar Republic.

There’s Trouble Right Here in Takedown City. Gus lays out all the screwy and maybe even dangerous takedown decisions that came to light last week. YouTube censored epidemiologist Knut Wittkowski for opposing lockdown. It suspended and then reinstated a popular Android podcast app for the crime of cataloging COVID-19 content. We learned that anyone can engage in a self-help right to be forgotten with a bit of backdating and a plagiarism claim. Classical musicians are taking it on the chin in their battle with aggressive copyright enforcement bots and a sluggish Silicon Valley response.

In that climate, who can blame the Supreme Court for ducking cases asking for a ruling on the scope of Section 230? They’ve dodged one already, and we predict the same outcome in the next one.

Finally, Gus unpacks the recent report on the DMCA from the Copyright Lobbying Office – er, the Copyright Office.

With relief, we turn to Matthew Heiman for more cyber and less law. It sure looks like Israel launched a disruptive cyberattack on an Iranian port facility. It was probably a response to Iranian cyber meddling with Israeli water systems.

Nick covers Bizarro-world cybersecurity: It turns out malware authors now can hire their own black-market security pentesters.

I ask about open-source security and am met with derisive laughter, which certainly seems fair after flaws were found in dozens of applications.

I also cover a Turing Test for the 21st Century: Can you sext successfully with an AI without knowing it’s an AI? And the news from AI speech imitation is that Presidents Trump and Obama have fake-endorsed Lyrebird.

Gus reminds us that most of privacy law is about unintended consequences, like telling Grandma she’s violating GDPR by posting her grandchildren’s photos without their parents’ consent.

Beerint at last makes its appearance, as it turns out that military and intelligence personnel can be tracked with a beer enthusiast app.

Finally, in the wake of Joe Rogan’s deal with Spotify, I offer assurances that the Cyberlaw Podcast is not going to sell out for $100 million.


Download the 317th Episode (mp3).




Peter Singer continues his excursion into what he calls “useful fiction” – thrillers that explore real-world implications of emerging technologies – in Burn-In: A Novel of the Real Robotic Revolution, to be released May 26, 2020. This interview explores a thoroughly researched (and footnoted!) host of new technologies, many already in production or on the horizon, all packed inside a plot-driven novel. The book is a painless way to understand what these technologies make possible and their impact on actual human beings. And the interview ranges widely over the policy implications, plus a few plot spoilers.

Continue Reading Episode 316: Our AI Future – Sexbots, Toilet Drones, and Robocops?


J.P. Morgan once responded to President Teddy Roosevelt’s charge that he’d violated federal antitrust law by saying, “If we have done anything wrong, send your man to see my man, and we’ll fix it up.” That used to be the gold standard for monopolist arrogance in dealing with government, but Google and Apple have put J.P. Morgan in the shade with their latest instruction to the governments of the world: You can’t use our app to trace COVID-19 infections unless you promise not to use it for quarantine or law enforcement purposes. They are only able to do this because the two companies have more or less 99% of the phone OS market. That’s more control than Morgan had of US railways, and their dominance apparently allows them to say, “If you think we’ve done something wrong, don’t bother to send your man; ours is too busy to meet.” Nate Jones and I discuss the question of Silicon Valley overreach in this episode. (In that vein, I apologize unreservedly to John D. Rockefeller, to whom I mistakenly attributed the quote.) The sad result is that a promising technological adjunct to contact tracing has been delayed and muddled by ideological engineers to the point where it isn’t likely to be deployed and used in a timely way.

Continue Reading Episode 315: Google to Washington: “Send your man to see my man. And we’ll stiff him.”

While most businesses have been preoccupied with navigating the effects of the COVID-19 pandemic, a significant change to businesses’ data security obligations has taken effect in New York. On March 21, 2020, the second part of the Stop Hacks and Improve Electronic Data Security Act (the SHIELD Act) went into effect in New York State. The SHIELD Act was signed into law in July 2019 and part of the legislation, amending New York’s data breach notification law, went into effect last October. The new data security requirements are not limited to a specific industry, but apply to any person or business that owns or licenses computerized data that includes the private information of New York residents.[1]

The SHIELD Act mandates that a covered business “develop, implement and maintain reasonable safeguards to protect the security, confidentiality and integrity of the private information, including but not limited to, disposal of data.” To comply with the SHIELD Act, a business’s data security program must include the following:

  • “Reasonable administrative safeguards,” such as:
    • Designating “one or more employees to coordinate the security program”;
    • Identifying “reasonably foreseeable internal and external risks”;
    • Assessing “the sufficiency of safeguards in place to control the identified risks”;
    • Training and managing “employees in the security program practices and procedures”;
    • Selecting “service providers capable of maintaining appropriate safeguards” and requiring “those safeguards by contract”; and
    • Adjusting “the security program in light of business changes or new circumstances.”
  • “Reasonable technical safeguards,” such as:
    • Assessing “risks in network and software design”;
    • Assessing “risks in information processing, transmission and storage”;
    • Detecting, preventing and responding “to attacks or system failures”; and
    • Regularly testing and monitoring “the effectiveness of key controls, systems and procedures.”
  • “Reasonable physical safeguards,” such as:
    • Assessing “risks of information storage and disposal”;
    • Detecting, preventing, and responding “to intrusions”;
    • Protecting “against unauthorized access to or use of private information during or after the collection, transportation, and destruction or disposal of the information”; and
    • Disposing “of private information within a reasonable amount of time after it is no longer needed for business purposes by erasing electronic media so that the information cannot be read or reconstructed.”

A small business[2] complies with the SHIELD Act’s data security program requirements where its “security program contains reasonable administrative, technical and physical safeguards that are appropriate for the size and complexity of the small business, the nature and scope of the small business’s activities, and the sensitivity of the personal information the small business collects from or about consumers.”

In addition, any entity that is subject to and in compliance with (i) regulations promulgated pursuant to Title V of the federal Gramm-Leach-Bliley Act, (ii) regulations implementing the federal Health Insurance Portability and Accountability Act (HIPAA) and the federal Health Information Technology for Economic and Clinical Health Act (HITECH Act), (iii) the New York State Department of Financial Services Cybersecurity Regulation, or (iv) any other federal or New York State data security rule, regulation, or statute, is deemed compliant with the SHIELD Act’s data security mandate.

The New York State Attorney General is empowered to enforce the SHIELD Act’s data security requirements and may seek injunctive relief and damages of up to $5,000 per violation. The Act, however, explicitly excludes a private right of action under the data security requirements section.

[1] “Private information” is defined in N.Y. Gen. Bus. § 899-aa and includes “any information concerning a natural person which, because of name, number, personal mark, or other identifier, can be used to identify such natural person” in combination with “(1) social security number; (2) driver’s license number or non-driver identification card number; (3) account number, credit or debit card number, in combination with any required security code, access code, password or other information that would permit access to an individual’s financial account; (4) account number, credit or debit card number, if circumstances exist wherein such number could be used to access an individual’s financial account without additional identifying information, security code, access code, or password; or (5) biometric information[.]” “Private information” also includes “a user name or e-mail address in combination with a password or security question and answer that would permit access to an online account.”

[2] A “small business” is defined as “any person or business with (i) fewer than fifty employees; (ii) less than three million dollars in gross annual revenue in each of the last three fiscal years; or (iii) less than five million dollars in year-end total assets, calculated in accordance with generally accepted accounting principles.”

We begin with a new US measure to secure the supply chain for critical infrastructure – the bulk power grid. David Kris unpacks a new Executive Order restricting purchases of foreign equipment for the grid.

Nick Weaver, meanwhile, explains the remarkable extent of surveillance built into Xiaomi phones and questions the company’s claim that it was merely acquiring pseudonymous ad-related data like others in the industry.

It wouldn’t be the Cyberlaw Podcast if we didn’t wrangle over mobile phones and the coronavirus. Mark MacCarthy says that several countries – Australia, the UK, and perhaps France – are deviating from the Gapple model for using phones for infection tracing, even as several others have bought in. India, meanwhile, is planning a much more government-driven approach to using phone apps to combat the pandemic.

Continue Reading Episode 314: Mirror-Image Decoupling