On May 12, 2021, President Biden signed a landmark Executive Order to improve and modernize the federal government’s cybersecurity infrastructure. The Executive Order comes in the wake of numerous cyber incidents targeting the United States, including the so-called SolarWinds, Microsoft Exchange, and Colonial Pipeline incidents. The Executive Order will directly affect government contractors, including companies that sell software to the government or provide IT services. More broadly, but less directly, the Executive Order is likely to influence the informal, and eventually formal, development of cybersecurity standards for software and hardware makers and providers of online services generally, even when the government is not a customer.

President Biden’s Executive Order takes the following steps:

  • Removing barriers to sharing threat information
    • The Executive Order helps facilitate the sharing of cyber threat and incident information between IT service providers and federal government agencies by (1) removing contractual barriers to such exchanges and (2) requiring the reporting of information about cyber incidents to federal agencies.
  • Strengthening federal government cybersecurity
    • The Executive Order requires the federal government to adopt cybersecurity best practices, including “advanc[ing] toward Zero Trust Architecture; accelerat[ing] movement to secure cloud services…centraliz[ing] and streamlin[ing] access to cybersecurity data to drive analytics for identifying and managing cybersecurity threats; and invest[ing] in both technology and personnel to match these modernization goals.” As part of these efforts, federal agencies are ordered to “adopt multi-factor authentication and encryption for data at rest and in transit, to the maximum extent consistent with Federal records laws and other applicable laws.”
  • Enhancing software supply chain security
    • The Executive Order mandates the establishment of minimum security standards for software sold to the federal government. In particular, the standards must address:
      • “Secure software development environments;
      • “Generating and, when requested by a purchaser, providing artifacts [e.g., data] that demonstrate conformance to the processes implemented to ensure secure software development environments”;
      • “Employing automated tools, or comparable processes, to maintain trusted source code supply chains, thereby ensuring the integrity of the code”;
      • “Employing automated tools, or comparable processes, that check for known and potential vulnerabilities and remediate them, which shall operate regularly, or at a minimum prior to product, version, or update release”;
      • “Providing, when requested by a purchaser, artifacts of the execution of the tools and processes described [in the prior two bullets] and making publicly available summary information on completion of these actions, to include a summary description of the risks assessed and mitigated”;
      • “Maintaining accurate and up-to-date data, provenance (i.e., origin) of software code or components, and controls on internal and third-party software components, tools, and services present in software development processes, and performing audits and enforcement of these controls on a recurring basis”;
      • “Providing a purchaser a Software Bill of Materials (SBOM) for each product directly or by publishing it on a public website” (an illustrative SBOM sketch appears after this list);
      • “Participating in a vulnerability disclosure program that includes a reporting and disclosure process”;
      • “Attesting to conformity with secure software development practices”;
      • “Ensuring and attesting, to the extent practicable, to the integrity and provenance of open source software used within any portion of a product.”
    • The Executive Order also directs development of a pilot program to create a labeling system which would allow the government (and the public) to determine whether software was developed securely.
  • Establishing a Cyber Safety Review Board
    • The Board, which is to be led by individuals from the government and the private sector, will convene following major cybersecurity incidents to review and assess such incidents, mitigation, and response efforts. This idea has been likened to the National Transportation Safety Board (NTSB) for transportation incidents.
  • Standardizing the federal government’s playbook for responding to cybersecurity vulnerabilities and incidents
    • The Executive Order promotes the implementation of “standardized response processes [to] ensure a more coordinated and centralized cataloging of incidents and tracking of agencies’ progress toward successful responses.” Various federal agencies are required to coordinate to “develop a standard set of operational procedures (playbook) to be used in planning and conducting a cybersecurity vulnerability and incident response activity respecting [Federal Civilian Executive Branch] Information Systems.”
  • Improving detection of cybersecurity vulnerabilities and incidents on federal government networks
    • The Executive Order requires the federal government to “employ all appropriate resources and authorities to maximize the early detection of cybersecurity vulnerabilities and incidents on its networks.” Such measures must “include increasing the Federal Government’s visibility into and detection of cybersecurity vulnerabilities and threats to agency networks in order to bolster the Federal Government’s cybersecurity efforts.”
  • Enhancing the federal government’s investigative and remediation capabilities
    • The Executive Order requires the formulation of “policies for agencies to establish requirements for logging, log retention, and log management.”
  • Introducing National Security System Requirements
    • The Executive Order mandates the application of the requirements set forth in the Order to National Security Systems (i.e., non-civilian systems).
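
Of the supply chain requirements listed above, the Software Bill of Materials is the most concrete technical artifact: it is simply a structured, machine-readable inventory of the components (and their provenance) inside a piece of software. As a purely illustrative sketch – not anything prescribed by the Executive Order – the Python snippet below assembles a minimal SBOM in the CycloneDX JSON style; the component names, versions, and output file name are hypothetical placeholders.

import json

# A minimal, purely illustrative SBOM in the CycloneDX JSON style (spec version 1.4).
# The components below are hypothetical placeholders, not real dependencies.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "version": 1,
    "components": [
        {
            "type": "library",
            "name": "example-crypto-lib",  # hypothetical third-party library
            "version": "2.3.1",
            "purl": "pkg:pypi/example-crypto-lib@2.3.1",  # package URL recording provenance
        },
        {
            "type": "library",
            "name": "example-logging-lib",
            "version": "1.0.4",
            "purl": "pkg:pypi/example-logging-lib@1.0.4",
        },
    ],
}

# A vendor might publish this file with each release or hand it to a purchaser on request.
with open("sbom.json", "w") as f:
    json.dump(sbom, f, indent=2)

The point is not the particular format – SPDX is an equally common choice – but that the purchaser receives a machine-readable list of everything inside the software it is buying.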

The Executive Order constitutes a major step forward in strengthening cyber defenses against the sorts of attacks that have bedeviled government agencies and private companies for decades now. Government contractors will need to comply with the new requirements that will result from the Executive Order. But even more broadly, the Executive Order and the rules that flow from it will have an impact on all companies by creating new expectations for threat and incident reporting and new standards (whether informal or formal) for cybersecurity.

Bruce Schneier joins us to talk about AI hacking in all its forms. He’s particularly interested in ways AI will hack humans, essentially preying on the rough rules of thumb programmed into our wetware – that big-eyed, big-headed little beings are cute and need to have their demands met or that intimate confidences should be reciprocated. AI may not even know what it’s doing, since machines are famous for doing what works unless there’s a rule against it. Bruce is particularly interested in law-hacking – finding and exploiting unintended consequences buried in the rules in the U.S. Code. If any part of that code will lend itself to AI hacking, Bruce thinks, it’s the tax code (insert your favorite tax lawyer joke here). It’s a bracing view of a possible near-term future.

In the news, Nick Weaver and I dig into the Colonial Pipeline ransomware attack and what it could mean for more aggressive cybersecurity action in Washington than the Biden administration was contemplating just last week as it was pulling together an executive order that focused heavily on regulating government contractors.

Nate Jones and Nick examine the stalking flap that is casting a cloud over Apple’s introduction of AirTags.

Michael Weiner takes us through a quick tour of all the pending U.S. government antitrust lawsuits and investigations against Big Tech. What’s striking to me is how much difference there is in the stakes (and perhaps the prospects for success) depending on the company in the dock. Facebook faces a serious challenge but has a lot of defenses. Amazon and Apple are being attacked on profitable but essentially peripheral business lines. And Google is staring at existential lawsuits aimed squarely at its core business.

Nate and I mull over the Russian proposal for a UN cybercrime treaty. The good news is that stopping progress in the UN is usually even easier than stopping legislation in Washington.

Nate and I also puzzle over ambiguous leaks about what DHS wants to do with private firms as it tries to monitor extremist chatter online. My guess: This is mostly about wanting the benefit of anonymity or a fake persona while monitoring public speech.

And then Michael takes us into the battle between Apple and Fortnite over access to the app store without paying the 30% cut demanded by Apple. Michael thinks we’ve mostly seen the equivalent of trash talk at the weigh-in so far, and the real fight will begin with the economists’ testimony this week.

Nick indulges a little trash talk of his own about the claim that Apple’s app review process provides a serious benefit to users, citing among other things the litigation-driven disclosure that Apple never sent emails to the 125 million users of the buggered apps it found a few years back.

Nick and I try to make sense of stories that federal prosecutors in 2020 sought phone records for three Washington Post journalists as part of an investigation into the publication of classified information that occurred in 2017.

I try to offer something new about the Facebook Oversight Board’s decision on the suspension of President Trump’s account. To my mind, a telling and discrediting portion of the opinion reveals that some of the board members thought that international human rights law required more limits on Trump’s speech – and they chose to base that on the silly notion that calling the coronavirus a Chinese virus is racist. Anyone who has read Nicholas Wade’s careful article knows that there’s lots of evidence the virus leaked from the Wuhan virology lab. If any virus in the last hundred years deserves to be named for its point of origin, then, this is it. Nick disagrees.

Nate previews an ambitious task force plan on tackling ransomware. We’ll be having the authors on the podcast soon to dig deeper into its nearly 50 recommendations.

Signal is emerging as Corporate Troll of the Year, if not the decade. Nick explains how, fresh from trolling Cellebrite, Signal took on Facebook by creating a bevy of personalized Instagram ads that take personalization to the Next Level.

Years after the fact, the New York Attorney General has caught up with the three firms that generated fake comments opposing the FCC’s net neutrality rollback. They’ll be paying fines. But I can’t help wondering why anyone thinks it’s useful to think about proposed rules by counting the number of postcards and emails that shout “yes” or “no” but offer no analysis.


Download the 361st Episode (mp3).

As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Our interview is with Kevin Roose, author of Futureproof: 9 Rules for Humans in the Age of Automation, which debunks most of the comforting stories we use to anaesthetize ourselves to the danger that artificial intelligence and digitization pose to our jobs. Luckily, he also offers some practical and very personal ideas for how to avoid being caught in the oncoming robot apocalypse.

In the news roundup, Dmitri Alperovitch and I take a few moments to honor Dan Kaminsky, an extraordinary internet security researcher and an even more extraordinarily decent man. He died too young, at 42, as Nicole Perlroth demonstrates in one of her career-best articles.

Maury Shenk and Mark MacCarthy lay out the EU’s plan to charge Apple with anticompetitive behavior in running its app store. Under regulation-friendly EU competition law, unlike the more austere U.S. version, it sure looks as though Apple is going to have trouble escaping unscathed.

Mark and I duke it out over Gov. DeSantis’s Florida bill on content moderation reform.

We agree that it will be challenged as a violation of the First Amendment and as preempted by federal section 230. Mark thinks it will fail that test. I don’t, especially if the challenge ends up in the Supreme Court, where Justice Thomas at least has already put out the “Welcome” mat.

Dmitri and I puzzle over the statement by top White House cyber official Anne Neuberger that the U.S. reprisals against Russia are so far not enough to deter further cyberattacks. We decide it’s a “Kinsley gaffe” – where a top official inadvertently utters an inconvenient truth.

This Week in Information Operations: Maury explains that China may be hyping America’s racial tensions not as a tactic to divide us but simply because it’s an irresistible comeback to U.S. criticisms of Chinese treatment of ethnic minorities. And Dmitri explains why we shouldn’t be surprised at Russia’s integrated use of hacking and propaganda. The real question is why the US has been so bad at the same work.

In shorter stories:

  • Mark covers the slooow rollout of an EU law forcing one-hour takedowns of terrorist content
  • Dmitri tells us about the evolution of ransomware into full-service doxtortion, as sensitive files of the Washington, D.C., Police Department are leaked online
  • Dmitri also notes the inevitability of more mobile phone adtech tracking scandals, such as the compromise of US military operations
  • Maury and I discuss the extent to which China’s internet giants find themselves competing, not for consumers, but for government favor, as China uses antitrust law to cement its control of the tech sector
  • Finally, Dmitri and I unpack the latest delay in DOD’s effort to achieve cybersecurity maturity through regulatory-style compliance, an effort Dmitri believes is doomed
  • And more!


Download the 360th Episode (mp3)

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Brian Egan hosts this episode of the podcast, as Stewart Baker is hiking the wilds of New Hampshire with family. Nick Weaver joins the podcast to discuss the week in ransomware, as DOJ gets serious, and the gangs do too. Justice has a new ransomware task force, and the gangs have asked for $50 million not to disclose Apple product plans compromised during a breach of Quanta.

Paul Hughes gives us details on the EU’s proposal for regulating the deployment of artificial intelligence and facial recognition technology. Brian compares the EU work to the FTC’s own principles for achieving truth and fairness while using AI.

Nick finds a lot to like in Sen. Wyden’s ‘Fourth Amendment Is Not For Sale Act,’ which would ban Clearview and government purchases of location data without a warrant.

Brian summarizes the Biden administration’s series of cyber initiatives for critical infrastructure sectors. Nick can’t resist the high-grade trolling on display in the squabble between Signal and Cellebrite. Brian evaluates the administration’s sanctions on Russia for, among other things, the SolarWinds hack.

And Nick covers the ultimate consumer supply chain attack — on password managers. Nick’s advice: “Amateurs keep their passwords in their drawers. Pros keep their passwords in their wallets.”

Download the 359th Episode (mp3)

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Our interview is with Mark Montgomery and John Costello, both staff to the Cyberspace Solarium Commission. The Commission, which issued its main report more than a year ago, is swinging through the pitch, following up with new white papers, draft legislative language, and enthusiastic advocacy for its recommendations in Congress, many of which were adopted last year. That makes it the most successful of the many cybersecurity commissions that have come and gone in Washington. And it’s not done yet. Mark and John review several of the most important legislative proposals the Commission will be following this year. I don’t agree with all of them, but they are all serious ideas and it’s a good bet that a dozen or more could be adopted in this Congress.

In the news roundup, David Kris and I cover the FBI’s use of a single search warrant to remove a large number of web shells from computers infected by China’s irresponsible use of its access to Microsoft Exchange. The use of a search (or, more accurately, seizure) warrant is a surprisingly far-reaching interpretation of federal criminal Rule 41. But despite valiant efforts, David is unable to disagree with my earlier expressed view that the tactic is lawful.

Brian Egan outlines what’s new in the Biden administration’s sanctions on Russia for its SolarWinds exploits. The short version: While some of the sanctions break new ground, as with Russian bonds, they do so cautiously.

Paul Rosenzweig, back from Costa Rica, unpacks a hacking story that has everything – terrorism, the FBI, Apple, private sector hacking, and litigation. Short version: we now know the private firm that saved Apple from the possibility of an order to hack its own phone. It’s an Australian firm named Azimuth that apparently only works for democratic governments but that is nonetheless caught up in Apple’s bully-the-cybersecurity-researchers litigation campaign.

Gus Hurwitz talks to us about the seamy side of content moderation (or at least one seamy side) – the fight against “coordinated inauthentic behavior.”

In quicker takes, Paul gives us a master class in how to read the intel community’s Annual Threat Assessment. David highlights what may be the next Chinese telecom manufacturing target, at least for the GOP, after Huawei and ZTE. I highlight the groundbreaking financial industry breach notification rule that has now finished the comment period and is moving toward adoption. And Gus summarizes the state of Silicon Valley antitrust legislation – everyone has a bill – so no one is likely to get a bill.

And more!

Download the 358th Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

They used to say that a conservative was a liberal who’d been mugged. Today’s version is that a conservative who’s comfortable with business regulation is a conservative who’s been muzzled by Silicon Valley. David Kris kicks off this topic by introducing Justice Thomas’s opinion in a case over Trump’s authority to block users he didn’t like. The case was made thoroughly moot by both the election and Twitter’s blocking of Trump, but Justice Thomas wrote separately to muse on the ways in which Twitter’s authority to block users could be regulated by treating the company as a common carrier or public accommodation. David sees a trend among conservative jurists to embrace limits on Big Social’s authority to suppress speech.

I recount my experience being muzzled by LinkedIn, which would not let me link to a new Daily Mail story about the Hunter Biden laptop and say, “The social media giants that won’t let you say the 2020 election was rigged are the people who did their best to rig it: The Hunter Biden laptop was genuine and scandalous according to the Daily Mail.” To my mind, this is Big Social protecting its own business interests by suppressing a story that could convince people that the industry has too much power over our national dialogue and our elections. (I mocked LinkedIn by posting 5 variants of my original post, all making the same point in slightly different ways. You can see the results on my LinkedIn account.)

But my view that we should not let five or six Silicon Valley owners take over our national dialogue is challenged by Jamil Jaffer, a friend and conservative who is appalled at my deviation from Republican antiregulatory orthodoxy and first amendment doctrine. It’s a great conservative catfight that mirrors the much greater catfight now under way in the Republican party.

Elsewhere in the news roundup, Jordan Schneider and David dig into the claims that China has built advanced weapons systems with the help of American chip designers and Taiwanese fabs. The accusation has led the Biden administration to slap export controls on several Chinese firms. Whether this will work without more aggressive U.S. controls on, say, foreign fabs serving those firms is open to question.

More to the point, it raises questions about long-term U.S. industrial policy. David notes that one answer, the bipartisan “Endless Frontier Act,” is gaining some momentum. (I understand the motivation but question the execution.) We also touch on the sad story of Intel’s recent missteps, and the opportunity that industrial policy has created for GlobalFoundries’ IPO.

Meanwhile, Jamil takes on adtech espionage, as U.S. Senators ask digital-ad auctioneers to name foreign clients amid national security concerns.

We all weigh in on the administration’s cyber picks, announced over the weekend. The unanimous judgment is that Chris Inglis, Jen Easterly, and Rob Silvers are good picks – and, remarkably, ended up in the right jobs.

In shorter hits, David and I ponder Twitch’s unusual decision to start punishing people online for misdeeds offline – misdeeds that Twitch will investigate itself. Neither of us is comfortable with the decision, including the effort to do privately what we pay cops and courts to do publicly, but there is more justification for the policy in some cases (think child sexual abuse) than might be apparent at first glance.

I tell the story of the Italian authorities identifying and arresting someone trying to hire a hitman using cryptocurrency and the dark web. As far as I know, successful cryptocurrency hitmen remain as rare as unicorns.

David suggests that I should be glad not to live in Singapore, where the penalty for information the establishment doesn’t like is a criminal libel judgment that I’d be forced to crowdfund like Singapore’s government critics. I note that American sites like GoFundMe and Patreon have already imposed ideological screens that mean I wouldn’t be able to crowdfund my defense against Big Social.

And, for This Week in Data Breaches, I note the new tactic of ransomware gangs trying to pressure their victims to pay by threatening the victims’ customers with doxxing, plus the remarkable phenomenon of half-billion-user data troves that the source companies say are not really the result of network breaches and so not disclosable.

And more!

Download the 357th Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Our interview is with Kim Zetter, author of the best analysis to date of the weird messaging from NSA and Cyber Command about the domestic “blind spot” or “gap” in their cybersecurity surveillance. I ask Kim whether this is a prelude to new NSA domestic surveillance authorities (definitely not, at least under this administration), why the gap can’t be filled with the broad emergency authorities for FISA and criminal intercepts (they don’t fit, quite), and how the gap is being exploited by Russian (and soon other) cyberattackers. My most creative contribution: maybe AWS, where most of the domestic machines are being spun up, would trade faster cooperation in targeting such machines for a break on the know-your-customer rules they may otherwise have to comply with. And if you haven’t subscribed to Kim’s (still free for now) substack newsletter, you’re missing out.

In the news roundup, we give a lick and a promise to today’s Supreme Court decision in the fight between Oracle and Google over API copyrights, but Mark MacCarthy takes us deep on the Supreme Court’s decision cutting the heart out of most class actions for robocalling. Echoing Congressional Dems, Mark thinks the Court’s decision is too narrow. I think it’s exactly right. We both expect Congress to revisit the law soon.

Nick Weaver and I explore the fuss over vaccination passports and how Silicon Valley can help. Considering what a debacle the Google and Apple effort on tracing turned into, with a lot of help from privacy zealots, I’m pleased that Nick and I agree that this is a tempest in a teapot. Paper vax records are likely to be just fine most of the time. That won’t prevent privacy advocates from trying to set unrealistic and unnecessary standards for any electronic vax records system, more or less guaranteeing that it will fall of its own weight.

Speaking of unrealistic privacy advocates, Charles-Albert Helleputte explains why the much-touted GDPR privacy regime is grinding to a near halt as it moves from theory to practice. Needless to say, I am not surprised.

Mark and I scratch the surface of Facebook’s Fairness Flow for policing AI bias. Like anything Facebook does, it’s attracted heavy criticism from the left, but Mark thinks it’s a useful, if limited, tool for spotting bias in machine learning algorithms. I’m half inclined to agree, but I am deeply suspicious of the confession in one “model card” that the designers of an algorithm for identifying toxic speech seem to have juiced their real-life data with what they call “synthetic data” because “real data often has disproportionate amounts of toxicity directed at specific groups.” That sure sounds as though the algorithm relying on real data wasn’t politically correct, so the researchers just made up data that fit their ideology and pretended it was real – an appalling step for scientists to take with little notice. I welcome informed contradiction.

Nick explains why there’s no serious privacy problem with the IRS subpoena to Circle, asking for the names of everyone who has more than $20,000 in cryptocurrency transactions. Short answer: everybody who doesn’t deal in cryptocurrency already has their transactions reported to the IRS without a subpoena.

Charles-Albert and I note that the EU is on the verge of finding that South Korea’s data protection standards are “adequate” by EU standards. The lesson for the US and China is simple: The Europeans aren’t looking for compliance; they’re looking for assurances of compliance. As Fleetwood Mac once sang, “Tell me lies, tell me sweet little lies.”

Mark and I note the extreme enthusiasm with which the FBI used every high-tech tool to identify even people who simply trespassed in the Capitol on January 6. The tech is impressive, but we suspect a backlash is coming. Nick weighs in to tell me I’m wrong when I argue that we didn’t see these tools used this way against ANTIFA’s 2020 rioters.

Nick thinks we haven’t paid enough attention to the Accellion breach, and I argue that companies are getting a little too comfortable with aggressive lawyering of their public messages after a breach. One result is likely to be a new executive order about breach notification (and other cybersecurity obligations) for government contractors, I predict.

And Charles and I talk about the UK’s plan to take another bite out of end-to-end encryption services, essentially requiring them to show they can still protect kids from sexual exploitation without actually reading the texts and pictures they receive.

Good luck with that!

Download the 356th Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Our interview this week is with Francis Fukuyama, a fellow and teacher at Stanford and a renowned scholar and public intellectual for at least three decades. He is the coauthor of the Report of the Working Group on Platform Scale. It’s insightful on the structural issues that have enhanced the power of platforms to suppress and shape public debate. It understands the temptation to address those issues through an antitrust lens – as well as the reasons why antitrust will fail to address the threat that platform power poses to our democracy. As a solution, it proposes to force the platforms to divest their curatorial authority over what Americans (and the world) reads, creating a host of middleware suppliers who will curate consumers’ feeds in the way that consumers prefer. We explore the many objections to this approach, from first amendment purists to those, mainly on the left, who really like the idea of suppressing their opponents on the right. But it remains the one policy proposal that could attract support from left and right and also make a real difference.

In the news roundup, Dmitri Alperovitch, Nick Weaver, and I have a spirited debate over the wisdom of Google’s decision to expose and shut down a western intelligence agency’s use of zero day exploits against terrorist targets. I argue that if a vulnerabilities equities process balancing security and intelligence is something we expect from NSA, it should also be expected of Google.

Nate Jones and Dmitri explore the slightly odd policy take on SolarWinds that seems to be coming from NSA and Cyber Command – the notion that the Russians exploited NSA’s domestic blind spot by using US infrastructure for their attack. That suggests that NSA wants to do more spying domestically, although no such proposal has surfaced. Nate, Dmitri, and I are united in thinking that the solution is a change in US law, though Dmitri thinks a know-your-customer rule for cloud providers is the best answer, while I think I persuaded Nate that empowering faster and more automatic warrant procedures for the FBI is doable, pretty much as we did with the burner phone problem in the 90s.

The courts, meanwhile, seem to be looking for ways to bring back a Potter Stewart style of jurisprudence for new technology and the fourth amendment: “I can’t define it, but I know it when it creeps me out.” The first circuit’s lengthy oral argument on how long video surveillance of public spaces can continue without violating the fourth amendment is a classic of the genre.

Dmitri and Nick weigh in on Facebook’s takedown of Chinese hackers using Facebook to target Uighurs abroad.

Dmitri thinks we can learn policy lessons from the exposure (and likely sanctioning) of the private Chinese companies that carried out the operation.

Dmitri also explains why CISA’s head is complaining about the refusal of private companies to tell DHS which US government agencies were compromised in SolarWinds. The companies claimed that their NDAs with, say, Treasury meant that they couldn’t tell DHS that Treasury had been pwned. That’s an all too familiar example of federal turf fights hurting federal cybersecurity.

In our ongoing feature, This Week in US-China Decoupling, we cover the “Disaster in Alaska,” evaluate the latest bipartisan bill to build a Western technology sphere to compete with China’s, note the completely predictable process of ousting Chinese telecom companies from the US market, and conclude that the financial sector’s effort to defy the gravity of decoupling will be a hard act to maintain.

Always late to embrace a trend, I offer Episode 1 of the Cyberlaw Podcast as a Non-Fungible Token to the first listener to cough up $150, and Nick explains why it would be cheap at a tenth the price, dashing my hopes of selling the next 354 episodes and retiring.

Nick and I have kind words for whoever is doxxing Russian criminal gangs, and I suggest offering the doxxer a financial reward (not just a hat tip in a Brian Krebs column). We have fewer kind words for the prospect that AI will soon be able to locate, track, and bankrupt problem gamblers.

I issue a rare correction to an earlier episode, noting that Israel may not have traded its citizens’ health data for first dibs on the Pfizer vaccine. It turns out that what Israel offered Pfizer was deidentified, aggregate health data, which with proper implementation may actually stay aggregate and deidentified. And I offer my own hat tip to Peter Machtiger, for a student note in an NYU law journal that cites the Cyberlaw Podcast, twice!

And more!

Download the 355th Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Our news roundup for this episode is heavy on China and tech policy. And most of the news is bad for tech companies. Jordan Schneider tells us that China is telling certain agencies not to purchase Teslas or allow them on the premises, for fear that Elon Musk’s famously intrusive record-keeping systems will give US agencies insight into Chinese facilities and personnel. Pete Jeydel says the Biden administration is prepping to make the same determination about Chinese communications and information technology, sending subpoenas to a number of Chinese tech suppliers. Meanwhile, Apple’s effort to protect its consumers from apps that collect personal data is coming under pressure from what Jordan sees as a remarkable alliance of normally warring companies, including Baidu, Tencent, and Bytedance. In addition to their commercial heft, all these companies likely have more juice in Beijing than Apple, so look for Tim Cook to climb down from his privacy high horse in China. (And Russia, where Apple has already agreed to let the Russian government specify the apps that must come preinstalled on iPhones sold in Russia.) Still, you can expect that Apple will continue to bravely refuse to cooperate with the FBI on terrorism and serious crime because that might set a precedent for cooperating with government demands in places like Russia and China (like them, I guess, but, you know, smaller).

But the episode gets its title from our discovery that President Xi’s critique of social media platforms sounds exactly like Sen. Josh Hawley’s. It is, in fact, the global bien pensant consensus, which has no dissenters to speak of now that the Chinese go to Davos. Jordan offers insights into why the Chinese government’s concerns about Big Tech might have its origins in something other than factional strife in Beijing.

David Kris and I dive into the final word from the intelligence community on foreign governments’ interference (via hacking or influence ops) in our 2020 election. The short answer is that the Russians and the Chinese didn’t hack our election machinery; in fact, they didn’t even try. So, chest-beating over our 2020 cyber defenses may be a little like doing a victory lap after the other team forfeits. David and I manage to disagree about a few things, including the Hunter Biden laptop story, which I contend is now the principal disinformation campaign of 2020, as the media and Big Tech combined to throttle the story on spurious suspicions of a Russian hand in its provenance; David disagrees.

Pete Jeydel and Ishan Sharma, our interview guest, weigh in on the latest cyber conflict paper from the United Nations. We all agree that it could be worse, and that getting the General Assembly to accept it was an achievement at a time of lowered expectations for the UN.

The Cyberspace Solarium Commission is not going away, Pete and I agree, as witness the most recent report card issued to the Biden Administration by a Solarium staffer. In principle, that’s a good thing; commissions need to stick around and fight for their recommendations. But I can’t help complaining that some of the things the Commission is fighting for – Senate confirmation of a White House cyber director, and cutting DHS out of supply chain governance – are bad ideas.

We close with a recognition of the rafts of material supplied over the years to the podcast by the data protection authorities of Europe. They’ve mostly been an example of what Texans call “all hat and no cattle” – better talkers than doers. But now their lack of serious implementation skills is catching up to them, as the companies they have penalized begin to pursue, and win, judicial appeals. That’s a trend likely to continue, and a good thing too.

Our interview is with Ishan Sharma, from the Federation of American Scientists, and author of “A More Responsible Digital Surveillance Future: Multi-stakeholder Perspectives and Cohesive State & Local, Federal, and International Actions.”

If you like the episodes where I disagree profoundly with my guests, this one’s for you. I don’t think Ishan gets more than two minutes in before the critiquing begins. Still, he holds his own, defending a vision of surveillance technology that serves democratic ends and is for that reason supported and even subsidized in a global competition with the less democratic alternatives from China. I suspect that he’ll lose friends on both the left and the right as he tries to walk this line, but he’s clearly put a lot of thought into finding an alternative to technopessimism, and he defends it ably.

And more!

Download the 354th Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

This week we interview Eliot Higgins, founder and executive director of the online investigative collective Bellingcat and author of We Are Bellingcat.

Bellingcat has produced remarkable investigative scoops on everything from Syria’s use of chemical weapons to exposing the Russian FSB operatives who poisoned Sergei Skripal with Novichok, and, most impressively, calling a member of the FSB team that tried to kill Navalny and getting him to confess. Eliot talks about the techniques that make Bellingcat so effective and the hazards, physical and moral, that surround crowdsourced investigations.

In the news, Dave Aitel gives us the latest on the Exchange server compromise, and the reckless Chinese hack-everyone spree that was apparently triggered by Microsoft’s patch of the vulnerability.

Jamil Jaffer introduces us to the vulnerability of the week – dependency confusion, and the startling speed with which it is being exploited.

I ask Nate Jones and the rest of the panel what all this means for government policy. No one thinks that the Biden administration’s published cyberstrategy tells us anything useful. More interesting are two deep dives on cyberstrategy from people with a long history in the field. We see Jim Lewis’s talk on the topic as an evolution in the direction of much harsher responses to Russian and Chinese intrusions. Dmitri Alperovitch’s approach also has a hard edge, although he points out that the utter irresponsibility of the Chinese pwn-em-all tactic deserves an especially harsh response. I wonder why Cyber Command didn’t respond by releasing a worm that would install poorly secured shells on every Exchange server in China.

In other news, I blame poor (or rushed) DOD lawyering for the district court ruling that DOD couldn’t list Xiaomi as an entity aligned with the Chinese military. Jamil is more charitable both to DOD and the judge who made the ruling, but he expects (or maybe just hopes) that the court of appeals will show DOD more deference.

Twitter, on the other hand, is praying that the Northern District of California suffers from full-blown Red State Derangement, as it asks the court there to enjoin a Texas Attorney General investigation into possible anticompetitive coordination in the Great Deplatforming of January 2021.

Nate gives us the basics. I observe that, to bring such a Hail Mary of a case, Twitter must deeply fear what its own employees were saying about the deplatforming at the time. Neither Nate nor I give Twitter a high probability of success. And even if it does succeed, red states are lining up new laws and regulatory initiatives for Silicon Valley, most notably Gov. DeSantis’s controversial effort to navigate section 230 and the first amendment.

Nate also provides a remarkably clear explanation of the sordid tale of European intelligence and law enforcement agencies trying to cut a special deal for themselves in the face of surveillance-hostile rulings from the EU’s Court of Justice. The agencies are right to want to avoid those foolish decisions, but leaving the US on the hook will only inflame trans-Atlantic relations.

In quick hits, Jamil and Dave talk us through Israel’s Unit 8200, which the press says offers a better cybersecurity VC alumni network than Stanford. Playing to type, I close with This Week in Sex Toy Security and immediately display my naivete. Wearables, who knew? But the security lapses in what Dave calls the internet of junk at least offer a new image to go with the concept of a man-in-the-middle attack.

And more!

Download the 353rd Episode (mp3).

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.