Ransomware attacks have been soaring in frequency and severity, affecting companies, government agencies, and nonprofits and leading to ever-larger ransom demands as a condition for unlocking victims’ information systems. On June 30, 2021, the New York State Department of Financial Services (NYDFS) issued guidance on how potential victims can minimize the risk of a successful ransomware attack. While the controls are officially characterized as guidance, NYDFS makes clear that it “expects regulated companies to implement” the preventative controls “whenever possible.” Companies not regulated by NYDFS should also consider implementing the guidance: they are just as susceptible to ransomware attacks, and other regulators and courts may treat the NYDFS guidance as contributing to a general standard of reasonable security in the face of this growing cyber threat.

NYDFS reported that from January 2020 through May 2021, NYDFS-regulated companies reported 74 ransomware attacks ranging “from crippling days-long shutdowns to minor disruption from temporary loss of a few computers.” In addition, NYDFS reported a “growing number of third-party Cybersecurity Events – where ransomware attacks against a critical vendor disrupt[ed] the operations of a regulated company.”

NYDFS’s guidance highlights nine controls to prevent or respond to ransomware attacks:

  • “Email Filtering and Anti-Phishing Training”
    • Companies should provide their workforce with “recurrent phishing training, including how to spot, avoid, and report phishing attempts.” They “should also conduct periodic phishing exercises and test whether employees will click on attachments and embedded links in fake emails,” and provide “remedial training for employees as necessary.” Lastly, companies should ensure that emails are “filtered to block spam and malicious attachments/links from reaching users.”
  • “Vulnerability and Patch Management”
    • Companies should implement “a documented program to identify, assess, track, and remediate vulnerabilities on all enterprise assets within their infrastructure.” This “program should include periodic penetration testing.” In addition, companies should ensure that “[v]ulnerability management include[s] requirements for timely application of security patches and updates” and “[w]herever possible … automatic updates” should be enabled.
  • “Multi-Factor Authentication (‘MFA’)”
    • The guidance reminds regulated companies that “MFA for remote access to the network and all externally exposed enterprise and third-party applications is required by” the NYDFS Cybersecurity Regulation. The ransomware guidance recommends that companies expand the use of MFA to “[a]ll logins to privileged accounts, whether remote or internal.”
  • “Disable RDP Access”
    • Remote Desktop Protocol (RDP) access should be disabled whenever possible. However, if “RDP access is deemed necessary, then access should be restricted to only approved (whitelisted) originating sources and [companies should] require MFA as well as strong passwords.”
  • “Password Management”
    • “Regulated companies should ensure that strong, unique passwords are used.” In particular, “passwords of at least 16 characters” should be used and “commonly used passwords” should be banned. Larger organizations should “consider a password vaulting PAM (privileged access management) solution” that would require “employees to request and check out passwords.” Finally, companies should disable “password caching” wherever possible.
  • “Privileged Access Management”
    • Companies should implement “the principle of least privileged access – each user or service account should be given the minimum level of access necessary to perform the job.” In addition, they “should universally require MFA and strong passwords” for privileged accounts and “maintain and periodically audit an inventory of all privileged accounts. Privileged accounts should be used only for tasks requiring elevated privileges, and administrators should have a second non-privileged account for all other tasks such as logging into their workstation, email, drafting documents, etc.”
  • “Monitoring and Response”
    • “Regulated companies must have a way to monitor their systems for intruders and respond to alerts of suspicious activity.” As part of such efforts, companies should “implement an Endpoint Detection and Response (‘EDR’) solution, which monitors for anomalous activity. … Companies with larger and more complex networks should also have lateral movement detection and a Security Information and Event Management (SIEM) solution that centralizes logging and security event alerting.”
  • “Tested and Segregated Backups”
    • “Regulated companies should maintain comprehensive, segregated backups that will allow recovery in the event of a ransomware attack.” In addition, “at least one set of backups should be segregated from the network and offline” to ensure backups are not compromised by the attack. Finally, companies should “periodically test backups by actually restoring critical systems from backups” so that backups “actually work when needed.”
  • “Incident Response Plan”
    • Companies should implement an “incident response plan that explicitly addresses ransomware attacks,” and regularly test the plan, with involvement of senior leadership.
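Several of these controls reduce to concrete, checkable configuration. As a purely illustrative sketch (not part of the NYDFS guidance itself), the password rules above, a 16-character minimum and a ban on commonly used passwords, can be expressed as a simple check. The banned-password list below is a hypothetical stand-in; real deployments screen candidates against large lists of known-compromised passwords.

```python
# Illustrative sketch only: a minimal password-policy check reflecting the
# NYDFS recommendations (16+ character minimum, ban on commonly used
# passwords). Not an official NYDFS tool.

MIN_LENGTH = 16  # NYDFS guidance: "passwords of at least 16 characters"

# Hypothetical sample of banned entries; production systems typically use
# a corpus of millions of known-compromised passwords.
COMMONLY_USED = {"password", "123456", "qwerty", "letmein"}

def password_meets_policy(password: str) -> bool:
    """Return True if the password satisfies the sketched policy."""
    if len(password) < MIN_LENGTH:
        return False
    if password.lower() in COMMONLY_USED:
        return False
    return True

print(password_meets_policy("correct horse battery staple"))  # True
print(password_meets_policy("password"))                      # False
```

A production deployment would pair a check like this with the vaulting and privileged-access controls described above, rather than rely on complexity rules alone.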

NYDFS’s guidance also recommends against the payment of ransoms to attackers, a position in line with that of the FBI. Companies should also closely consider the advisories of the Treasury Department’s Office of Foreign Assets Control (OFAC) and Financial Crimes Enforcement Network (FinCEN) on the sanctions and anti-money laundering (AML) risks of making or facilitating ransomware payments, as we have previously addressed.

Finally, NYDFS’s guidance advises that “any successful deployment of ransomware on a [regulated company’s] internal network should be reported to DFS ‘as promptly as possible and within 72 hours at the latest.'” It also recommends that “any intrusion where hackers gain access to privileged accounts” should be reported.

On July 7, 2021, Gov. Jared Polis signed into law the Colorado Privacy Act (CPA), which will go into effect on July 1, 2023. Like California’s and Virginia’s data privacy laws, the CPA aims to provide consumers with greater control over their data and enhanced transparency with respect to how their data is used. However, businesses should pay close attention to the unique nuances of the CPA, which are likely to complicate compliance strategies.


The CPA applies to data “controllers” that “conduct[] business in Colorado or produce[] or deliver[] commercial products or services that are intentionally targeted to residents of Colorado” and that:

  • “Control[] or process[] the personal data of [100,000 Colorado residents] or more during a calendar year,” or
  • “Derive[] revenue or receive[] a discount on the price of goods or services from the sale of personal data and process[] or control[] the personal data of [25,000 Colorado residents] or more.”

“Controller” is defined as “a person that, alone or jointly with others, determines the purposes for and means of processing personal data.”

The CPA covers “consumers” who are defined as “Colorado resident[s] acting only in an individual or household context.” Importantly, it does not cover individuals “acting in a commercial or employment context, as a job applicant or as a beneficiary of someone acting in an employment context.”

The CPA applies to “personal data” which is defined as “information that is linked or reasonably linkable to an identified or identifiable individual” and “[d]oes not include deidentified data or publicly available information.”

The CPA does not apply to:

  • Certain healthcare-related information, including protected health information covered by HIPAA;
  • “Activit[ies] involving the collection, maintenance, disclosure, sale, communication, or use of any personal data bearing on a consumer’s creditworthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living by” (1) consumer reporting agencies, (2) furnishers of information for use in a consumer report, and (3) users of a consumer report;
  • Personal data in connection with certain state and federal laws, including:
    • The Colorado Health Benefit Exchange Act;
    • The federal Gramm-Leach-Bliley Act;
    • The federal Driver’s Privacy Protection Act of 1994;
    • The federal Children’s Online Privacy Protection Act of 1998; or
    • The federal Family Educational Rights and Privacy Act of 1974.
  • “Data maintained for employment records purposes;”
  • Air carriers;
  • A national securities association registered pursuant to the Securities Exchange Act of 1934;
  • Financial institutions covered under the federal Gramm-Leach-Bliley Act and implementing regulations;
  • “Customer data maintained by” a public utility or state body “if the data are not collected, maintained, disclosed, sold, communicated, or used except as authorized by state and federal law;” and
  • “Data maintained by” state and municipal government entities (including state institutions of higher education) “if the data is collected, maintained, disclosed, communicated, and used as authorized by state and federal law for noncommercial purposes.”

Rights and Requirements

The CPA provides consumers with the following rights:

  • “[T]he right to opt out of the processing of personal data concerning the consumer for the purposes of” (1) “targeted advertising,” (2) “sale of personal data,” and (3) “profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer.”
  • “[T]he right to confirm whether a controller is processing personal data concerning the consumer and to access the consumer’s personal data.”
  • “[T]he right to obtain [their] personal data in a portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data to another entity without hindrance.”
  • “[T]he right to correct inaccuracies in the consumer’s personal data, taking into account the nature of the personal data and the purposes of processing the consumer’s personal data.”
  • “[T]he right to delete personal data concerning the consumer.”

It bears emphasizing that the CPA’s right to deletion, like Virginia’s, is broader than California’s in that it applies to any personal data concerning the consumer, not just personal data collected from the consumer herself.

The CPA prohibits a controller from “increase[ing] the cost of, or decreas[ing] the availability of, [a] product or service” due solely to a consumer’s “exercise of a right” where such action by the controller is “unrelated to feasibility or the value of a service.”

The CPA requires a controller to “inform a consumer of any action taken on a request…without undue delay and, in any event, within [45] days after receipt of the request.” The CPA permits a controller to extend the 45-day period by an additional 45 days “where reasonably necessary, taking into account the complexity and number of the requests.” A controller exercising the additional 45-day period must inform the consumer of the extension within the initial 45 days following receipt of the request in addition to the reasons for the extension.

A controller is not required to comply with a consumer’s request “if the controller is unable to authenticate the request using commercially reasonable efforts, in which case the controller may request the provision of additional information reasonably necessary to authenticate the request.”

If a controller does not take action in response to a consumer request, the controller must “inform the consumer, without undue delay and, at the latest within [45] days after receipt of the request, of the reasons for not taking action and instructions for how to appeal the decision with the controller.”

The CPA requires controllers to “establish an internal process whereby consumers may appeal a refusal to take action on a request.” Controllers must respond to an appeal within 45 days of receipt with the ability to extend for an additional 60 days “taking into account the complexity and number of requests serving as the basis for the appeal.” A controller must inform a consumer of the extension within 45 days of receiving the appeal “together with the reasons for the delay.” A controller is also required to inform consumers of their right to contact the Colorado Attorney General regarding the result of their appeal.
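For illustration only (and not legal advice), the layered deadlines above reduce to simple date arithmetic: 45 days for a request, extendable by 45, and 45 days for an appeal, extendable by 60, all measured from receipt. The dates below are hypothetical.

```python
# Illustrative sketch of the CPA's response and appeal timelines.
# The statute measures deadlines in days from receipt of the request/appeal.
from datetime import date, timedelta

REQUEST_INITIAL = 45    # days to act on a consumer request
REQUEST_EXTENSION = 45  # optional extension "where reasonably necessary"
APPEAL_INITIAL = 45     # days to respond to an appeal
APPEAL_EXTENSION = 60   # optional extension for complex appeals

def request_deadlines(received: date) -> tuple[date, date]:
    """Initial and fully extended deadlines for a consumer request."""
    initial = received + timedelta(days=REQUEST_INITIAL)
    extended = initial + timedelta(days=REQUEST_EXTENSION)
    return initial, extended

def appeal_deadlines(received: date) -> tuple[date, date]:
    """Initial and fully extended deadlines for an appeal."""
    initial = received + timedelta(days=APPEAL_INITIAL)
    extended = initial + timedelta(days=APPEAL_EXTENSION)
    return initial, extended

# Hypothetical request received on the CPA's effective date.
initial, extended = request_deadlines(date(2023, 7, 1))
print(initial, extended)  # 2023-08-15 2023-09-29
```

Note that any extension must itself be communicated, with reasons, within the initial 45-day window, so the first deadline governs the notice of extension even when the second governs the substantive response.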

The CPA requires controllers to adhere to the following requirements:

  • “[P]rovide consumers with a reasonably accessible, clear, and meaningful privacy notice” including:
    • “The categories of personal data collected or processed by the controller or a processor1;”
    • “The purposes for which the categories of personal data are processed;”
    • “How and where consumers may exercise their rights [under the CPA], including the controller’s contact information and how a consumer may appeal a controller’s action with regard to the consumer’s request;”
    • “The categories of personal data that the controller shares with third parties, if any;”
    • “The categories of third parties, if any, with whom the controller shares personal data.”
  • “[C]learly and conspicuously disclose the sale [of personal data to third parties] or processing [of personal data for targeted advertising], as well as the manner in which a consumer may exercise the right to opt out of the sale or processing.”
  • “[S]pecify the express purposes for which personal data are collected and processed.”
  • Ensure “collection of personal data [is] adequate, relevant, and limited to what is reasonably necessary in relation to the specified purposes for which the data are processed.”
  • “[N]ot process personal data for purposes that are not reasonably necessary to or compatible with the specified purposes for which the personal data are processed, unless the controller first obtains the consumer’s consent.”
  • “[T]ake reasonable measures to secure personal data during both storage and use from unauthorized acquisition.” Such measures “must be appropriate to the volume, scope, and nature of the personal data processed and the nature of the business.”
  • “[N]ot process personal data in violation of state or federal laws that prohibit unlawful discrimination against consumers.”
    • “[N]ot process a consumer’s sensitive data without first obtaining the consumer’s consent or, in the case of the processing of personal data concerning a known child, without first obtaining consent from the child’s parent or guardian.” “Sensitive data” is defined as (1) “[p]ersonal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status,” (2) “[g]enetic or biometric data that may be processed for the purpose of uniquely identifying an individual,” or (3) “[p]ersonal data from a known child.” “‘Child’ means an individual under 13 years of age.”

In addition, the CPA prohibits controllers from “conduct[ing] processing that presents a heightened risk of harm to a consumer without conducting and documenting a data protection assessment of” such data processing activities. The data protection assessment requirement only applies to “personal data acquired on or after” the CPA’s effective date and to processing activities created or generated after July 1, 2023. The requirement is not retroactive. “Processing activities that present a heightened risk of harm to a consumer” include:

  • “Processing personal data for purposes of targeted advertising or for profiling if that profiling presents a reasonably foreseeable risk of: (I) [u]nfair or deceptive treatment of, or unlawful disparate impact on, consumers; (II) [f]inancial or physical injury to consumers; (III) [a] physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers if the intrusion would be offensive to a reasonable person; or (IV) [o]ther substantial injury to consumers;”

  • “Selling personal data”; and
  • “Processing sensitive data.”

As part of the data protection assessment, controllers must “identify and weigh the benefits that may flow, directly and indirectly, from the processing to the controller, the consumer, and other stakeholders, and the public against potential risks to the rights of the consumer associated with the processing, as mitigated by the safeguards that the controller can employ to reduce the risks.” Controllers are directed to “factor into th[e] assessment the use of de-identified data and the reasonable expectations of consumers, as well as the context of processing and the relationship between the controller and the consumer whose personal data will be processed.”

Data processing assessments must be made available to the Colorado Attorney General upon request.

Enforcement and Penalties

The Colorado Attorney General and Colorado District Attorneys are granted exclusive authority to enforce the CPA. Violations of the CPA will constitute a deceptive trade practice carrying penalties of up to $2,000 per violation, but not more than $500,000 per series of violations. In addition to seeking monetary penalties, the Colorado Attorney General and Colorado District Attorneys are able to seek injunctive relief to enjoin violations of the CPA. Importantly, the CPA explicitly excludes a private right of action.

Until January 1, 2025, controllers will receive an opportunity to cure violations within 60 days before facing an enforcement action.


The Colorado Attorney General is also authorized to promulgate rules under the CPA. By July 1, 2023, the Attorney General must promulgate rules for universal opt-out mechanisms related to the processing of personal data for targeted advertising or the sale of personal data. By January 1, 2025, the Attorney General may adopt additional rules “govern[ing] the process of issuing opinion letters and interpretive guidance to develop an operational framework for business that includes a good faith reliance defense of an action that may otherwise constitute a violation of [the CPA].” Such rules must become effective by July 1, 2025.


While July 1, 2023 may seem far away, businesses should begin familiarizing themselves with the CPA and address necessary compliance measures at the same time they institute compliance measures for the Virginia Consumer Data Protection Act and the California Privacy Rights Act, which take effect January 1, 2023.


1 Processor is defined as “a person that processes personal data on behalf of a controller.”

We begin the episode with a review of the massive Kaseya ransomware attack.

Dave Aitel digs into the technical aspects while Paul Rosenzweig and Matthew Heiman explore the policy and political implications. But either way, the news is bad.

Then we come to the Florida ‘deplatforming’ law, which a Clinton appointee dispatched in a cursory opinion last week. I’ve been in a small minority who thinks the law, far from being a joke, is likely to survive (at least in part) if it reaches the Supreme Court. Paul challenges me to put my money where my mouth is. Details to be worked out, but if a portion of the law survives in the top court, Paul will be sending a thousand bucks to a Trumpista nonprofit. If not, I’ll likely be sending my money to the ACLU.

Surprisingly, our commentators mostly agree that both NSA and Tucker Carlson could be telling the truth, despite the insistence of their partisans that the other side must be lying. NSA gets unaccustomed praise for its … wait for it … rapid and PR-savvy response. That’s got to be a first.

Paul and I conclude that Maine, having passed in haste the strongest state facial recognition ban yet, will likely find itself repenting at leisure.

Matthew decodes Margrethe Vestager’s warning to Apple against using privacy and security to limit competition.

And I mock Apple for claiming to protect privacy while making employees wear body cams to preserve the element of surprise at the next Apple product unveiling. Not to mention the 2-billion-person asterisk attached to Apple’s commitment to privacy.

Dave praises NSA for its stewardship of a popular open source reverse engineering tool.

And everyone has a view about cops using YouTube’s crappy AI takedown engine to keep people from posting videos of their conversations with cops.

And more!


Download the 369th Episode (mp3)


You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

This episode offers an economical overview of the six antitrust reform bills reported out of the House Judiciary Committee last week. Michael Weiner and Mark MacCarthy give us the top line for all six (though only four would make substantial new policy). We then turn quickly to the odd-couple alliances supporting and opposing the bills, including my brief cameo appearance, in Rep. Jim Jordan’s opposition, on the gratifying ground (ok, among others) that Microsoft had never explained its suppression of my recent LinkedIn post. On the whole, I think Rep. Jordan is right; there’s very little in these bills that will encourage a diversity of viewpoints on social media or among its “trust and safety” bureaucrats.

Nick Weaver trashes the FBI for its prosecution of Anming Hu. I’m more sympathetic, but neither of us thinks this will end well for the Bureau or the China Initiative.

Adam Candeub makes his second appearance and does a fine job unpacking three recent decisions on the scope of Section 230. The short version: Facebook only partly beat the rap for sex trafficking in the Texas Supreme Court; SnapChat got its head handed to it in the speed filter case; and all the Socials won but faced persuasive dissents in a case over assistance to terrorist groups.

The long version: Silicon Valley has sold the courts a bill of goods on Section 230 for reasons that sounded good when the Internet was shiny and democratic and new. Now that disillusion has set in, the sweeping subsidy conferred by the courts is looking a lot less plausible. The wheels aren’t coming off Section 230 yet, but the paint is peeling, and Big Tech’s failure to get their reading of the law blessed by the Supreme Court ten years ago is going to cost them – mainly because their reading is inconsistent with some basic rules of statutory interpretation.

Nick and I engage on the torture indictments of executives who sold internet wiretapping capabilities to the Qaddafi regime.

Mark is unable to hose down my rant over Canada’s bone-stupid effort to impose Canadian content quotas on the internet and to write an online hate speech law of monumental vagueness.

And in closing, Nick and I bid an appropriately raucous and conflicted adieu to the Hunter Thompson of Cybersecurity: John McAfee.

And more!


Download the 368th Episode (mp3)




The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

We couldn’t avoid President Biden’s trip to Europe this week. He made news (but only a little progress) on cybersecurity at every stop. Nick Weaver and I dig into the President’s consultations with Vladimir Putin, which featured veiled threats and a modest agreement on some sort of continuing consultations on protecting critical infrastructure.

Jordan Schneider sums up the G7 and NATO statements aligning with U.S. criticisms of China.

And our newest contributor, Michael Ellis, critiques the EU-U.S. consultations on technology, which featured a complete lack of U.S. resolve on getting an outcome on transatlantic data flows that would preserve US intelligence capabilities.

Michael also recaps the latest fallout from the Colonial Pipeline ransomware shutdown – new regulatory initiatives from TSA and a lot of bipartisan regulatory proposals in Congress. I note the very unusual (or, maybe, all too usual) meaning given to “bipartisanship” on Capitol Hill.

Nick isn’t exactly mourning the multiple hits now being suffered by ransomware insurers, from unexpected losses to the ultimate in concentrated loss – gangs that hack the insurer first and then systematically extort all its ransomware insurance customers.

Jordan sums up China’s new data security law. He suggests that, despite the popular reporting on the law, which emphasizes the government control narrative, the motive for the law may be closer to the motive for data protection laws in the West – consumer suspicion over how private data is being used. I’m less convinced, but we have a nice discussion of how bureaucratic imperatives and competition work in the People’s Republic of China.

Michael and Nick dig into the White Paper on FISA applications published by the outgoing chairman of the Privacy and Civil Liberties Oversight Board. Notably, in my mind, the White Paper does not cast doubt on the Justice Department’s rebuttal to a Justice Inspector General’s report suggesting that the FISA process is riddled with error. The paper also calls urgently for renewal of the expired FISA section 215 authority and suggests several constructive changes to the FISA paperwork flow.

In quick hits, Michael brings us up to date on the FCC’s contribution to technology decoupling from China: a unanimous vote to exclude Chinese companies from the U.S. telecom infrastructure and a Fifth Circuit decision upholding the FCC’s exclusion of Chinese companies from subsidized purchases by U.S. telecom carriers. And Jordan reminds us just how much progress China has made in exploring space.

And more!


Download the 367th Episode (mp3)





Just as retail stores, bars, restaurants, and entertainment venues in New York City have been authorized to relax COVID restrictions, they will soon have to confront a new set of requirements—this time focused on their collection of customers’ biometric information. On July 9, 2021, New York City’s new law addressing the collection and use of biometric identifier information will go into effect. The NYC Biometric Law is part of a broader trend of state and local governments adopting laws to regulate businesses’ collection and use of biometric information.

The NYC Biometric Law requires “[a]ny commercial establishment that collects, retains, converts, stores or shares biometric identifier information of customers” to provide notice of such practices by “placing a clear and conspicuous sign near all of the commercial establishment’s customer entrances notifying customers in plain, simple language…that customers’ biometric identifier information is being collected, retained, converted, stored or shared, as applicable.” The sign must adhere to “a form and manner prescribed by the commissioner of consumer and worker protection by rule.”

In addition, the NYC Biometric Law prohibits “sell[ing], leas[ing], trad[ing], [or] shar[ing] in exchange for anything of value or otherwise profit[ing] from the transaction of biometric identifier information.”

“Commercial establishment” is defined as “a place of entertainment, a retail store, or a food and drink establishment.” The law does not apply to financial institutions.

“Biometric identifier information” is defined as “a physiological or biological characteristic that is used by or on behalf of a commercial establishment, singly or in combination, to identify, or assist in identifying, an individual, including, but not limited to: (i) a retina or iris scan, (ii) a fingerprint or voiceprint, (iii) a scan of hand or face geometry, or any other identifying characteristic.” Significantly, the law’s notice requirement does not apply “to [b]iometric identifier information collected through photographs or video recordings, if (i) the images or videos collected are not analyzed by software or applications that identify, or that assist with the identification of, individuals based on physiological or biological characteristics, and (ii) the images or video are not shared with, sold or leased to third-parties other than law enforcement agencies.”

The NYC Biometric Law contains a private right of action. Private plaintiffs may seek both monetary damages and injunctive relief. For violations of the notice requirement only, commercial establishments are provided with a 30-day cure period. For each violation of the notice requirement, businesses may be liable for damages of $500. Each negligent violation of the prohibition on selling, leasing, trading, or sharing biometric identifier information for anything of value or profit may result in damages of $500, while each intentional or reckless violation may lead to damages of $5,000. Plaintiffs may also recover reasonable attorney’s fees and costs.

Paul Rosenzweig lays out a report that is much more careful and well-written, and a policy catastrophe in the making. The main problem? It tries to turn one of the most divisive issues in American life into a problem to be solved by technology. Apparently because that has worked so well in areas like content suppression. In fact, I argue, the report will be seen by many, especially in the center and on the right, as an effort to impose proportional representation quotas by stealth in a host of places that have never been the objects of such policies before. Less controversial, but only a little, is the U.S. government’s attempt to make government data available for training more AI algorithms. Jane more or less persuades me that this effort too will end in tears or stasis.

In cheerier news, the good guys got a couple of surprising wins this week. While encryption and bitcoin have posed a lot of problems for law enforcement in recent years, the FBI has responded with imagination and elan, at least if we can judge by two stories from last week. First, Nick Weaver takes us through the laugh-out-loud facts behind a government-run encrypted phone for criminals, complete with influencers, invitation-only membership, and nosebleed pricing to cement the phone’s exclusive status. Jane Bambauer unpacks some of the surprisingly complicated legal questions raised by the FBI’s creativity.

Paul Rosenzweig lays out the much more obscure facts underlying the FBI’s recovery of much of the ransom paid by Colonial Pipeline. There’s no doubt that the government surprised everyone by coming up with the private key controlling the bitcoin account. We’d like to celebrate the ingenuity behind the accomplishment, but the FBI isn’t actually explaining how it pulled it off, probably because it hopes to do the same thing again and can’t if it blows the secret.

The Biden administration is again taking a shaky and impromptu Trump policy and giving it a sober interagency foundation. This time it’s the TikTok and WeChat bans, which have been rescinded. But a new process has been put in place that could restore and even expand those bans in a matter of months. Paul and I disagree about whether the Biden administration will end up applying the Trump policy to TikTok or WeChat or to a much larger group of Chinese apps.

For comic relief, Nick regales us with Brian Krebs’s wacky story of the FSB’s weird and counterproductive attempt to secure communications to the FSB’s web site.

Jane and I review the latest paper by Bruce Schneier (and Henry Farrell) on how to address the impact of technology on American democracy. We are not persuaded by its suggestion that our partisan divide can best be healed by more understanding, civility, and aggressive prosecutions of Republicans.

Finally, everyone confesses to some confusion about the claim that the Trump Justice Department breached norms in its criminal discovery motions that turned up records relating to prominent Democratic congressmen and at least one Trump administration official.

Best bet: this flap will turn out to be less interesting the more we learn. But I renew my appeal, this time aimed at outraged Democrats, for more statutory guardrails and safeguards against partisan misuse of national security authorities. Because that’s what we’ll need if we want to keep those authorities on the books.

And more!


Download the 366th Episode (mp3)


You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!


The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

The Biden administration is pissing away one of the United States’ most important counterterrorism intelligence programs. At least that’s my conclusion from this episode’s depressing review of the administration’s halting and delusion-filled approach to the transatlantic data crisis. The EU thinks time is on its side, and it’s ignoring Jamil Jaffer’s heartfelt plea to be a better ally in the face of Russian and Chinese pressure. Every day, Silicon Valley companies whose data stores in the US have been a goldmine for counterterrorism are feeling legal pressure to move that data to Europe. Those companies care little whether the US gets good intelligence from its section 702 requests, at least compared to the prospects of massive fines and liability in Europe. So, unless the administration creates a countervailing incentive, the other actors will simply present Washington with a fait accompli. The Biden administration, like the Trump administration before it, seems unable to grasp the need for action. When Trump was in charge, we could call him incompetent. When we wake up to what we’ve lost under Biden, that’s what we’ll call him, too.

For companies struggling with their role in this global drama, Charles Helleputte has moderately good news. The European Commission, contrary to the dogmatic approach of the data protection agencies, has opened a door for transfers using the new standard contractual clauses. If your data has not been requested by the U.S. under section 702 or similar intelligence programs and you can offer good reason to think they won’t be requested in the future, you could avoid the hammer of a data export ban.

In other news, Jamil and I cross swords on whether the Colonial Pipeline hack should have ended TSA’s light-touch oversight of pipeline cybersecurity.

And Nate Jones and I dig deep into the state trend toward regulating police access to DNA ancestry databases. After some fireworks, we come close to agreement that some state law provision on database access is inevitable and workable, but that the Maryland law is so hostile to solving brutal crimes with DNA searches that it is hard to distinguish from a ban.

Jamil explains the Biden administration’s decision to provide a new foundation for the Trump ban on investment in Chinese military companies. Treasury will take the program away from DOD, which had handled its responsibilities with the delicacy of Edward Scissorhands.

Nate limbers up the DeHype Machine to put in perspective DOJ’s claim to be giving ransomware hacks the same priority as terrorism. Jamil takes on autonomous drones and pours cold water on the notion that DOD will be procuring some of its drones from China.

In a moment of weakness I fail to attack or even mock the UN GGE’s latest report on norms for cyberconflict.

And in a series of quick hits:

  • Jamil reviews Facebook’s latest antitrust problems in the EU and UK.
  • I bring back the “throuple” Congresswoman, whose failed pivot from abuser of power to victim of revenge porn has just cost her over $100,000.
  • In case you haven’t heard, Facebook might let Trump come back in January 2023, and his blog page has shut down for good.
  • The European Commission has proposed a trusted and secure Digital Identity for all Europeans but Charles thinks there’s less there than meets the eye.
  • And Nigeria has suspended Twitter after the platform shut down the President’s account for obliquely threatening military action against secessionists.
  • And more!


Download the 365th Episode (mp3)


President Bill Clinton earned lasting notoriety for his explanation of why his statement denying a relationship with Monica Lewinsky was truthful (“it depends on what the meaning of the word ‘is’ is”). It is doubtful Justice Amy Coney Barrett’s majority opinion for the Supreme Court last week in Van Buren v. U.S. will earn as much ridicule from late-night comedians, despite putting so much questionable weight on a two-letter word (in this case, the word “so”). But the opinion does finally resolve an issue that has split lower courts and vexed employers, website operators, security researchers, and others for many years: whether the Computer Fraud and Abuse Act (CFAA) can be used to prosecute, or sue civilly, someone who accesses a computer with authorization, but uses that access for an improper purpose. The Court answered that question with a resounding, “No.” But the Court left unresolved a number of other questions, including what sorts of limits on access have to be transgressed in order to give rise to a CFAA violation.

The CFAA prohibits, among other things, intentionally accessing a computer “without authorization” or “exceed[ing] authorized access” and obtaining information. In Van Buren, a police officer had used his patrol car computer to access a law enforcement database to look up a license plate number in exchange for money from a private person who wanted information about a woman he had met at a strip club. The arrangement turned out to be an FBI sting, and after the officer used his valid credentials to look up the license plate number in the database, he was arrested and charged with violating the CFAA. The government alleged that the officer had exceeded his authorized access to the database by accessing it for an improper purpose—i.e., for personal use, in violation of police department policy. The officer was convicted and sentenced to 18 months in prison.

On appeal to the Eleventh Circuit, the officer argued that “exceeds authorized access” in the CFAA reaches only people who are authorized to access a computer, but then access information to which their authorized access does not extend. Several circuits have interpreted this clause in just this way. However, the Eleventh Circuit, like some others, adopted a broader view, holding that the clause also applies to someone who has authorization to access a computer but then uses that access for an inappropriate reason.

This broad interpretation has drawn a great deal of criticism, including by those who argue that it results in the criminalization of a great deal of everyday behavior. Anyone who violates a website’s terms of use (such as by using a pseudonym, or supplying a fake date of birth), or violates her company’s computer use policy by sending personal emails or composing personal documents on a workplace computer, would be violating the CFAA.

The Supreme Court cited such arguments as one reason the broad interpretation of “exceeds authorized access” is “implausib[le].” But the Court’s principal reason for adopting a narrow reading of the phrase turned on the word “so.” The CFAA defines “exceeds authorized access” as “access[ing] a computer with authorization and to use such access to obtain or alter information in the computer that the accesser is not entitled so to obtain or alter.” The Court devoted several pages of linguistic analysis to explaining why the word “so” must be read as restricting the entire definition to persons who are authorized to access a computer, but are not entitled to use that access to obtain or alter certain information, and why the clause cannot be read as applying to people who are authorized to obtain or alter that information but then do so for a prohibited purpose. One might charitably say that this is all a very lawyerly reading of the phrase (as was said about Mr. Clinton’s exegesis of the meaning of “is”). But whatever the case, it is now the law.

Fortunately, the Court ended its opinion with a clearer enunciation of its interpretation of “exceeds authorized access”: “In sum, an individual ‘exceeds authorized access’ when he accesses a computer with authorization but then obtains information located in particular areas of the computer—such as files, folders, or databases—that are off limits to him.” This makes clear that one cannot violate the CFAA—and therefore be subjected to criminal prosecution or a civil suit—merely by using his authorized access to obtain information for an improper purpose. This may make it more difficult for employers to use the CFAA to go after rogue employees who steal company information for a competing firm, or for website operators to sue competitors who abuse their authorized access to a site’s content by scraping it or otherwise mining it for commercial advantage.

Nevertheless, the Court’s opinion leaves some significant questions unresolved, and therefore still leaves room for effectively using the CFAA in such situations. Notably, the Court explicitly leaves open the question of how a computer owner may limit access to particular information in order to be able to sue for violations of those limits. Some will likely misread the opinion as requiring technological barriers to access. But it may be enough to impose carefully worded limits via contractual or policy terms, as long as they are focused on prohibiting access to the information, not on prohibiting certain uses. It may also be enough to impose limits on access by certain means, while allowing access by other means. Thus, for example, a competitor might have authorization to access a website’s content as a regular user, but if the website’s terms prohibit scraping the same content via automated bots, then such scraping may still give rise to a CFAA violation.
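
The scraping example above, in which a competitor has authorization as a regular user but the site’s terms prohibit automated access, is often expressed technically in a site’s robots.txt as well. A minimal sketch using Python’s standard library (the rules, bot name, and URLs below are hypothetical, and honoring robots.txt is a technical convention distinct from contractual terms of use):

```python
# Sketch: checking a site's machine-readable access limits before scraping.
# The rules, user-agent names, and URLs are hypothetical examples.
from urllib import robotparser

def may_fetch(robots_txt: str, user_agent: str, page_url: str) -> bool:
    """Return True if the robots.txt rules permit user_agent to fetch page_url."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(user_agent, page_url)

# Hypothetical rules: block a scraping bot from /catalog/, allow everyone else.
rules = """\
User-agent: scraper-bot
Disallow: /catalog/

User-agent: *
Disallow:
"""
print(may_fetch(rules, "scraper-bot", "https://example.com/catalog/item1"))  # False
print(may_fetch(rules, "browser", "https://example.com/catalog/item1"))      # True
```

Whether violating such a means-based limit supports a CFAA claim after Van Buren is exactly the open question the Court left behind; the sketch only illustrates how access-by-certain-means limits can be stated.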

So—while Van Buren will be widely read as limiting the ability of computer owners to use the CFAA as a legal weapon, the reality—for now, at least—is that companies can still use that statute to protect their information, as long as they give careful thought to the ways they limit access to it.

We don’t get far into my interview with the authors of a widely publicized Ransomware Task Force report before I object that most of its recommendations are “boring” procedural steps that don’t directly address the ransomware scourge. That prompts a vigorous dialogue with Philip Reiner, the Executive Director of the Institute for Security and Technology (IST), the report’s sponsoring organization; Megan Stifel of the Global Cyber Alliance; and Chris Painter of The Global Forum on Cyber Expertise Foundation. And we in fact find several new and not at all boring recommendations among the nearly 50 put forward in the report.

In the news roundup, Dmitri Alperovitch has an answer to my question, “Is Putin getting a handle on U.S. social media?” Not just Putin, but every other large authoritarian government is finding ways to bring Google, Twitter, and Facebook to heel. In Russia’s case, the method is first a token fine, then a gradual throttling of service delivery that makes domestic competitors look better in comparison to the Silicon Valley brand.

Mark MacCarthy handicaps the Epic v. Apple lawsuit. The judge is clearly determined to give both sides reason to fear that the case won’t go well. And our best guess is that Epic might get some form of relief but not the kind of outcome they hoped for.

Dmitri and I marvel at the speed and consensus around regulatory approaches to the Colonial Pipeline ransomware event. It’s highly likely that the attack will spur legislation mandating reports of cyber incidents (and without any liability protection) as well as aggressive security regulation from the agency with jurisdiction – TSA. I offer a cynical Washington perspective on why TSA has acted so decisively.

Mark and I dig into the signing of Florida’s social media law, which attacks common content moderation practices, and the court challenge filed against it almost immediately. Florida will face an uphill fight, but neither of us is persuaded by the tech press’s claim that the law will be “laughed out of court.” There is a serious case to be made for almost everything in the law, with the exception of the preposterous (and probably severable) exemption for owners of Florida theme parks.

Dmitri revs up the DeHyping Machine for reports that the Russians responded to Biden administration sanctions by delivering another cyberpunch in the form of hijacked USAID emails. It turns out that the attack was garden-variety cyberespionage, that the compromise didn’t involve access to USAID networks, that it was launched before the sanctions, and that it didn’t get very far.

Jordan Schneider explains the impact of U.S. government policy on the cellular-equipment industry, and the appeal of Open RAN as a way of end-running the current incumbents. U.S. industrial policy could be transformed by the shape-shifting Endless Frontier Act.

Jordan and Dmitri explain how. I ask whether we’re seeing a deep convergence on industrial policy on both sides of the Pacific, now that President Xi has given a speech on tech policy that could have been delivered by half a dozen Republican or Democratic senators.

Finally, Dmitri reviews the bidding in cryptocurrency regulation both at the White House and in London.

In short hits, we cover:

  • The European Court of Human Rights decision squeezing but not quite killing GCHQ’s mass data interception programs and cooperation with the U.S. I offer a possible explanation for the court’s caution.
  • A court filing strongly suggesting that the Biden administration will not be abandoning a controversial Trump administration rule that requires visa applicants to register their social media handles with the U.S. government.  I speculate on why.
  • A WhatsApp decision not to threaten its users to get them to accept the company’s new privacy terms. Instead, I suspect, WhatsApp will annoy them into submission.
  • And, finally, a festival of EU competition law attacks on Silicon Valley, from Brussels to Germany and France.

And more!


Download the 364th Episode (mp3)
