The Biden administration’s effort to counter ransomware may not be especially creative, but it is comprehensive. The administration is pushing all the standard buttons on the interagency dashboard, including the usual high-level task force and a $10 million reward program (but not including hackback authority for victims, despite headlines suggesting otherwise). And all the noise seems to be having some effect, as the REvil ransomware gang’s websites have mysteriously shut down.

Our interview is with Josh Steinman, who served as the National Security Council’s cybersecurity senior director for the entire Trump administration. He offers his perspective on the issues and the personalities that drove cybersecurity policy in those chaotic years. As a bonus, Josh and I dig into his public effort to find a suitable startup, an effort we have to cut short as I start getting too close to one of the more promising possibilities.

Nick Weaver reminds us (in song, no less) that the government’s efforts to stop scourges like Trickbot have a distinct whiff of Whack-a-Mole, and the same may be true of REvil.

Maury Shenk covers the Biden administration’s belated but well-coordinated international response to China’s irresponsible Microsoft Exchange hack, including the surprising revelation that China may be back to hacking like it’s 1999 – relying on criminal hackers to serve the government’s ends.

In other China news, Maury Shenk and Pete Jeydel catalog the many ways that the current regime is demonstrating its determination to bring China’s tech sector to heel. It’s punishing Didi in particular for doing a U.S. IPO despite go-slow signals from Beijing. It’s imposing cybersecurity reviews on other companies that IPO outside China. And it seems to be pressing for competition concessions that the big tech companies would have successfully resisted a few years ago.

It was a big week for state-sponsored attacks on secure communications. Nick and I dig into the FBI and Australian Federal Police coup in selling ANOM phones to criminal gangs. Previewing an article for Lawfare, I argue that the Australian police may have to answer tough questions about whether their legal authority for the phone’s architecture really avoided introducing a systemic weakness into the phone’s security.

Law enforcement agencies around the world could face even tougher questions if they’ve been relying on NSO or Candiru, Israeli firms that compromise mobile phones for governments. Both firms have been on the receiving end of harsh forensic analyses from Amnesty International and Citizen Lab. Nick thinks the highly specific and centralized target logs are a particular problem for NSO’s claims that it doesn’t actually know the details of how its malware is deployed.

Pete Jeydel tells us that the administration is learning to walk and chew gum on cybersecurity at the same time. While coordinating pushes on Chinese and Russian hacks, it also managed to get big chunks of the government to turn in their federal cybersecurity homework on time. Pete talks us through one of those assignments, the NTIA’s paper setting minimum elements for a Software Bill of Materials.

It wouldn’t be the Cyberlaw Podcast without a brief rant on content moderation. The Surgeon General claimed this week that “Misinformation takes away our freedom to make informed decisions about our health.” He didn’t say that administration censorship would give us our freedom back, but that seems to be the administration’s confident view, as the President, no less, accuses Facebook of “killing people” by not jumping more quickly to toe the CDC’s official line.

And if you thought it would stop with social media, think again. The White House is complaining that telecom carriers should also be screening text messages that are hostile to vaccinations.

Finally, just to show that the world has truly turned upside down, Maury reminds me that a German – German! – court has fined American social media for too enthusiastically censoring a lockdown protest video.

Pete tells us what’s in the new Colorado privacy bill. Short version: it joins Virginia’s in hosing down some of California’s excesses.

And in short takes:

  • Maury explains Vietnam’s version of China’s fifty-cent army.
  • Nick explains why Psiphon is a better tool for evading Cuban censorship than the sleaze-infested Tor system.
  • Maury updates me on the European Parliament LIBE committee’s latest proposal for accepting the U.S. intelligence community’s transatlantic surrender on data flows.
  • And Pete tells us that the SEC may finally be putting the screws to companies that have been lax about reporting breaches to their investors.
  • And more!


 

Download the 371st Episode (mp3)

 

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

 

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

 

 

We begin the episode with the Biden administration’s options for responding to continued Russian ransomware outrages. Dmitri Alperovitch reprises his advice in the Washington Post that Putin will only respond to strength and U.S. pressure. I agree but raise the question whether the U.S. has the tools to enforce another set of alleged red lines, given Putin’s enthusiasm and talent for crossing them. If jumping U.S. red lines were an Olympic sport, Russia would have retired the gold by now. Dmitri reminds us that Russian cooperation against cybercrime remains a mirage. He also urges that we keep the focus on ransomware and not the more recent attempt to hack the Republican National Committee.

The Biden White House has been busy this week, or at least Tim Wu has. When Wu took a White House job as “Special Assistant to the President for Technology and Competition Policy,” some might have wondered why he did it. Now, Gus Hurwitz tells us, it looks as though he was given carte blanche to turn his recent think tank paper into an Executive Order: Biden is targeting Big Tech in a sweeping new Executive Order cracking down on anti-competitive practices. It’s a kitchen sink full of proposals, Mark MacCarthy notes, most of them more focused on regulation than competition. That observation leads to a historical diversion on the way Brandeisian competition policy aimed at smaller competitors and ended by creating bigger regulatory agencies and bigger companies to match.

We had to cover Donald Trump’s class actions against Twitter, Facebook, and Google, but if the time we devoted to the lawsuits was proportionate to their prospects for success, we’d have stopped talking in the first five seconds.

Mark gives more time to a House Republican leadership plan to break up Big Tech and stop censorship. But the plan (or, to be fair, the sketch) is hardly a dramatic rebuke to Silicon Valley – and despite that isn’t likely to get far. Divisions in both parties’ House caucuses now seem likely to doom any legislative move against Big Tech in this Congress.

The most interesting tech and policy story of the week is the Didi IPO in the U.S., and the harsh reaction to it in Beijing. Dmitri tells us that the government has banned new distributions of Didi’s ride-sharing app and opened a variety of punitive regulatory investigations into the company. This has dropped Didi’s stock price, punishing the U.S. investors who likely pressed Didi to launch the IPO despite negative signals from Beijing.

Meanwhile, more trouble looms for the tech giant, as Senate conservatives object to Didi benefiting from U.S. investment and China makes clear that Didi will not be allowed to provide the data needed to comply with U.S. stock exchange rules.

Mark and Gus explain why 37 U.S. states are taking Google to court over its Play Store rules and why, paradoxically, Google’s light hand in the Play store could expose it more to antitrust liability than Apple’s famously iron-fisted rule.

Dmitri notes the hand-wringing over the rise of autonomous drone weapons but dismisses the notion that there’s something uniquely new or bad about the weapons (we’ve had autonomous, or at least automatic, submarine weapons, he reminds us, since the invention of naval mines in the fourteenth century).

In quick hits, Gus and Dmitri offer dueling perspectives on the Pentagon’s proposal to cancel and subdivide the big DOD cloud contract.

Gus tells us about the other Fortnite lawsuit against Apple over its app policy; this one is in Australia and was recently revived.

As I suspected, Tucker Carlson has pretty much drained the drama from his tale of having his communications intercepted by NSA. Turns out he’s been seeking an interview with Putin. And no one should be surprised that the NSA might want to listen to Putin.

The Indian government is telling its courts that Twitter has lost its 230-style liability protection in that country. As a result, it looks as though Twitter is rushing to comply with Indian law requirements that it has blown off so far. Still, the best part of the story is Twitter’s appointment of a “grievance officer.” Really, what could be more Silicon Valley Woke? I predict it’s only a matter of months before the whole Valley fills with Chief Grievance Officers, after which the Biden administration will appoint one for the Executive Branch.

And, finally, I give the EU Parliament credit for doing the right thing in passing legislation that lets companies look for child abuse on their platforms. Readers may remember that the problem was EU privacy rules that threatened to end monitoring for abuse all around the world. To make sure we remembered that this is still the same feckless EU Parliament as always, the new authority was grudgingly adopted only after giving child abusers a six-month holiday from scrutiny. It was also limited to three years, after which the Parliament seems to think that efforts to stop the sexual abuse of children will no longer be needed.

And more!


Download the 370th Episode (mp3)

 

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

 

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

 

Ransomware attacks have been soaring in frequency and severity, affecting companies, government agencies, and nonprofits and leading to larger and larger ransom demands as a condition of unlocking the victim’s information systems. On June 30, 2021, the New York State Department of Financial Services (NYDFS) issued guidance on how potential victims can minimize the risk of a successful ransomware attack. While the controls are officially characterized as guidance, NYDFS makes clear that it “expects regulated companies to implement” the preventative controls “whenever possible.” Companies not regulated by NYDFS should also consider implementing the guidance: they are just as susceptible to ransomware attacks, and other regulators and courts may treat the NYDFS guidance as contributing to a general standard of reasonable security in the face of this growing cyber threat.

NYDFS reported that from January 2020 through May 2021, NYDFS-regulated companies reported 74 ransomware attacks ranging “from crippling days-long shutdowns to minor disruption from temporary loss of a few computers.” In addition, NYDFS reported a “growing number of third-party Cybersecurity Events – where ransomware attacks against a critical vendor disrupt[ed] the operations of a regulated company.”

NYDFS’s guidance highlights nine controls to prevent or respond to ransomware attacks:

  • “Email Filtering and Anti-Phishing Training”
    • Companies should provide their workforce with “recurrent phishing training, including how to spot, avoid, and report phishing attempts.” They “should also conduct periodic phishing exercises and test whether employees will click on attachments and embedded links in fake emails, and remedial training for employees as necessary.” Lastly, companies should ensure that emails are “filtered to block spam and malicious attachments/links from reaching users.”
  • “Vulnerability and Patch Management”
    • Companies should implement “a documented program to identify, assess, track, and remediate vulnerabilities on all enterprise assets within their infrastructure.” This “program should include periodic penetration testing.” In addition, companies should ensure that “[v]ulnerability management include[s] requirements for timely application of security patches and updates” and “[w]herever possible … automatic updates” should be enabled.
  • “Multi-Factor Authentication (‘MFA’)”
    • The guidance reminds regulated companies that “MFA for remote access to the network and all externally exposed enterprise and third-party applications is required by” the NYDFS Cybersecurity Regulation. The ransomware guidance recommends that companies expand the use of MFA to “[a]ll logins to privileged accounts, whether remote or internal.”
  • “Disable RDP Access”
    • Remote Desktop Protocol (RDP) access should be disabled whenever possible. However, if “RDP access is deemed necessary, then access should be restricted to only approved (whitelisted) originating sources and [companies should] require MFA as well as strong passwords.”
  • “Password Management”
    • “Regulated companies should ensure that strong, unique passwords are used.” In particular, “passwords of at least 16 characters” should be used and “commonly used passwords” should be banned. Larger organizations should “consider a password vaulting PAM (privileged access management) solution” that would require “employees to request and check out passwords.” Finally, companies should disable “password caching” wherever possible.
  • “Privileged Access Management”
    • Companies should implement “the principle of least privileged access – each user or service account should be given the minimum level of access necessary to perform the job.” In addition, they “should universally require MFA and strong passwords” for privileged accounts and “maintain and periodically audit an inventory of all privileged accounts. Privileged accounts should be used only for tasks requiring elevated privileges, and administrators should have a second non-privileged account for all other tasks such as logging into their workstation, email, drafting documents, etc.”
  • “Monitoring and Response”
    • “Regulated companies must have a way to monitor their systems for intruders and respond to alerts of suspicious activity.” As part of such efforts, companies should “implement an Endpoint Detection and Response (‘EDR’) solution, which monitors for anomalous activity. … Companies with larger and more complex networks should also have lateral movement detection and a Security Information and Event Management (SIEM) solution that centralizes logging and security event alerting.”
  • “Tested and Segregated Backups”
    • “Regulated companies should maintain comprehensive, segregated backups that will allow recovery in the event of a ransomware attack.” In addition, “at least one set of backups should be segregated from the network and offline” to ensure backups are not compromised by the attack. Finally, companies should “periodically test backups by actually restoring critical systems from backups” so that backups “actually work when needed.”
  • “Incident Response Plan”
    • Companies should implement an “incident response plan that explicitly addresses ransomware attacks,” and regularly test the plan, with involvement of senior leadership.
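As a rough illustration of the password guidance above, the two stated rules (a 16-character minimum and a ban on commonly used passwords) reduce to a simple check. This is a sketch only: the function name and the sample banned list are our own illustrative assumptions, not part of the NYDFS guidance.

```python
# Sketch of the NYDFS password guidance: at least 16 characters and
# not on a list of commonly used passwords. The function name and the
# sample banned list are illustrative assumptions.
COMMONLY_USED = {"password", "123456", "qwerty", "letmein"}

def meets_nydfs_password_guidance(password: str) -> bool:
    """Return True if the password satisfies both stated rules."""
    if len(password) < 16:
        return False
    if password.lower() in COMMONLY_USED:
        return False
    return True
```

In practice a real deployment would check against a much larger breached-password corpus rather than a short static set.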

NYDFS’ guidance also recommends against the payment of ransoms to attackers. NYDFS’ position on this issue is in line with that of the FBI, which likewise recommends against paying. Companies should also closely consider the advisories of the Treasury Department’s Office of Foreign Assets Control (OFAC) and Financial Crimes Enforcement Network (FinCEN) on the sanctions and anti-money laundering (AML) risks of making or facilitating ransomware payments, as we have previously addressed.

Finally, NYDFS’ guidance advises that “any successful deployment of ransomware on a [regulated company’s] internal network should be reported to DFS ‘as promptly as possible and within 72 hours at the latest.'” It also recommends that “any intrusion where hackers gain access to privileged accounts” should be reported.
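The 72-hour outer limit in that reporting language is easy to operationalize. A minimal sketch (the function and variable names are our own, not NYDFS’s):

```python
from datetime import datetime, timedelta

# Sketch only: computes the latest permissible NYDFS report time under
# the "within 72 hours at the latest" language. Names are illustrative
# assumptions, not taken from the guidance.
def latest_report_deadline(detected_at: datetime) -> datetime:
    return detected_at + timedelta(hours=72)
```

Note that the guidance asks for reporting “as promptly as possible”; the 72-hour mark is a ceiling, not a target.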

On July 7, 2021, Gov. Jared Polis signed into law the Colorado Privacy Act (CPA), which will go into effect on July 1, 2023. Like California’s and Virginia’s data privacy laws, the CPA aims to provide consumers with greater control over their data and enhanced transparency with respect to how their data is used. However, businesses should pay close attention to the unique nuances of the CPA, which are likely to complicate compliance strategies.

Scope

The CPA applies to data “controllers” that “conduct[] business in Colorado or produce[] or deliver[] commercial products or services that are intentionally targeted to residents of Colorado” and that:

  • “Control[] or process[] the personal data of [100,000 Colorado residents] or more during a calendar year,” or
  • “Derive[] revenue or receive[] a discount on the price of goods or services from the sale of personal data and process[] or control[] the personal data of [25,000 Colorado residents] or more.”

“Controller” is defined as “a person that, alone or jointly with others, determines the purposes for and means of processing personal data.”

The CPA covers “consumers” who are defined as “Colorado resident[s] acting only in an individual or household context.” Importantly, it does not cover individuals “acting in a commercial or employment context, as a job applicant or as a beneficiary of someone acting in an employment context.”

The CPA applies to “personal data” which is defined as “information that is linked or reasonably linkable to an identified or identifiable individual” and “[d]oes not include deidentified data or publicly available information.”

The CPA does not apply to:

  • Certain healthcare related information, including information related to HIPAA compliance;
  • “Activit[ies] involving the collection, maintenance, disclosure, sale, communication, or use of any personal data bearing on a consumer’s creditworthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living by” (1) consumer reporting agencies, (2) furnishers of information for use in a consumer report, and (3) users of a consumer report;
  • Personal data in connection with certain state and federal laws, including:
    • The Colorado Health Benefit Exchange Act;
    • The federal Gramm-Leach-Bliley Act;
    • The federal Driver’s Privacy Protection Act of 1994;
    • The federal Children’s Online Privacy Protection Act of 1998; or
    • The federal Family Educational Rights and Privacy Act of 1974.
  • “Data maintained for employment records purposes;”
  • Air carriers;
  • A national securities association registered pursuant to the Securities Exchange Act of 1934;
  • Financial institutions covered under the federal Gramm-Leach-Bliley Act and implementing regulations;
  • “Customer data maintained by” public utilities or state body “if the data are not collected, maintained, disclosed, sold, communicated, or used except as authorized by state and federal law;” and
  • “Data maintained by” state and municipal government entities (including state institutions of higher education) “if the data is collected, maintained, disclosed, communicated, and used as authorized by state and federal law for noncommercial purposes.”

Rights and Requirements

The CPA provides consumers with the following rights:

  • “[T]he right to opt out of the processing of personal data concerning the consumer for the purposes of” (1) “targeted advertising,” (2) “sale of personal data,” and (3) “profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer.”
  • “[T]he right to confirm whether a controller is processing personal data concerning the consumer and to access the consumer’s personal data.”
  • “[T]he right to obtain [their] personal data in a portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data to another entity without hindrance.”
  • “[T]he right to correct inaccuracies in the consumer’s personal data, taking into account the nature of the personal data and the purposes of processing the consumer’s personal data.”
  • “[T]he right to delete personal data concerning the consumer.”

It bears emphasizing that the CPA’s right to deletion, like Virginia’s, is broader than California’s in that it applies to any personal data concerning the consumer, not just personal data collected from the consumer herself.

The CPA prohibits a controller from “increase[ing] the cost of, or decreas[ing] the availability of, [a] product or service” due solely to a consumer’s “exercise of a right” where such action by the controller is “unrelated to feasibility or the value of a service.”

The CPA requires a controller to “inform a consumer of any action taken on a request…without undue delay and, in any event, within [45] days after receipt of the request.” The CPA permits a controller to extend the 45-day period by an additional 45 days “where reasonably necessary, taking into account the complexity and number of the requests.” A controller exercising the additional 45-day period must inform the consumer of the extension within the initial 45 days following receipt of the request in addition to the reasons for the extension.

A controller is not required to comply with a consumer’s request “if the controller is unable to authenticate the request using commercially reasonable efforts, in which case the controller may request the provision of additional information reasonably necessary to authenticate the request.”

If a controller does not take action in response to a consumer request, the controller must “inform the consumer, without undue delay and, at the latest within [45] days after receipt of the request, of the reasons for not taking action and instructions for how to appeal the decision with the controller.”

The CPA requires controllers to “establish an internal process whereby consumers may appeal a refusal to take action on a request.” Controllers must respond to an appeal within 45 days of receipt with the ability to extend for an additional 60 days “taking into account the complexity and number of requests serving as the basis for the appeal.” A controller must inform a consumer of the extension within 45 days of receiving the appeal “together with the reasons for the delay.” A controller is also required to inform consumers of their right to contact the Colorado Attorney General regarding the result of their appeal.
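The request and appeal timelines above stack in a predictable way: 45 days plus an optional 45-day extension for requests, and 45 days plus an optional 60-day extension for appeals. A hedged sketch of the arithmetic (function names are illustrative assumptions, not statutory terms):

```python
from datetime import date, timedelta

# Sketch of the CPA response timelines: 45 days to act on a consumer
# request, extendable by a further 45 days; 45 days to decide an
# appeal, extendable by a further 60 days. Names are illustrative.
def request_deadline(received: date, extended: bool = False) -> date:
    return received + timedelta(days=45 + (45 if extended else 0))

def appeal_deadline(received: date, extended: bool = False) -> date:
    return received + timedelta(days=45 + (60 if extended else 0))
```

Remember that invoking either extension requires notifying the consumer, with reasons, within the initial 45-day window.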

The CPA requires controllers to adhere to the following requirements:

  • “[P]rovide consumers with a reasonably accessible, clear, and meaningful privacy notice” including:
    • “The categories of personal data collected or processed by the controller or a processor1;”
    • “The purposes for which the categories of personal data are processed;”
    • “How and where consumers may exercise their rights [under the CPA], including the controller’s contact information and how a consumer may appeal a controller’s action with regard to the consumer’s request;”
    • “The categories of personal data that the controller shares with third parties, if any;”
    • “The categories of third parties, if any, with whom the controller shares personal data.”
  • “[C]learly and conspicuously disclose the sale [of personal data to third parties] or processing [of personal data for targeted advertising], as well as the manner in which a consumer may exercise the right to opt out of the sale or processing.”
  • “[S]pecify the express purposes for which personal data are collected and processed.”
  • Ensure “collection of personal data [is] adequate, relevant, and limited to what is reasonably necessary in relation to the specified purposes for which the data are processed.”
  • “[N]ot process personal data for purposes that are not reasonably necessary to or compatible with the specified purposes for which the personal data are processed, unless the controller first obtains the consumer’s consent.”
  • “[T]ake reasonable measures to secure personal data during both storage and use from unauthorized acquisition.” Such measures “must be appropriate to the volume, scope, and nature of the personal data processed and the nature of the business.”
  • “[N]ot process personal data in violation of state or federal laws that prohibit unlawful discrimination against consumers.”
  • “[N]ot process a consumer’s sensitive data without first obtaining the consumer’s consent or, in the case of the processing of personal data concerning a known child, without first obtaining consent from the child’s parent or guardian.” “Sensitive data” is defined as (1) “[p]ersonal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status,” (2) “[g]enetic or biometric data that may be processed for the purpose of uniquely identifying an individual,” or (3) “[p]ersonal data from a known child.” “‘Child’ means an individual under 13 years of age.”

In addition, the CPA prohibits controllers from “conduct[ing] processing that presents a heightened risk of harm to a consumer without conducting and documenting a data protection assessment of” such data processing activities. The data protection assessment requirement only applies to “personal data acquired on or after” the CPA’s effective date and to processing activities created or generated after July 1, 2023. The requirement is not retroactive. “Processing activities that present a heightened risk of harm to a consumer” include:

  • “Processing personal data for purposes of targeted advertising or for profiling if that profiling presents a reasonably foreseeable risk of:

(I) [u]nfair or deceptive treatment of, or unlawful disparate impact on, consumers; (II) [f]inancial or physical injury to consumers;

(III) [a] physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers if the intrusion would be offensive to a reasonable person; or

(IV) [o]ther substantial injury to consumers;”

  • “Selling personal data”; and
  • “Processing sensitive data.”

As part of the data protection assessment, controllers must “identify and weigh the benefits that may flow, directly and indirectly, from the processing to the controller, the consumer, and other stakeholders, and the public against potential risks to the rights of the consumer associated with the processing, as mitigated by the safeguards that the controller can employ to reduce the risks.” Controllers are directed to “factor into th[e] assessment the use of de-identified data and the reasonable expectations of consumers, as well as the context of processing and the relationship between the controller and the consumer whose personal data will be processed.”

Data processing assessments must be made available to the Colorado Attorney General upon request.

Enforcement and Penalties

The Colorado Attorney General and Colorado District Attorneys are granted exclusive authority to enforce the CPA. Violations of the CPA will constitute a deceptive trade practice carrying penalties of up to $2,000 per violation, but not more than $500,000 per series of violations. In addition to seeking monetary penalties, the Colorado Attorney General and Colorado District Attorneys are able to seek injunctive relief to enjoin violations of the CPA. Importantly, the CPA explicitly excludes a private right of action.

Until January 1, 2025, controllers will receive an opportunity to cure violations within 60 days before facing an enforcement action.

Rulemaking

The Colorado Attorney General is also authorized to promulgate rules under the CPA. By July 1, 2023, the Attorney General must promulgate rules for universal opt-out mechanisms related to the processing of personal data for targeted advertising or the sale of personal data. By January 1, 2025, the Attorney General may adopt additional rules “govern[ing] the process of issuing opinion letters and interpretive guidance to develop an operational framework for business that includes a good faith reliance defense of an action that may otherwise constitute a violation of [the CPA].” Such rules must become effective by July 1, 2025.

Conclusion

While July 1, 2023 may seem far away, businesses should begin familiarizing themselves with the CPA and address necessary compliance measures at the same time they institute compliance measures for the Virginia Consumer Data Protection Act and the California Privacy Rights Act, which take effect January 1, 2023.

 

1 Processor is defined as “a person that processes personal data on behalf of a controller.”

We begin the episode with a review of the massive Kaseya ransomware attack.

Dave Aitel digs into the technical aspects while Paul Rosenzweig and Matthew Heiman explore the policy and political implications. Either way, the news is bad.

Then we come to the Florida ‘deplatforming’ law, which a Clinton appointee dispatched in a cursory opinion last week. I’ve been in a small minority who thinks the law, far from being a joke, is likely to survive (at least in part) if it reaches the Supreme Court. Paul challenges me to put my money where my mouth is. Details to be worked out, but if a portion of the law survives in the top court, Paul will be sending a thousand bucks to a Trumpista nonprofit. If not, I’ll likely be sending my money to the ACLU.

Surprisingly, our commentators mostly agree that both NSA and Tucker Carlson could be telling the truth, despite the insistence of their partisans that the other side must be lying. NSA gets unaccustomed praise for its … wait for it … rapid and PR-savvy response. That’s got to be a first.

Paul and I conclude that Maine, having passed in haste the strongest state facial recognition ban yet, will likely find itself repenting at leisure.

Matthew decodes Margrethe Vestager’s warning to Apple against using privacy, security to limit competition.

And I mock Apple for claiming to protect privacy while making employees wear body cams to preserve the element of surprise at the next Apple product unveiling. Not to mention the 2-billion-person asterisk attached to Apple’s commitment to privacy.

Dave praises NSA for its stewardship of a popular open source reverse engineering tool.

And everyone has a view about cops using YouTube’s crappy AI takedown engine to keep people from posting videos of their conversations with cops.

And more!


Download the 369th Episode (mp3)

 

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

This episode offers an economical overview of the six antitrust reform bills reported out of the House Judiciary Committee last week. Michael Weiner and Mark MacCarthy give us the top line for all six (though only four would make substantial new policy). We then turn quickly to the odd-couple alliances supporting and opposing the bills, including my brief cameo appearance, in Rep. Jim Jordan’s opposition, on the gratifying ground (ok, among others) that Microsoft had never explained its suppression of my recent LinkedIn post. On the whole, I think Rep. Jordan is right; there’s very little in these bills that will encourage a diversity of viewpoints on social media or among its “trust and safety” bureaucrats.

Nick Weaver trashes the FBI for its prosecution of Anming Hu. I’m more sympathetic, but neither of us thinks this will end well for the Bureau or the China Initiative.

Adam Candeub makes his second appearance and does a fine job unpacking three recent decisions on the scope of Section 230. The short version: Facebook only partly beat the rap for sex trafficking in the Texas Supreme Court; Snapchat got its head handed to it in the speed filter case; and all the Socials won but faced persuasive dissents in a case over assistance to terrorist groups.

The long version: Silicon Valley has sold the courts a bill of goods on Section 230 for reasons that sounded good when the Internet was shiny and democratic and new. Now that disillusion has set in, the sweeping subsidy conferred by the courts is looking a lot less plausible. The wheels aren’t coming off Section 230 yet, but the paint is peeling, and Big Tech’s failure to get their reading of the law blessed by the Supreme Court ten years ago is going to cost them – mainly because their reading is inconsistent with some basic rules of statutory interpretation.

Nick and I engage on the torture indictments of executives who sold internet wiretapping capabilities to the Qaddafi regime.

Mark is unable to hose down my rant over Canada’s bone-stupid effort to impose Canadian content quotas on the internet and to write an online hate speech law of monumental vagueness.

And in closing, Nick and I bid an appropriately raucous and conflicted adieu to the Hunter Thompson of cybersecurity: John McAfee.

And more!


Download the 368th Episode (mp3)

 


 

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

We couldn’t avoid President Biden’s trip to Europe this week. He made news (but only a little progress) on cybersecurity at every stop. Nick Weaver and I dig into the President’s consultations with Vladimir Putin, which featured veiled threats and a modest agreement on some sort of continuing consultations on protecting critical infrastructure.

Jordan Schneider sums up the G7 and NATO statements aligning with U.S. criticisms of China.

And our newest contributor, Michael Ellis, critiques the EU-U.S. consultations on technology, which featured a complete lack of U.S. resolve on getting an outcome on transatlantic data flows that would preserve US intelligence capabilities.

Michael also recaps the latest fallout from the Colonial Pipeline ransomware shutdown – new regulatory initiatives from TSA and a lot of bipartisan regulatory proposals in Congress. I note the very unusual (or, maybe, all too usual) meaning given to “bipartisanship” on Capitol Hill.

Nick isn’t exactly mourning the multiple hits now being suffered by ransomware insurers, from unexpected losses to the ultimate in concentrated loss – gangs that hack the insurer first and then systematically extort all its ransomware insurance customers.

Jordan sums up China’s new data security law. He suggests that, despite the popular reporting on the law, which emphasizes the government control narrative, the motive for the law may be closer to the motive for data protection laws in the West – consumer suspicion over how private data is being used. I’m less convinced, but we have a nice discussion of how bureaucratic imperatives and competition work in the People’s Republic of China.

Michael and Nick dig into the White Paper on FISA applications published by the outgoing chairman of the Privacy and Civil Liberties Oversight Board. Notably, in my mind, the White Paper does not cast doubt on the Justice Department’s rebuttal to a Justice Inspector General’s report suggesting that the FISA process is riddled with error. The paper also calls urgently for renewal of the expired FISA section 215 authority and suggests several constructive changes to the FISA paperwork flow.

In quick hits, Michael brings us up to date on the FCC’s contribution to technology decoupling from China: a unanimous vote to exclude Chinese companies from the U.S. telecom infrastructure and a Fifth Circuit decision upholding its decision to exclude Chinese companies from subsidized purchases by U.S. telecom carriers.  And Jordan reminds us just how much progress China has made in exploring space.

And more!


Download the 367th Episode (mp3)

 


 


 

 

Just as retail stores, bars, restaurants, and entertainment venues in New York City have been authorized to relax COVID restrictions, they will soon have to confront a new set of requirements—this time focused on their collection of customers’ biometric information. On July 9, 2021, New York City’s new law addressing the collection and use of biometric identifier information will go into effect. The NYC Biometric Law is part of a broader trend of state and local governments adopting laws to regulate business’ collection and use of biometric information.

The NYC Biometric Law requires “[a]ny commercial establishment that collects, retains, converts, stores or shares biometric identifier information of customers” to provide notice of such practices by “placing a clear and conspicuous sign near all of the commercial establishment’s customer entrances notifying customers in plain, simple language…that customers’ biometric identifier information is being collected, retained, converted, stored or shared, as applicable.” The sign must adhere to “a form and manner prescribed by the commissioner of consumer and worker protection by rule.”

In addition, the NYC Biometric Law prohibits “sell[ing], leas[ing], trad[ing], [or] shar[ing] in exchange for anything of value or otherwise profit[ing] from the transaction of biometric identifier information.”

“Commercial establishment” is defined as “a place of entertainment, a retail store, or a food and drink establishment.” The law does not apply to financial institutions.

“Biometric identifier information” is defined as “a physiological or biological characteristic that is used by or on behalf of a commercial establishment, singly or in combination, to identify, or assist in identifying, an individual, including, but not limited to: (i) a retina or iris scan, (ii) a fingerprint or voiceprint, (iii) a scan of hand or face geometry, or any other identifying characteristic.” Significantly, the law’s notice requirement does not apply “to [b]iometric identifier information collected through photographs or video recordings, if (i) the images or videos collected are not analyzed by software or applications that identify, or that assist with the identification of, individuals based on physiological or biological characteristics, and (ii) the images or video are not shared with, sold or leased to third-parties other than law enforcement agencies.”

The NYC Biometric Law contains a private right of action. Private plaintiffs may seek both monetary damages and injunctive relief. For violations of the notice requirement only, commercial establishments are provided with a 30-day cure period. For each violation of the notice requirement, businesses may be liable for damages of $500. Each negligent violation of the prohibition on selling, leasing, trading, or sharing biometric identifier information for anything of value or profit may result in damages of $500, while each intentional or reckless violation may lead to damages of $5,000. Plaintiffs may also recover reasonable attorney’s fees and costs.
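For readers who want a rough sense of how those statutory damages stack up, here is a minimal, hypothetical Python sketch. The dollar figures come from the law as described above; the function name, its inputs, and its simplifying assumptions (attorney’s fees, costs, and the 30-day cure period are ignored) are our own illustration, not anything prescribed by the statute.

```python
def estimated_exposure(notice_violations: int,
                       negligent_sales: int,
                       intentional_sales: int) -> int:
    """Rough statutory-damages exposure under the NYC Biometric Law.

    Ignores attorney's fees and costs, and assumes notice violations
    were not cured within the 30-day cure period.
    """
    return (notice_violations * 500      # $500 per uncured notice violation
            + negligent_sales * 500      # $500 per negligent sale/share/trade
            + intentional_sales * 5_000) # $5,000 per intentional or reckless one

# For example, 10 uncured notice violations plus 2 intentional sales:
print(estimated_exposure(10, 0, 2))  # → 15000
```

Even this back-of-the-envelope arithmetic shows how per-violation damages compound quickly for a business with many customers, before fees and costs are added.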

Paul Rosenzweig lays out a report that is much more careful and well-written, and a policy catastrophe in the making. The main problem? It tries to turn one of the most divisive issues in American life into a problem to be solved by technology. Apparently because that has worked so well in areas like content suppression. In fact, I argue, the report will be seen by many, especially in the center and on the right, as an effort to impose proportional representation quotas by stealth in a host of places that have never been the objects of such policies before. Less controversial, but only a little, is the U.S. government’s attempt to make government data available for training more AI algorithms. Jane more or less persuades me that this effort too will end in tears or stasis.

In cheerier news, the good guys got a couple of surprising wins this week. While encryption and bitcoin have posed a lot of problems for law enforcement in recent years, the FBI has responded with imagination and elan, at least if we can judge by two stories from last week. First, Nick Weaver takes us through the laugh-out-loud facts behind a government-run encrypted phone for criminals, complete with influencers, invitation-only membership, and nosebleed pricing to cement the phone’s exclusive status. Jane Bambauer unpacks some of the surprisingly complicated legal questions raised by the FBI’s creativity.

Paul Rosenzweig lays out the much more obscure facts underlying the FBI’s recovery of much of the ransom paid by Colonial Pipeline. There’s no doubt that the government surprised everyone by coming up with the private key controlling the bitcoin account. We’d like to celebrate the ingenuity behind the accomplishment, but the FBI isn’t actually explaining how it pulled it off, probably because it hopes to do the same thing again and can’t if it blows the secret.

The Biden administration is again taking a shaky and impromptu Trump policy and giving it a sober interagency foundation.  This time it’s the TikTok and WeChat bans; these have been rescinded. But a new process has been put in place that could restore and even expand those bans in a matter of months. Paul and I disagree about whether the Biden administration will end up applying the Trump policy to TikTok or WeChat or to a much larger group of Chinese apps.

For comic relief, Nick regales us with Brian Krebs’s wacky story of the FSB’s weird and counterproductive attempt to secure communications to the FSB’s web site.

Jane and I review the latest paper by Bruce Schneier (and Henry Farrell) on how to address the impact of technology on American democracy. We are not persuaded by its suggestion that our partisan divide can best be healed by more understanding, civility, and aggressive prosecutions of Republicans.

Finally, everyone confesses to some confusion about the claim that the Trump Justice Department breached norms in its criminal discovery motions that turned up records relating to prominent Democratic congressmen and at least one Trump administration official.

Best bet: this flap will turn out to be less interesting the more we learn. But I renew my appeal, this time aimed at outraged Democrats, for more statutory guardrails and safeguards against partisan misuse of national security authorities. Because that’s what we’ll need if we want to keep those authorities on the books.

And more!


Download the 366th Episode (mp3)

 


 


The Biden administration is pissing away one of the United States’ most important counterterrorism intelligence programs. At least that’s my conclusion from this episode’s depressing review of the administration’s halting and delusion-filled approach to the transatlantic data crisis. The EU thinks time is on its side, and it’s ignoring Jamil Jaffer’s heartfelt plea to be a better ally in the face of Russian and Chinese pressure. Every day, Silicon Valley companies whose data stores in the U.S. have been a goldmine for counterterrorism are feeling legal pressure to move that data to Europe. Those companies care little whether the U.S. gets good intelligence from its section 702 requests, at least compared to the prospects of massive fines and liability in Europe. So, unless the administration creates a countervailing incentive, the other actors will simply present Washington with a fait accompli. The Biden administration, like the Trump administration before it, seems unable to grasp the need for action. When Trump was in charge, we could call him incompetent. When we wake up to what we’ve lost under Biden, that’s what we’ll call him, too.

For companies struggling with their role in this global drama, Charles Helleputte has moderately good news. The European Commission, contrary to the dogmatic approach of the data protection agencies, has opened a door for transfers using the new standard contractual clauses. If your data has not been requested by the U.S. under section 702 or similar intelligence programs and you can offer good reason to think they won’t be requested in the future, you could avoid the hammer of a data export ban.

In other news, Jamil and I cross swords on whether the Colonial Pipeline hack should have ended TSA’s light-touch oversight of pipeline cybersecurity.

And Nate Jones and I dig deep into the state trend toward regulating police access to DNA ancestry databases. After some fireworks, we come close to agreement that some state law provision on database access is inevitable and workable, but that the Maryland law is so hostile to solving brutal crimes with DNA searches that it is hard to distinguish from a ban.

Jamil explains the Biden administration’s decision to provide a new foundation for the Trump ban on investment in Chinese military companies. Treasury will take the program away from DOD, which had handled its responsibilities with the delicacy of Edward Scissorhands.

Nate limbers up the DeHype Machine to put in perspective DOJ’s claim to be giving ransomware hacks the same priority as terrorism. Jamil takes on autonomous drones and pours cold water on the notion that DOD will be procuring some of its drones from China.

In a moment of weakness I fail to attack or even mock the UN GGE’s latest report on norms for cyberconflict.

And in a series of quick hits:

  • Jamil reviews Facebook’s latest antitrust problems in the EU and UK.
  • I bring back the “throuple” Congresswoman, whose failed pivot from abuser of power to victim of revenge porn has just cost her over $100,000.
  • In case you haven’t heard, Facebook might let Trump come back in January 2023, and his blog page has shut down for good.
  • The European Commission has proposed a trusted and secure Digital Identity for all Europeans but Charles thinks there’s less there than meets the eye.
  • And Nigeria has suspended Twitter after the platform shut down the President’s account for obliquely threatening military action against secessionists.
  • And more!


Download the 365th Episode (mp3)

 


 
