Jordan Schneider rejoins us after too long an absence to summarize the tech policy coming out of Beijing today: Any Chinese government agency with a beef against a tech company has carte blanche to at least try it out. From Didi and others being told to stop taking on subscribers, to an end to Western IPOs, to forced contributions to common welfare, China’s beefs with Big Tech sound a lot like those in the West (well, except for the complaints about AI-enabled censorship). What’s different is that China has freed up its agencies to actually throw sand in the gears of technology businesses. Jordan and I explore the downside of empowering agencies this way. First, it makes the Chinese government responsible for an enormous and hard-to-govern part of the economy, as the government’s problems with the overvalued property sector show. Second, it creates opportunities for companies that are better at politics than customer service to cripple their competitors.

Meanwhile, the U.S. government is trying out its own version of letting a thousand regulatory flowers bloom. Michael Weiner unpacks the new, amended complaint in FTC v. Facebook and concludes that the FTC has done a plausible job of meeting the objections that led the district court to throw out the first complaint.

Then he tells us about the five buckets of sand the Biden administration is dumping into technology merger law in the hope of slowing a massive acquisition boom, among them: no longer granting early termination, insisting on future merger approvals in standard consent agreements, issuing “close at your own peril” letters before the agencies have finished their review, and replacing the Vertical Merger Guidelines issued in June 2020 with, uh, nothing.

Pete Jeydel takes us on a tour of Project Raven and the deferred prosecution agreements imposed on three former U.S. government hackers who sold their services too freely to the UAE. The cases raise several novel legal issues, but one of the mysteries is why the prosecutors ultimately settled the cases without jail time. My guess? Graymail.

In quick hits and updates, we note that TikTok faces an Irish General Data Protection Regulation (“GDPR”) probe over children’s data and – more significantly – its transfers of data to China. What’s most remarkable to me is how long TikTok has staved off this scrutiny. Who says Donald Trump was bad for Chinese tech companies?

President Biden has nominated a 5th Federal Trade Commission Commissioner. Alvaro Bedoya is a Georgetown Law professor who writes about privacy and face recognition. There’s a lot of dumb stuff out there about AI bias and face recognition, but I’m pleased to say that it doesn’t look as though Prof. Bedoya wrote any of it.

The special prosecutor for Russia-Russia-Russia-gate has indicted a Perkins Coie lawyer for lying to the FBI general counsel while turning over a bunch of bogus evidence of Donald Trump’s ties to Russia. Turns out, I know all of the principals in this drama, and it’s uncomfortable.

Captain Obvious, speaking for the FBI, acknowledged that there is “no indication” Russia has cracked down on ransomware gangs after President Biden yelled at Vladimir Putin about them.

The 4th Circuit has tossed Wikimedia’s money-wasting lawsuit against the National Security Agency for its collection of overseas intelligence in the U.S.

And Bolsonaro’s ban on social media censorship of politicians has been doubly overturned, by the Brazilian Senate and by Brazil’s Supreme Court, leaving Bolsonaro’s decree in the same place as Florida’s (and, probably soon, Texas’s) effort to do something similar.

And More!


Download the 375th Episode (mp3)

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

 

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

 

The district court has ruled in the lawsuit between Epic and Apple over access to the Apple app store. Apple is claiming victory and Epic is appealing. But Apple’s victory is not complete, and may have a worm at its core. Jamil Jaffer explains.

Surprised that ransomware gangs REvil and Groove are back – and thumbing their noses at President Biden? Dmitri Alperovitch isn’t. He explains why U.S. ransomware policy has failed so far.

WhatsApp has finally figured out how to let users encrypt their chat backups in the cloud, to the surprise of many users who didn’t realize their backups weren’t encrypted.

Speaking of the encryption debate, Dmitri notes that Proton Mail joined the scrum this week, in a way it no doubt regrets. After all its bragging that mail users’ privacy is “protected by Swiss law,” Proton Mail disclosed that Swiss law can be surprisingly law enforcement friendly. Responding to a French request through Europol, Swiss authorities ordered the service to collect metadata on a particular account and overrode what had been seen as a Swiss legal requirement that users be notified promptly of such actions.

Is China suffering from envy of Russia’s Main Intelligence Directorate (“GRU”)? I ask and David Kris answers: It sure looks that way, as China has begun trying to rally Chinese in America to support Chinese government positions on things like the origin of COVID. So far, China’s record of success is as dismal as the GRU’s, but I argue that it poses a bigger problem for the body politic and for Chinese American interest groups.

Who’d have guessed? Turns out that the EU’s always-flaky General Data Protection Regulation (“GDPR”) provision against automated decision making that affects people isn’t just a charming nostalgia act; it’s yet another reason for Europe to be left behind in the technology race. Jamil reports on a high-powered UK task force recommendation that the Brits dump the provision in order to allow for the growth of an AI industry.

David and I debate the meaning of Brazilian President Jair Bolsonaro banning social networks from removing political posts.

And in a few quick hits:

  • I praise the Biden administration (faintly) for finally kicking off serious negotiations with the EU about transatlantic data transfer.
  • Dmitri dissects the undiplomatic speech of China’s ambassador to the U.S.
  • David downloads the inside poop on smart toilets. Among other things, they’ll be identifying us with, uh, let’s just call it the opposite of facial recognition. Which raises the question: How long before “woke” toilet engineers start canceling toilet users guilty of the microaggression of leaving the toilet seat up?
  • And Dmitri offers a solution for the dual European Community (“EC”) encryption story.

And More!


Download the 374th Episode (mp3)

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Back at last from hiatus, the podcast finds a host of hot issues to cover. Matthew Heiman walks us through all the ways that China and the US found to get in each other’s way on technology. China’s new data security and privacy laws take effect this fall, and in keeping with a longstanding theme of the podcast – that privacy law is mostly about protecting the privilege of the powerful – we muse on the ways that legal innovations in the West have empowered China’s rulers. The SEC is tightening the screws on Chinese companies that want to list on American exchanges. Meanwhile, SenseTime is going forward with a $2 billion IPO in Hong Kong despite being subject to the stiffest possible Commerce Department sanctions. Talk about decoupling!

In Washington, remarkably, a bipartisan breach notification law is moving through both House and Senate. Michael Ellis explains the unorthodox (but hardly unprecedented) path the law is likely to take – a “preconference” followed by attachment to the defense authorization bill scheduled to pass this fall.

I ask Brian Egan for the tech fallout from the fall of the U.S.-backed regime in Afghanistan. All things considered, it’s modest. Despite hand-wringing over data left behind, that data may not be really accessible. Google isn’t likely to turn over government emails to the new regime, if only because U.S. sanctions make that legally risky. The Taliban’s use of WhatsApp is likely to suffer from the same sanctions barrier. I predict a Taliban complaint that it’s being forced to run a thirteenth-century regime with twelfth-century technology.

Meanwhile, Texas Republicans are on a roll, as Dems forced to return to the State House sit on their hands. They’ve adopted a creative and aggressive anti-abortion law that has proven a challenge to tech companies, which responded by canceling tech services for pro-life groups and promising to defend gig workers who are caught up in litigation. Texas has kept pace, adopting a bill that limits Silicon Valley censorship of political speech; it raises many of the same issues as the Florida statute, but without the embarrassing prostration before the Disney theme park empire. I ask whether Texas could have used the same tactics for its interpretation of Section 230 that it used in the abortion bill – authorizing private suits but not government enforcement. Such tactics work when there is a real possibility that the Supreme Court will overturn some settled circuit rulings, and Section 230 is ripe for exactly that.

Matthew Heiman and I debate whether the Justice Department’s dropping of several Chinese visa fraud cases heralds a retrenchment in Justice’s China Initiative.

Michael and I dig into the Apple decision to alienate the Guardians of Privacy in an effort to do something about child sex abuse material on iPhones – and Apple’s recent decision to alienate the rest of the country by casting doubt on whether it would ever do something about child sex abuse material on its phones.

Finally, in quick hits, Brian doubts the significance of claims that the Israeli government is launching an investigation of NSO Group over spyware abuse. Michael picks apart the Cyberspace Solarium Commission’s report card on Congress’s progress implementing its recommendations. And Brian highlights the UK’s new and much tougher version of CFIUS, the National Security and Investment Act 2021. I turn that into career advice for our listeners.

And More!


Download the 373rd Episode (mp3)

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

Blockchain takes over the Cyberlaw Podcast again! This episode is a roundtable discussion of the various new regulations brought into effect in the latter part of 2020 and the first half of 2021. There was a flurry of last-minute rulemakings at the end of the previous administration, many of which spilled over into 2021; an expert panel of lawyers from Steptoe’s blockchain and cryptocurrency practices takes the audience through them.

First, the Treasury Department, through the Financial Crimes Enforcement Network (FinCEN) and the Office of Foreign Assets Control (OFAC), has been very busy in the first half of 2021. Evan Abrams discusses the various rulemakings that have been implemented and how they affect the blockchain and cryptocurrency space.

One proposed rulemaking, the Self-Hosted Wallet NPRM, generated a significant amount of pushback from the industry. All the activity seemed to have an effect, as several extensions to the comment period were granted, and the final rule has not yet been implemented. Evan walks us through the effect that the NPRM would have on the industry, and why the pushback was so strong. Evan also gives us an update on the status of the Travel Rule NPRM that was introduced in late 2020. The comment periods for both rules have closed, and Evan expects that final rules could be issued shortly (or not).

Evan and host Alan Cohn also discuss recent OFAC enforcement actions regarding sanctions compliance. There were two main blockchain enforcement actions, one coming at the end of 2020 against BitGo, and one early this year against BitPay, making these the first enforcement actions by OFAC against blockchain companies. Both involve US companies with individuals using their platforms who are located in comprehensively sanctioned jurisdictions. This shows an increased focus on digital assets and their potential use by sanctioned persons or by persons in sanctioned jurisdictions.

In other news, Matt Kulkin discusses all things Commodity Futures Trading Commission (CFTC), including the jurisdictional reach of the CFTC and recent comments by a CFTC commissioner regarding decentralized finance (DeFi).

Matt, a former CFTC division director, provides a high-level overview of CFTC jurisdiction over cryptocurrencies. The CFTC takes a broad interpretation of the definition of commodity, and has asserted jurisdiction over cryptocurrencies. However, where customers receive actual delivery of the cryptocurrency (or other commodity) within a specific amount of time (called “spot trading”), trading is not subject to CFTC jurisdiction. By meeting this time window and not offering margin trading or leverage, cryptocurrency trading platforms in the US generally don’t need to register with the CFTC as a trading platform.

Matt describes how the CFTC is often one of the first US regulators to issue published statements about how emerging crypto assets fit into existing CFTC regulation. To this end, Matt describes the comments relating to DeFi made by Commissioner Dan Berkovitz. Berkovitz commented that he considers unlicensed DeFi markets for derivatives to be a bad idea, and does not see how they are legal under the Commodity Exchange Act. Matt breaks down the arguments for and against this view.

Next, Lizzie Baird brings us up to date with the way in which the Securities and Exchange Commission (SEC) is thinking about crypto-assets generally.

Lizzie, a former SEC deputy division director, describes the way in which the SEC applies the Howey test to crypto-assets in order to determine if a token is a security, and therefore whether the SEC has jurisdiction. The SEC has issued a framework to help token issuers determine whether they are issuing a security. The SEC has previously issued no-action letters to entities that issue stable-value coins used solely within closed platforms, such as TurnKey Jet. Lizzie notes that the SEC has justified a relatively ambiguous application of securities regulation in an effort to promote innovation (and preserve the ability to judge token projects on a case-by-case basis), with clearer guidelines being drawn as blockchain technology matures.

Lizzie notes that this may have changed in the recent SEC enforcement action against CoinSchedule.com, a UK company that the SEC determined had violated anti-touting laws. This enforcement action appears to show that from the SEC’s perspective, the evolution of the Howey framework has ended, and that the SEC is becoming more creative in its enforcement of securities laws in order to go after any conduct that it believes to be a violation. CoinSchedule also demonstrates the SEC’s view on extending jurisdiction to non-US entities that have US customers.

In a roundup of other notable securities-related activities, Alan highlighted:

  • SEC v Ripple: This is an interesting case that goes beyond tokens that conducted initial coin offerings (“ICOs”), with the SEC taking new positions that it eschewed in previous enforcement cases. For example, in this case the SEC uses actions that took place before the 21(a) Report on the DAO token in July 2017, which the SEC had previously asserted put the industry on notice that crypto assets could be categorized as securities. The parties are locked in discovery battles, with the SEC trying to rely on ostensibly attorney-client privileged information, and Ripple gaining the ability to depose SEC officials from the 2018 timeframe.
  • BlockFi: Five state securities regulators issued cease-and-desist letters against BlockFi for marketing interest-bearing cryptocurrency deposit accounts as the regulators allege they are securities.
  • The SEC seems to also be specifically targeting trading platforms, as shown by documents recently filed by Circle regarding the Poloniex trading platform that Circle previously owned, in connection with Circle’s efforts to go public via a SPAC transaction.

Finally, the roundtable discusses what they believe to be upcoming trends in the industry, including:

  • Lizzie and Matt discuss SEC and CFTC enforcement focus moving forward, including potential DeFi enforcement trends.
  • Evan outlines the potential next steps for the Treasury, FinCEN, OFAC, and the Financial Action Task Force (FATF) relating to DeFi and virtual asset service providers.

And More!


 

Download the 372nd Episode (mp3)

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

The Biden administration’s effort to counter ransomware may not be especially creative, but it is comprehensive. The administration is pushing all the standard buttons on the interagency dashboard, including the usual high-level task force and a $10 million reward program (but not including hackback authority for victims, despite headlines suggesting otherwise). And all the noise seems to be having some effect, as the REvil ransomware gang’s websites have mysteriously shut down.

Our interview is with Josh Steinman, who served as the National Security Council’s cybersecurity senior director for the entire Trump administration. He offers his perspective on the issues and the personalities that drove cybersecurity policy in those chaotic years. As a bonus, Josh and I dig into his public effort to find a suitable startup, an effort we have to cut short as I start getting too close to one of the more promising possibilities.

Nick Weaver reminds us (in song, no less) that the government’s efforts to stop scourges like Trickbot have a distinct whiff of Whack-a-Mole, and the same may be true of REvil.

Maury Shenk covers the Biden administration’s belated but well-coordinated international response to China’s irresponsible Microsoft Exchange hack, including the surprising revelation that China may be back to hacking like it’s 1999 – relying on criminal hackers to serve the government’s ends.

In other China news, Maury Shenk and Pete Jeydel catalog the many ways that the current regime is demonstrating its determination to bring China’s tech sector to heel. It’s punishing Didi in particular for doing a U.S. IPO despite go-slow signals from Beijing. It’s imposing cybersecurity reviews on other companies that IPO outside China. And it seems to be pressing for competition concessions that the big tech companies would have successfully resisted a few years ago.

It was a big week for state-sponsored attacks on secure communications. Nick and I dig into the FBI and Australian Federal Police coup in selling ANOM phones to criminal gangs. Previewing an article for Lawfare, I argue that the Australian police may have to answer tough questions about whether their legal authority for the phone’s architecture really avoided introducing a systemic weakness into the phone’s security.

Law enforcement agencies around the world could face even tougher questions if they’ve been relying on NSO or Candiru, Israeli firms that compromise mobile phones for governments. Both firms have been on the receiving end of harsh forensic analyses from Amnesty International and Citizen Lab. Nick thinks the highly specific and centralized target logs are particularly a problem for NSO’s claims that it doesn’t actually know the details of how its malware is deployed.

Pete Jeydel tells us that the administration is learning to walk and chew gum on cybersecurity at the same time. While coordinating pushes on Chinese and Russian hacks, it also managed to get big chunks of the government to turn in their federal cybersecurity homework on time. Pete talks us through one of those assignments, the NTIA’s paper setting minimum elements for a Software Bill of Materials.

It wouldn’t be the Cyberlaw Podcast without a brief rant on content moderation. The Surgeon General claimed this week that “Misinformation takes away our freedom to make informed decisions about our health.” He didn’t say that administration censorship would give us our freedom back, but that seems to be the administration’s confident view, as the President, no less, accuses Facebook of “killing people” by not jumping more quickly to toe the CDC’s official line.

And if you thought it would stop with social media, think again.  The White House is complaining that telecom carriers also should be screening text messages that are hostile to vaccinations.

Finally, just to show that the world has truly turned upside down, Maury reminds me that a German – German! – court has fined American social media for too enthusiastically censoring a lockdown protest video.

Pete tells us what’s in the new Colorado privacy bill. Short version: it joins Virginia’s in hosing down some of California’s excesses.

And in short takes:

  • Maury explains Vietnam’s version of China’s fifty-cent army.
  • Nick explains why Psiphon is a better tool for evading Cuban censorship than the sleaze-infested Tor system.
  • Maury updates me on the European Parliament LIBE committee’s latest proposal for accepting the U.S. intelligence community’s transatlantic surrender on data flows.
  • And Pete tells us that the SEC may finally be putting the screws to companies that have been lax about reporting breaches to their investors.
  • And more!


 

Download the 371st Episode (mp3)

 

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

 

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

 

 

We begin the episode with the Biden administration’s options for responding to continued Russian ransomware outrages. Dmitri Alperovitch reprises his advice in the Washington Post that Putin will only respond to strength and U.S. pressure. I agree but raise the question whether the U.S. has the tools to enforce another set of alleged red lines, given Putin’s enthusiasm and talent for crossing them. If jumping U.S. red lines were an Olympic sport, Russia would have retired the gold by now. Dmitri reminds us that Russian cooperation against cybercrime remains a mirage. He also urges that we keep the focus on ransomware and not the more recent attempt to hack the Republican National Committee.

The Biden White House has been busy this week, or at least Tim Wu has. When Wu took a White House job as “Special Assistant to the President for Technology and Competition Policy,” some might have wondered why he did it. Now, Gus Hurwitz tells us, it looks as though he was given carte blanche to turn his recent think tank paper into an Executive Order: Biden has targeted Big Tech in a sweeping new Executive Order cracking down on anti-competitive practices. It’s a kitchen sink full of proposals, Mark MacCarthy notes, most of them more focused on regulation than competition. That observation leads to a historical diversion on the way Brandeisian competition policy aimed at smaller competitors and ended by creating bigger regulatory agencies and bigger companies to match.

We had to cover Donald Trump’s class actions against Twitter, Facebook, and Google, but if the time we devoted to the lawsuits was proportionate to their prospects for success, we’d have stopped talking in the first five seconds.

Mark gives more time to a House Republican leadership plan to break up Big Tech and stop censorship. But the plan (or, to be fair, the sketch) is hardly a dramatic rebuke to Silicon Valley – and despite that isn’t likely to get far. Divisions in both parties’ House caucuses now seem likely to doom any legislative move against Big Tech in this Congress.

The most interesting tech and policy story of the week is the Didi IPO in the U.S., and the harsh reaction to it in Beijing. Dmitri tells us that the government has banned new distributions of Didi’s ride-sharing app and opened a variety of punitive regulatory investigations into the company. This has dropped Didi’s stock price, punishing the U.S. investors who likely pressed Didi to launch the IPO despite negative signals from Beijing.

Meanwhile, more trouble looms for the tech giant, as Senate conservatives object to Didi benefiting from U.S. investment and China makes clear that Didi will not be allowed to provide the data needed to comply with U.S. stock exchange rules.

Mark and Gus explain why 37 U.S. states are taking Google to court over its Play Store rules and why, paradoxically, Google’s light hand in the Play Store could expose it to more antitrust liability than Apple’s famously iron-fisted rule.

Dmitri notes the hand-wringing over the rise of autonomous drone weapons but dismisses the notion that there’s something uniquely new or bad about the weapons (we’ve had autonomous, or at least automatic, submarine weapons, he reminds us, since the invention of naval mines in the fourteenth century).

In quick hits, Gus and Dmitri offer dueling perspectives on the Pentagon’s proposal to cancel and subdivide the big DOD cloud contract.

Gus tells us about the other Fortnite lawsuit against Apple over its app policy; this one is in Australia and was recently revived.

As I suspected, Tucker Carlson has pretty much drained the drama from his tale of having his communications intercepted by NSA. Turns out he’s been seeking an interview with Putin. And no one should be surprised that the NSA might want to listen to Putin.

The Indian government is telling its courts that Twitter has lost its 230-style liability protection in that country. As a result, it looks as though Twitter is rushing to comply with Indian law requirements that it has blown off so far. Still, the best part of the story is Twitter’s appointment of a “grievance officer.” Really, what could be more Silicon Valley Woke? I predict it’s only a matter of months before the whole Valley fills with Chief Grievance Officers, after which the Biden administration will appoint one for the Executive Branch.

And, finally, I give the EU Parliament credit for doing the right thing in passing legislation that lets companies look for child abuse on their platforms. Readers may remember that the problem was EU privacy rules that threatened to end monitoring for abuse all around the world. To make sure we remembered that this is still the same feckless EU Parliament as always, the new authority was grudgingly adopted only after giving child abusers a six-month holiday from scrutiny. It was also limited to three years, after which the Parliament seems to think that efforts to stop the sexual abuse of children will no longer be needed.

And more!


Download the 370th Episode (mp3)

 

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

 

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.

 

Ransomware attacks have been soaring in frequency and severity, affecting companies, government agencies, and nonprofits and leading to larger and larger ransom demands as a condition for unlocking the victim’s information systems. On June 30, 2021, the New York State Department of Financial Services (NYDFS) issued guidance on how potential victims can minimize the risk of a successful ransomware attack. While the controls are officially characterized as guidance, NYDFS makes clear that it “expects regulated companies to implement” the preventative controls, in particular, “whenever possible.” Companies not regulated by NYDFS should also consider implementing the guidance, since they are just as susceptible to ransomware attacks and the NYDFS guidance may be considered by other regulators and courts as contributing to a general standard of reasonable security in the face of this growing cyber threat.

NYDFS reported that from January 2020 through May 2021, NYDFS-regulated companies reported 74 ransomware attacks ranging “from crippling days-long shutdowns to minor disruption from temporary loss of a few computers.” In addition, NYDFS reported a “growing number of third-party Cybersecurity Events – where ransomware attacks against a critical vendor disrupt[ed] the operations of a regulated company.”

NYDFS’s guidance highlights nine controls to prevent or respond to ransomware attacks:

  • “Email Filtering and Anti-Phishing Training”
    • Companies should provide their workforce with “recurrent phishing training, including how to spot, avoid, and report phishing attempts.” They “should also conduct periodic phishing exercises and test whether employees will click on attachments and embedded links in fake emails, and remedial training for employees as necessary.” Lastly, companies should ensure that emails are “filtered to block spam and malicious attachments/links from reaching users.”
  • “Vulnerability and Patch Management”
    • Companies should implement “a documented program to identify, assess, track, and remediate vulnerabilities on all enterprise assets within their infrastructure.” This “program should include periodic penetration testing.” In addition, companies should ensure that “[v]ulnerability management include[s] requirements for timely application of security patches and updates” and “[w]herever possible … automatic updates” should be enabled.
  • “Multi-Factor Authentication (‘MFA’)”
    • The guidance reminds regulated companies that “MFA for remote access to the network and all externally exposed enterprise and third-party applications is required by” the NYDFS Cybersecurity Regulation. The ransomware guidance recommends that companies expand the use of MFA to “[a]ll logins to privileged accounts, whether remote or internal.”
  • “Disable RDP Access”
    • Remote Desktop Protocol (RDP) access should be disabled whenever possible. However, if “RDP access is deemed necessary, then access should be restricted to only approved (whitelisted) originating sources and [companies should] require MFA as well as strong passwords.”
  • “Password Management”
    • “Regulated companies should ensure that strong, unique passwords are used.” In particular, “passwords of at least 16 characters” should be used and “commonly used passwords” should be banned. Larger organizations should “consider a password vaulting PAM (privileged access management) solution” that would require “employees to request and check out passwords.” Finally, companies should disable “password caching” wherever possible. (A brief illustrative sketch of the password-length and banned-password parameters appears after this list.)
  • “Privileged Access Management”
    • Companies should implement “the principle of least privileged access – each user or service account should be given the minimum level of access necessary to perform the job.” In addition, they “should universally require MFA and strong passwords” for privileged accounts and “maintain and periodically audit an inventory of all privileged accounts. Privileged accounts should be used only for tasks requiring elevated privileges, and administrators should have a second non-privileged account for all other tasks such as logging into their workstation, email, drafting documents, etc.”
  • “Monitoring and Response”
    • “Regulated companies must have a way to monitor their systems for intruders and respond to alerts of suspicious activity.” As part of such efforts, companies should “implement an Endpoint Detection and Response (‘EDR’) solution, which monitors for anomalous activity. … Companies with larger and more complex networks should also have lateral movement detection and a Security Information and Event Management (SIEM) solution that centralizes logging and security event alerting.”
  • “Tested and Segregated Backups”
    • “Regulated companies should maintain comprehensive, segregated backups that will allow recovery in the event of a ransomware attack.” In addition, “at least one set of backups should be segregated from the network and offline” to ensure backups are not compromised by the attack. Finally, companies should “periodically test backups by actually restoring critical systems from backups” so that backups “actually work when needed.”
  • “Incident Response Plan”
    • Companies should implement an “incident response plan that explicitly addresses ransomware attacks,” and regularly test the plan, with involvement of senior leadership.

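To make the password parameters above concrete, here is a minimal sketch, assuming a hypothetical banned-password list and function name of my own invention; it reflects only the two stated parameters (a 16-character minimum and a ban on commonly used passwords) and is not an NYDFS-prescribed implementation.

```python
# Illustrative sketch only, not an NYDFS-prescribed control: a minimal check of the
# two password parameters quoted above (a 16-character minimum and a ban on
# commonly used passwords). The banned-password list and function name are
# hypothetical; a real deployment would use a full breached-password corpus.

MIN_LENGTH = 16  # "passwords of at least 16 characters"

COMMONLY_USED = {"password", "123456", "qwerty", "letmein", "welcome1"}  # tiny stand-in list

def password_meets_guidance(password: str) -> bool:
    """Return True only if the password satisfies both stated parameters."""
    if len(password) < MIN_LENGTH:
        return False
    if password.lower() in COMMONLY_USED:
        return False
    return True

if __name__ == "__main__":
    for candidate in ("summer2021", "correct-horse-battery-staple"):
        print(candidate, password_meets_guidance(candidate))
```
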
NYDFS’ guidance also recommends against the payment of ransoms to attackers. NYDFS’ position on this issue is in line with that of the FBI, which also recommends against the payment of ransoms. Companies should also closely consider the advisories of the Treasury Department’s  Office of Foreign Assets Control (OFAC) and Financial Crimes Enforcement Network (FinCEN) on the sanctions and anti-money laundering (AML) risks of making or facilitating ransomware payments, as we have previously addressed.

Finally, NYDFS’ guidance advises that “any successful deployment of ransomware on a [regulated company’s] internal network should be reported to DFS ‘as promptly as possible and within 72 hours at the latest.'” It also recommends that “any intrusion where hackers gain access to privileged accounts” should be reported.

On July 7, 2021, Gov. Jared Polis signed into law the Colorado Privacy Act (CPA), which will go into effect on July 1, 2023. Like California’s and Virginia’s data privacy laws, the CPA aims to provide consumers with greater control over their data and enhanced transparency with respect to how their data is used. However, businesses should pay close attention to the unique nuances of the CPA, which are likely to complicate compliance strategies.

Scope

The CPA applies to data “controllers” that “conduct[] business in Colorado or produce[] or deliver[] commercial products or services that are intentionally targeted to residents of Colorado” and that meet either of the following thresholds (a simple illustrative sketch of this test follows the list):

  • “Control[] or process[] the personal data of [100,000 Colorado residents] or more during a calendar year,” or
  • “Derive[] revenue or receive[] a discount on the price of goods or services from the sale of personal data and process[] or control[] the personal data of [25,000 Colorado residents] or more.”
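
As a rough illustration of how those two thresholds interact, here is a minimal sketch; the function and parameter names are hypothetical, the sketch ignores the statute’s exemptions (listed below), and nothing here is legal advice.

```python
# Illustrative sketch only: the CPA's two applicability thresholds as quoted above.
# Function and parameter names are hypothetical; the statute's exemptions (listed
# later in this post) are ignored, and nothing here is legal advice.

def cpa_applies(
    targets_colorado: bool,                # conducts business in, or targets products/services to, Colorado
    residents_processed_per_year: int,     # Colorado residents whose personal data is controlled or processed
    revenue_or_discount_from_sales: bool,  # derives revenue or a discount from selling personal data
) -> bool:
    if not targets_colorado:
        return False
    # Prong 1: personal data of 100,000 or more Colorado residents in a calendar year.
    if residents_processed_per_year >= 100_000:
        return True
    # Prong 2: revenue or a discount from data sales, plus 25,000 or more residents.
    if revenue_or_discount_from_sales and residents_processed_per_year >= 25_000:
        return True
    return False

print(cpa_applies(True, 30_000, True))   # True, via the second prong
print(cpa_applies(True, 30_000, False))  # False
```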

“Controller” is defined as “a person that, alone or jointly with others, determines the purposes for and means of processing personal data.”

The CPA covers “consumers” who are defined as “Colorado resident[s] acting only in an individual or household context.” Importantly, it does not cover individuals “acting in a commercial or employment context, as a job applicant or as a beneficiary of someone acting in an employment context.”

The CPA applies to “personal data” which is defined as “information that is linked or reasonably linkable to an identified or identifiable individual” and “[d]oes not include deidentified data or publicly available information.”

The CPA does not apply to:

  • Certain healthcare related information, including information related to HIPAA compliance;
  • “Activit[ies] involving the collection, maintenance, disclosure, sale, communication, or use of any personal data bearing on a consumer’s creditworthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living by” (1) consumer reporting agencies, (2) furnishers of information for use in a consumer report, and (3) users of a consumer report;
  • Personal data in connection with certain state and federal laws, including:
    • The Colorado Health Benefit Exchange Act;
    • The federal Gramm-Leach-Bliley Act;
    • The federal Driver’s Privacy Protection Act of 1994;
    • The federal Children’s Online Privacy Protection Act of 1998; or
    • The federal Family Educational Rights and Privacy Act of 1974.
  • “Data maintained for employment records purposes;”
  • Air carriers;
  • A national securities association registered pursuant to the Securities Exchange Act of 1934;
  • Financial institutions covered under the federal Gramm-Leach-Bliley Act and implementing regulations;
  • “Customer data maintained by” public utilities or state body “if the data are not collected, maintained, disclosed, sold, communicated, or used except as authorized by state and federal law;” and
  • “Data maintained by” state and municipal government entities (including state institutions of higher education) “if the data is collected, maintained, disclosed, communicated, and used as authorized by state and federal law for noncommercial purposes.”

Rights and Requirements

The CPA provides consumers with the following rights:

  • “[T]he right to opt out of the processing of personal data concerning the consumer for the purposes of” (1) “targeted advertising,” (2) “sale of personal data,” and (3) “profiling in furtherance of decisions that produce legal or similarly significant effects concerning a consumer.”
  • “[T]he right to confirm whether a controller is processing personal data concerning the consumer and to access the consumer’s personal data.”
  • “[T]he right to obtain [their] personal data in a portable and, to the extent technically feasible, readily usable format that allows the consumer to transmit the data to another entity without hindrance.”
  • “[T]he right to correct inaccuracies in the consumer’s personal data, taking into account the nature of the personal data and the purposes of processing the consumer’s personal data.”
  • “[T]he right to delete personal data concerning the consumer.”

It bears emphasizing that the CPA’s right to deletion, like Virginia’s, is broader than California’s in that it applies to any personal data concerning the consumer, not just personal data collected from the consumer herself.

The CPA prohibits a controller from “increas[ing] the cost of, or decreas[ing] the availability of, [a] product or service” due solely to a consumer’s “exercise of a right” where such action by the controller is “unrelated to feasibility or the value of a service.”

The CPA requires a controller to “inform a consumer of any action taken on a request … without undue delay and, in any event, within [45] days after receipt of the request.” The CPA permits a controller to extend the 45-day period by an additional 45 days “where reasonably necessary, taking into account the complexity and number of the requests.” A controller exercising the additional 45-day period must inform the consumer of the extension within the initial 45-day period, together with the reasons for the extension.

A controller is not required to comply with a consumer’s request “if the controller is unable to authenticate the request using commercially reasonable efforts, in which case the controller may request the provision of additional information reasonably necessary to authenticate the request.”

If a controller does not take action in response to a consumer request, the controller must “inform the consumer, without undue delay and, at the latest within [45] days after receipt of the request, of the reasons for not taking action and instructions for how to appeal the decision with the controller.”

The CPA requires controllers to “establish an internal process whereby consumers may appeal a refusal to take action on a request.” Controllers must respond to an appeal within 45 days of receipt with the ability to extend for an additional 60 days “taking into account the complexity and number of requests serving as the basis for the appeal.” A controller must inform a consumer of the extension within 45 days of receiving the appeal “together with the reasons for the delay.” A controller is also required to inform consumers of their right to contact the Colorado Attorney General regarding the result of their appeal.
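
Since the response and appeal windows above are easy to misread, here is a small date-arithmetic sketch of the 45-day response deadline (with its optional 45-day extension) and the 45-day appeal deadline (with its optional 60-day extension); the function names are hypothetical, and how the statute counts days in edge cases is a legal question this toy example does not address.

```python
# Illustrative sketch only: the CPA response and appeal windows described above.
# Function names are hypothetical, and how "days" are counted in edge cases is a
# legal question this toy example does not resolve.

from datetime import date, timedelta
from typing import Tuple

def request_deadlines(received: date) -> Tuple[date, date]:
    """Initial 45-day response deadline, plus the outer limit with a 45-day extension."""
    initial = received + timedelta(days=45)
    extended = initial + timedelta(days=45)  # extension must be invoked within the initial 45 days
    return initial, extended

def appeal_deadlines(appeal_received: date) -> Tuple[date, date]:
    """Initial 45-day appeal-response deadline, plus the outer limit with a 60-day extension."""
    initial = appeal_received + timedelta(days=45)
    extended = initial + timedelta(days=60)
    return initial, extended

if __name__ == "__main__":
    print(request_deadlines(date(2023, 7, 1)))  # initial 2023-08-15, extended 2023-09-29
    print(appeal_deadlines(date(2023, 7, 1)))   # initial 2023-08-15, extended 2023-10-14
```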

The CPA requires controllers to adhere to the following requirements:

  • “[P]rovide consumers with a reasonably accessible, clear, and meaningful privacy notice” including:
    • “The categories of personal data collected or processed by the controller or a processor[1];”
    • “The purposes for which the categories of personal data are processed;”
    • “How and where consumers may exercise their rights [under the CPA], including the controller’s contact information and how a consumer may appeal a controller’s action with regard to the consumer’s request;”
    • “The categories of personal data that the controller shares with third parties, if any;”
    • “The categories of third parties, if any, with whom the controller shares personal data.”
  • “[C]learly and conspicuously disclose the sale [of personal data to third parties] or processing [of personal data for targeted advertising], as well as the manner in which a consumer may exercise the right to opt out of the sale or processing.”
  • “[S]pecify the express purposes for which personal data are collected and processed.”
  • Ensure “collection of personal data [is] adequate, relevant, and limited to what is reasonably necessary in relation to the specified purposes for which the data are processed.”
  • “[N]ot process personal data for purposes that are not reasonably necessary to or compatible with the specified purposes for which the personal data are processed, unless the controller first obtains the consumer’s consent.”
  • “[T]ake reasonable measures to secure personal data during both storage and use from unauthorized acquisition.” Such measures “must be appropriate to the volume, scope, and nature of the personal data processed and the nature of the business.”
  • “[N]ot process personal data in violation of state or federal laws that prohibit unlawful discrimination against consumers.”
  • “[N]ot process a consumer’s sensitive data without first obtaining the consumer’s consent or, in the case of the processing of personal data concerning a known child, without first obtaining consent from the child’s parent or guardian.” “Sensitive data” is defined as (1) “[p]ersonal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status,” (2) “[g]enetic or biometric data that may be processed for the purpose of uniquely identifying an individual,” or (3) “[p]ersonal data from a known child.” “‘Child’ means an individual under 13 years of age.”

In addition, the CPA prohibits controllers from “conduct[ing] processing that presents a heightened risk of harm to a consumer without conducting and documenting a data protection assessment of” such data processing activities. The data protection assessment requirement only applies to “personal data acquired on or after” the CPA’s effective date and to processing activities created or generated after July 1, 2023. The requirement is not retroactive. “Processing activities that present a heightened risk of harm to a consumer” include:

  • “Processing personal data for purposes of targeted advertising or for profiling if that profiling presents a reasonably foreseeable risk of: (I) [u]nfair or deceptive treatment of, or unlawful disparate impact on, consumers; (II) [f]inancial or physical injury to consumers; (III) [a] physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers if the intrusion would be offensive to a reasonable person; or (IV) [o]ther substantial injury to consumers;”

  • “Selling personal data”; and
  • “Processing sensitive data.”

As part of the data protection assessment, controllers must “identify and weigh the benefits that may flow, directly and indirectly, from the processing to the controller, the consumer, and other stakeholders, and the public against potential risks to the rights of the consumer associated with the processing, as mitigated by the safeguards that the controller can employ to reduce the risks.” Controllers are directed to “factor into th[e] assessment the use of de-identified data and the reasonable expectations of consumers, as well as the context of processing and the relationship between the controller and the consumer whose personal data will be processed.”

Data processing assessments must be made available to the Colorado Attorney General upon request.

Enforcement and Penalties

The Colorado Attorney General and Colorado District Attorneys are granted exclusive authority to enforce the CPA. Violations of the CPA will constitute a deceptive trade practice carrying penalties of up to $2,000 per violation, but not more than $500,000 per series of violations. In addition to seeking monetary penalties, the Colorado Attorney General and Colorado District Attorneys are able to seek injunctive relief to enjoin violations of the CPA. Importantly, the CPA explicitly excludes a private right of action.

Until January 1, 2025, controllers will receive an opportunity to cure violations within 60 days before facing an enforcement action.

Rulemaking

The Colorado Attorney General is also authorized to promulgate rules under the CPA. By July 1, 2023, the Attorney General must promulgate rules for universal opt-out mechanisms related to the processing of personal data for targeted advertising or the sale of personal data. By January 1, 2025, the Attorney General may adopt additional rules “govern[ing] the process of issuing opinion letters and interpretive guidance to develop an operational framework for business that includes a good faith reliance defense of an action that may otherwise constitute a violation of [the CPA].” Such rules must become effective by July 1, 2025.

Conclusion

While July 1, 2023 may seem far away, businesses should begin familiarizing themselves with the CPA and address necessary compliance measures at the same time they institute compliance measures for the Virginia Consumer Data Protection Act and the California Privacy Rights Act, which take effect January 1, 2023.

 

[1] Processor is defined as “a person that processes personal data on behalf of a controller.”

We begin the episode with a review of the massive Kaseya ransomware attack.

Dave Aitel digs into the technical aspects, while Paul Rosenzweig and Matthew Heiman explore the policy and political implications. But either way, the news is bad.

Then we come to the Florida ‘deplatforming’ law, which a Clinton appointee dispatched in a cursory opinion last week. I’ve been in a small minority who thinks the law, far from being a joke, is likely to survive (at least in part) if it reaches the Supreme Court. Paul challenges me to put my money where my mouth is. Details to be worked out, but if a portion of the law survives in the top court, Paul will be sending a thousand bucks to a Trumpista nonprofit. If not, I’ll likely be sending my money to the ACLU.

Surprisingly, our commentators mostly agree that both NSA and Tucker Carlson could be telling the truth, despite the insistence of their partisans that the other side must be lying. NSA gets unaccustomed praise for its … wait for it … rapid and PR-savvy response. That’s got to be a first.

Paul and I conclude that Maine, having passed in haste the strongest state facial recognition ban yet, will likely find itself repenting at leisure.

Matthew decodes Margrethe Vestager’s warning to Apple against using privacy and security to limit competition.

And I mock Apple for claiming to protect privacy while making employees wear body cams to preserve the element of surprise at the next Apple product unveiling. Not to mention the 2-billion-person asterisk attached to Apple’s commitment to privacy.

Dave praises NSA for its stewardship of a popular open source reverse engineering tool.

And everyone has a view about cops using YouTube’s crappy AI takedown engine to keep people from posting videos of their conversations with cops.

And more!


Download the 369th Episode (mp3)

 

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

This episode offers an economical overview of the six antitrust reform bills reported out of the House Judiciary Committee last week. Michael Weiner and Mark MacCarthy give us the top line for all six (though only four would make substantial new policy). We then turn quickly to the odd-couple alliances supporting and opposing the bills, including my brief cameo appearance, in Rep. Jim Jordan’s opposition, on the gratifying ground (ok, among others) that Microsoft had never explained its suppression of my recent LinkedIn post. On the whole, I think Rep. Jordan is right; there’s very little in these bills that will encourage a diversity of viewpoints on social media or among its “trust and safety” bureaucrats.

Nick Weaver trashes the FBI for its prosecution of Anming Hu. I’m more sympathetic, but neither of us thinks this will end well for the Bureau or the China Initiative.

Adam Candeub makes his second appearance and does a fine job unpacking three recent decisions on the scope of Section 230. The short version: Facebook only partly beat the rap for sex trafficking in the Texas Supreme Court; Snapchat got its head handed to it in the speed filter case; and all the Socials won but faced persuasive dissents in a case over assistance to terrorist groups.

The long version: Silicon Valley has sold the courts a bill of goods on Section 230 for reasons that sounded good when the Internet was shiny and democratic and new. Now that disillusion has set in, the sweeping subsidy conferred by the courts is looking a lot less plausible. The wheels aren’t coming off Section 230 yet, but the paint is peeling, and Big Tech’s failure to get their reading of the law blessed by the Supreme Court ten years ago is going to cost them – mainly because their reading is inconsistent with some basic rules of statutory interpretation.

Nick and I engage on the torture indictments of executives who sold internet wiretapping capabilities to the Qaddafi regime.

Mark is unable to hose down my rant over Canada’s bone-stupid effort to impose Canadian content quotas on the internet and to write an online hate speech law of monumental vagueness.

And in closing, Nick and I bid an appropriately raucous and conflicted adieu to the Hunter Thompson of Cybersecurity: John McAfee.

And More!


Download the 368th Episode (mp3)

 

You can subscribe to The Cyberlaw Podcast using iTunes, Google Play, Spotify, Pocket Casts, or our RSS feed. As always, The Cyberlaw Podcast is open to feedback. Be sure to engage with @stewartbaker on Twitter. Send your questions, comments, and suggestions for topics or interviewees to CyberlawPodcast@steptoe.com. Remember: If your suggested guest appears on the show, we will send you a highly coveted Cyberlaw Podcast mug!

 

The views expressed in this podcast are those of the speakers and do not reflect the opinions of their institutions, clients, friends, families, or pets.