In my first post about NIST’s draft cybersecurity framework I explained its basic problem as a spur to better security: It doesn’t actually require companies to do much to improve their network security.
My second post argued that the framework’s privacy appendix, under the guise of protecting cybersecurity, actually creates a tough new privacy requirement for industry by smuggling the Fair Information Practice Principles into the law. In doing so, it clearly goes beyond the scope of the cybersecurity executive order, which is focused on protecting critical infrastructure. When was the last time lost PII caused “catastrophic regional or national effects on public health or safety, economic security, or national security?”
This post explains why the privacy appendix is likely to make network security worse, not better. The reason is simply stated. If you want more of something, you don’t raise its cost. But by grafting strong privacy mandates onto its weak cybersecurity standards, the privacy appendix raises the cost of putting cybersecurity measures in place. It’s like a ship design that requires the builder to pay for the installation of barnacles before launch.
That disincentive will be easy to heed. Taken as a whole, the message of the framework is, “You don’t have to implement any particular cybersecurity measures, but if you do, you’d better implement a bunch of privacy measures along with them.” This tempts network professionals to do less security, thereby sparing themselves the hassles that the framework’s privacy appendix would impose.
There are a lot of examples. Let’s start with network audits and monitoring. These are absolutely essential cybersecurity tools in today’s environment. They give a detailed picture of everything – and everyone – operating on the network. But for that reason, the NIST privacy appendix treats them as suspect — measures to be strictly limited. They are to be used only if their effectiveness is regularly demonstrated and they are regularly scrubbed to bring their privacy impact to a minimum: “When performing monitoring that involves individuals or PII, organizations should regularly evaluate the effectiveness of their practices and tailor the scope to produce the least intrusive method of monitoring.” If I’m right about the legal effect of these standards, the failure to observe this rule will lead to negligence or regulatory liability. But a lawyer asked to avoid that liability will be appalled at the requirement to produce the “least intrusive method of monitoring.” Lawyers understand that, with hindsight, plaintiffs and regulators can often point to some method of monitoring that would have been less intrusive and that might have worked just as well. Avoiding liability under such a rule is more a matter of luck than planning.
Audits get the same suspect treatment under the appendix. Companies that record personal data as part of a network audit are told to consider “how such PII could be minimized while still implementing the cybersecurity activity effectively.” Again, it will always be possible after the fact to discover a way to trim a little more personal data from an audit. Lawyers can flyspeck the audit plan forever without eliminating the risk.
The privacy appendix also prescribes yet more privacy assessments for cybersecurity detection and filtering. Companies “should regularly review the scope of detection and filtering methods to prevent the collection or retention of PII that is not relevant to the cybersecurity event.” Instead of poring over logs, looking for intruders, cybersecurity professionals are to pore over them for personal data that “is not relevant.” In another liability magnet, companies are instructed to adopt policies “to ensure that any PII that is collected, used, disclosed, or retained is accurate and complete.” That language will give employees who violate network rules new ways to challenge disciplinary actions.
Even in the middle of responding to a breach, the NIST appendix expects security staff to prioritize privacy: “When considering methods of incident containment, organizations should assess the impact on individuals’ privacy and civil liberties,” and “when PII is used for recovery, an organization may need to consider how to minimize the use of PII to protect an individual’s privacy or civil liberties.”
Perhaps worst of all, the privacy appendix imposes a heavy new legal and practical burden on cybersecurity information-sharing. It calls on companies to scrub any forensic data they may collect before they share it with others: “When voluntarily sharing information about cybersecurity incidents, organizations should ensure that only PII that is relevant to the incidents is disclosed”; and “When performing forensics, organizations should only retain PII that is relevant to the investigation.” Today, companies quickly share information with each other about new threats, including “personal” data like the IP addresses or the email accounts that are spreading malware. They face no real risk of liability for such sharing, at least as long as they keep the government out of the sharing arrangement. Once the NIST privacy appendix takes effect, though, even such private cybersecurity sharing will slow to a crawl as lawyers try to anticipate whether every piece of data has been screened for PII and for relevance.
In short, under the NIST framework, pretty much every serious cybersecurity measure in use today will come with new limits and possibly new liability. This is especially troubling because the framework does not prescribe any particular security measures, which means that companies that want to escape the new liabilities can simply decide not to implement the security measures. Rather than deal with the barnacles, they can just scuttle the ship.
Let’s hope that NIST scuttles the privacy appendix instead.