NIST has revised the draft cybersecurity framework that it released in August. What it published today is a "preliminary cybersecurity framework." After a comment period, a final framework will be released in February.
I've been very critical of the draft released in August. NIST clearly worked to address the criticisms.
The result is a mixed bag, but on balance the document is still a net loss for security.
What's improved? First, in an effort to introduce flexibility into the document, NIST deleted all the "should" language from the privacy standards.
Second, it added a paragraph that asserts the "flexibility" that organizations have to implement the privacy provisions:
Third, NIST responded to my concern that the "governance" section of the appendix would smuggle into the rules governing private companies all of the fair information practice principles, or FIPPs, that govern federal agencies. NIST narrowed the scope of the governance section by tying it to the actual PII being used for cybersecurity. See the bold language below.
That's a substantial improvement.
What's wrong with the new version? Well, the first change, dropping the "shoulds," is well-intended but largely cosmetic. In fact, it arguably makes the rules harsher, not more flexible. That's because, instead of telling companies what they "should" do to protect privacy, the appendix now simply commands them to do those things. You can see that in the example above. Also in this one:
(As an aside, note the other change in the new version, which is pretty clearly the result of privacy groups' comments. It tells companies to protect communications content, not just PII. But that change matters only if companies are sharing content that can't be traced to a person. So it seems to mean that companies sharing information about spam should minimize the amount of spam they quote when trying to tell other companies which messages to block. That's dumb. More broadly, why should such a mandate be added to a standard that insists it's about PII?)
That brings me to my biggest concern. Despite NIST's claim that it has left companies lots of flexibility, you can't really find flexibility in the language of the privacy appendix. So I continue to fear that the net result of the package will be to impose a "privacy tax" on cybersecurity, adding to the cost of security measures by tying those measures to expensive privacy obligations whose value is unproven. For example:
The new language is slightly less demanding, but it still calls on companies that share information about malware and intrusions to make determinations about which information is "necessary" to describe or mitigate the incident. If the company guesses wrong about a couple of bits of information, and someone later decides that those bits weren't strictly necessary to mitigate the incident, then the standard has been violated and liability is much more likely. At a minimum, lawyers have to review every category of data that is being shared and write rules for when it is necessary and when it isn't. It takes heroic ignorance to believe that a requirement like that won't reduce the sharing that's already occurring, even among private enterprises.
Finally, NIST took a further step that has heightened my concern that this appendix is going to impose the FIPPs on the entire US private sector. That's because the only "reference" standard offered by NIST to explain and implement the appendix is a document that is plainly written for government agencies trying to implement federal privacy standards. In the absence of any other reference, the pressure will be great to follow the government rules.
So, to return to the example above, suppose you're a company that wants to implement privacy-compliant information sharing. You consult the "reference" standard, and here's what you're told:
MINIMIZATION OF PERSONALLY IDENTIFIABLE INFORMATION
None of this is good for quick and easy cybersecurity information sharing. It seems to suggest that each sharing company has to evaluate its cybersecurity data, minimize (perhaps even anonymize) the data it keeps, and get rid of anything it isn't sure it needs. The data will have to be scrubbed for accuracy and completeness. To make that decision, the guidance creates a committee that includes not just the lawyers but top officials and a privacy officer, further clogging and bureaucratizing what should be an instantaneous exchange of threat data. All of this raises the cost of information sharing, and raising the cost of something is what you do only when you want less of it.