Implementing the NIST Privacy Framework – Identify Function
The National Institute of Standards and Technology (NIST) Privacy Framework, published in January 2020, is quickly becoming the mainstream control set organizations align with when assessing their data privacy posture, developing readiness roadmaps, and maturing their privacy programs.
We have previously written about how the controls in the NIST Privacy Framework can be mapped to sovereign privacy laws such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). As part of our project work, we have mapped the NIST Privacy Framework to requirements in 23 different active and proposed privacy laws.
Perhaps more important for US organizations is the safe harbor trend we are seeing in proposed state privacy laws. For example, Ohio’s proposed privacy law includes a safe harbor clause for organizations that create, maintain, and demonstrate compliance with a written privacy program that reasonably conforms to the NIST Privacy Framework.
The framework consists of 18 categories and 100 subcategories within five core functions:
Identify
Govern
Control
Communicate
Protect
Given the complexity of the NIST Privacy Framework, we are publishing a series of articles on implementing it; each article includes privacy management activities to consider as a basis for aligning with the framework.
This first article focuses on the Identify function. NIST defines the Identify function as: “Develop the organizational understanding to manage privacy risk for individuals arising from data processing.”
The Identify function includes four categories: Inventory and Mapping; Business Environment; Risk Assessment; and Data Processing Ecosystem Risk Management. Each of these categories is further defined by a total of 21 subcategory controls, listed below in Table 1.
Inventory and Mapping (ID.IM-P): Data processing by systems, products, or services is understood and informs the management of privacy risk.
ID.IM-P1: Systems/products/services that process data are inventoried.
ID.IM-P2: Owners or operators (e.g., the organization or third parties such as service providers, partners, customers, and developers) and their roles with respect to the systems/products/services and components (e.g., internal or external) that process data are inventoried.
ID.IM-P3: Categories of individuals (e.g., customers, employees or prospective employees, consumers) whose data are being processed are inventoried.
ID.IM-P4: Data actions of the systems/products/services are inventoried.
ID.IM-P5: The purposes for the data actions are inventoried.
ID.IM-P6: Data elements within the data actions are inventoried.
ID.IM-P7: The data processing environment is identified (e.g., geographic location, internal, cloud, third parties).
ID.IM-P8: Data processing is mapped, illustrating the data actions and associated data elements for systems/products/services, including components; roles of the component owners/operators; and interactions of individuals or third parties with the systems/products/services.
Business Environment (ID.BE-P): The organization’s mission, objectives, stakeholders, and activities are understood and prioritized; this information is used to inform privacy roles, responsibilities, and risk management decisions.
ID.BE-P1: The organization’s role(s) in the data processing ecosystem are identified and communicated.
ID.BE-P2: Priorities for organizational mission, objectives, and activities are established and communicated.
ID.BE-P3: Systems/products/services that support organizational priorities are identified and key requirements communicated.
Risk Assessment (ID.RA-P): The organization understands the privacy risks to individuals and how such privacy risks may create follow-on impacts on organizational operations, including mission, functions, other risk management priorities (e.g., compliance, financial), reputation, workforce, and culture.
ID.RA-P1: Contextual factors related to the systems/products/services and the data actions are identified (e.g., individuals’ demographics and privacy interests or perceptions, data sensitivity and/or types, visibility of data processing to individuals and third parties).
ID.RA-P2: Data analytic inputs and outputs are identified and evaluated for bias.
ID.RA-P3: Potential problematic data actions and associated problems are identified.
ID.RA-P4: Problematic data actions, likelihoods, and impacts are used to determine and prioritize risk.
ID.RA-P5: Risk responses are identified, prioritized, and implemented.
Data Processing Ecosystem Risk Management (ID.DE-P): The organization’s priorities, constraints, risk tolerance, and assumptions are established and used to support risk decisions associated with managing privacy risk and third parties within the data processing ecosystem. The organization has established and implemented the processes to identify, assess, and manage privacy risks within the data processing ecosystem.
ID.DE-P1: Data processing ecosystem risk management policies, processes, and procedures are identified, established, assessed, managed, and agreed to by organizational stakeholders.
ID.DE-P2: Data processing ecosystem parties (e.g., service providers, customers, partners, product manufacturers, application developers) are identified, prioritized, and assessed using a privacy risk assessment process.
ID.DE-P3: Contracts with data processing ecosystem parties are used to implement appropriate measures designed to meet the objectives of an organization’s privacy program.
ID.DE-P4: Interoperability frameworks or similar multi-party approaches are used to manage data processing ecosystem privacy risks.
ID.DE-P5: Data processing ecosystem parties are routinely assessed using audits, test results, or other forms of evaluations to confirm they are meeting their contractual, interoperability framework, or other obligations.
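The Inventory and Mapping subcategories in Table 1 translate naturally into a structured inventory record. The sketch below is a minimal illustration only; the class, field names, and example values are our assumptions, not terms defined by NIST.

```python
from dataclasses import dataclass, field

# Illustrative inventory record covering ID.IM-P1 through ID.IM-P7.
# Field names and values are our own assumptions, not NIST-defined terms.
@dataclass
class DataInventoryRecord:
    system: str                       # ID.IM-P1: system/product/service that processes data
    owner: str                        # ID.IM-P2: owner/operator and role (internal or external)
    data_subjects: list = field(default_factory=list)   # ID.IM-P3: categories of individuals
    data_actions: list = field(default_factory=list)    # ID.IM-P4: e.g., collect, store, share
    purposes: list = field(default_factory=list)        # ID.IM-P5: purposes for the data actions
    data_elements: list = field(default_factory=list)   # ID.IM-P6: data elements processed
    environment: str = "internal"     # ID.IM-P7: internal, cloud, third party

# Hypothetical example entry
record = DataInventoryRecord(
    system="CRM",
    owner="Marketing (internal)",
    data_subjects=["customers", "prospects"],
    data_actions=["collect", "store", "share"],
    purposes=["customer relationship management"],
    data_elements=["name", "email", "purchase history"],
    environment="cloud",
)
```

A collection of such records also supports ID.IM-P8, since the data actions and elements per system are already captured in a form that can be rendered as a data map.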
How can we translate the NIST Privacy Controls into simple discovery questions?
Below are example questions we focus on when assessing an organization’s current privacy posture relative to the Identify function within the NIST Privacy Framework.
Does the organization maintain a data inventory of personal information that includes both processing activities and the assets that contain personal information?
Does the organization maintain a list of vendors that store or access personal data on behalf of the organization? If so, does the organization maintain formal mapping of the vendors and their roles with respect to the systems/products/services and components that process personal information?
How has the organization integrated privacy into its mission and critical business objectives? How has this been communicated to employees and customers?
Has the organization previously conducted a privacy risk assessment at the enterprise or product levels? Can you describe the assessment, findings, and outcomes?
Can you describe the third-party risk management process and how new vendors are evaluated and risks are identified, tracked, and reported?
What should we consider implementing to support alignment with the NIST Privacy Framework?
After assessing responses to our discovery questions, the following are examples of privacy management activities we help organizations implement to remediate gaps or demonstrate privacy readiness and maturity:
Creating a data inventory or data mapping of personal information
Creating data flows and evaluating data classification schemes
Implementing a process for documenting and assigning a legal basis for processing operations
Implementing privacy impact assessments into system, process, product life cycles
Evaluating processors and third-party vendors, including insourced and outsourced privacy risks and rules for international data transfers
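For the data mapping and international transfer activities above, a minimal machine-readable data-flow map can be sketched as a list of flows between systems. This is a hedged illustration; the schema, system names, and values are assumptions for the example, not a prescribed format.

```python
# Minimal data-flow map: each entry records one flow of personal data
# between systems, the data elements involved, and the documented legal
# basis. Schema and values are illustrative assumptions.
data_flows = [
    {
        "source": "web_storefront",
        "destination": "payment_processor",   # third-party vendor
        "data_elements": ["name", "card_number"],
        "legal_basis": "contract",
        "cross_border": False,
    },
    {
        "source": "CRM",
        "destination": "email_marketing_vendor",
        "data_elements": ["name", "email"],
        "legal_basis": "consent",
        "cross_border": True,                 # triggers transfer-rules review
    },
]

# Flag flows that need an international-transfer mechanism on record.
needs_transfer_review = [f for f in data_flows if f["cross_border"]]
print([f["destination"] for f in needs_transfer_review])  # → ['email_marketing_vendor']
```

Even a simple structure like this makes gaps visible: flows without a legal basis, or cross-border flows without a transfer mechanism, can be queried directly rather than hunted for in documents.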
The data privacy risk profile is different for every organization. It is important to evaluate privacy risks and gaps relative to an accepted framework such as the NIST Privacy Framework, and then prioritize which privacy management activities to implement.
This article is written by Reagan Bachman, David Manek, and Kenric Tom. We received permission from Ankura to republish it for the ADCG community.