California’s Proposed Cybersecurity Audit and Risk Assessment Regulations Could Create Significant Compliance Burdens for Companies
On September 8, the board of the California Privacy Protection Agency (the Agency) met and provided insights on the draft regulations the Agency is formulating on cybersecurity audits and risk assessments for businesses subject to the California Consumer Privacy Act (CCPA). The Agency is required to adopt such regulations pursuant to the CCPA amendments enacted under the California Privacy Rights Act of 2020. As stated in the statute, those regulations must require “businesses whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security” to (1) perform annual cybersecurity audits and (2) submit to the Agency “on a regular basis” risk assessments of the business’s processing of personal information.
At the September 8 meeting, the Agency board considered the first drafts of the Draft Cybersecurity Audit Regulations and the Draft Risk Assessment Regulations. Although the draft regulations resemble similar requirements in other privacy laws, such as the European Union’s General Data Protection Regulation, they are particularly detailed and prescriptive. If adopted as proposed, the regulations may require many CCPA-regulated businesses to invest significantly in new data privacy and security procedures, even businesses that already conduct cybersecurity audits and risk assessments under other privacy regimes.
When Would Businesses Need To Perform Cybersecurity Audits or Risk Assessments?
The CCPA does not establish criteria for what constitutes a “significant risk” requiring either a cybersecurity audit or risk assessment. The Draft Cybersecurity Audit Regulations and the Draft Risk Assessment Regulations propose such criteria, but the criteria under the two sets of regulations would not be the same, so it is possible that a business might be required to conduct a cybersecurity audit but not a risk assessment, or vice versa.
Under the Draft Cybersecurity Audit Regulations, all businesses that derive at least 50% of their annual revenue from selling or sharing personal information would be required to conduct cybersecurity audits. Additionally, cybersecurity audits would be required for businesses meeting certain (to-be-determined) thresholds. The Agency is considering three potential thresholds: (1) US$25 million in annual revenue plus either a to-be-determined volume of personal information processed or the processing of personal information of persons under 16; (2) a to-be-determined gross revenue threshold; or (3) a to-be-determined threshold number of employees. The Agency board appeared to be leaning toward the first option.
The criteria for processing activities posing a "significant risk to consumers' privacy" (and thus requiring a risk assessment) focus on the processing activities or data sets themselves, rather than incorporating a revenue or size threshold into the scoping exercise. Briefly, they include: (1) selling or sharing personal information for cross-context behavioral advertising; (2) processing sensitive personal information; (3) using automated decision-making technology; (4) processing personal information of minors under age 16; (5) monitoring employees, independent contractors, job applicants, or students; (6) using technology to monitor consumers in public; and (7) training artificial intelligence or automated decision-making (ADM) technology.
What Would Be the Scope of Cybersecurity Audits?
The scope of the audit requirement is not yet determined but could (1) entail evaluations of how the cybersecurity program protects against specific negative impacts on consumers' security and the various forms of harm that could result from those impacts or (2) entail a more general evaluation of any risks that have "materially affected or are reasonably likely to materially affect consumers." At the September 8 meeting, some Agency board members suggested combining the two options by relying on a general materiality trigger, with the specific negative impacts under the first option listed as examples of materiality. The drafters agreed to revise the relevant rule and report back to the Agency board.
With respect to audit procedures, the Draft Cybersecurity Audit Regulations go into great detail on what the audit would need to document, including the business’s cybersecurity program and its implementation of 18 specific safeguards (e.g., encryption, access controls, and account management). Although a business would not need to implement all 18 safeguards, it would need to document why any omitted safeguards are not necessary and how other safeguards provide equivalent security. The audit documentation would also need to detail the effectiveness and weaknesses of the business’s cybersecurity program, whether the business has previously been required to notify a government authority in any jurisdiction of a cybersecurity breach, whether the business has ever notified consumers of a breach under California’s breach notification law (Cal. Civ. Code § 1798.82), and whether the business has otherwise experienced any personal-information security breaches subject to the CCPA.
How Would Cybersecurity Audits Have To Be Performed?
A cybersecurity audit may be performed by an internal or external auditor, provided the auditor can exercise independent judgment. If an internal auditor is used, the auditor (1) cannot have any role in developing or maintaining the cybersecurity program or in performing the activities subject to the audit and (2) must report directly to the business's board of directors or other governing body or, if there is no board, to the highest-ranking officer in the business who does not have direct responsibility for the business's cybersecurity program.
As proposed, businesses would have 24 months to complete their first audit and would then need to conduct audits annually. A written acknowledgment or certification signed by a member of the business's board or other governing body would have to be submitted to the Agency annually, showing either that the business complied with the cybersecurity audit regulations or that it did not fully comply.
What Would Be the Scope of Risk Assessments?
The Draft Risk Assessment Regulations are also highly detailed. They prescribe at least 13 areas of assessment and many sub-areas. For example, a business would need to evaluate: (1) consumers’ reasonable expectations of why and how their personal information is being processed; (2) the benefits of the types of processing being conducted to the business, the consumer, other stakeholders, and the public; and (3) the negative impacts to consumers’ privacy associated with the processing, including the sources of these negative impacts. At least 10 potential sources of negative impacts would have to be considered, including constitutional harms, discrimination harms, economic harms, and psychological harms. At the September 8 meeting, the Agency board proposed requiring that the assessment also detail whether the business financially profits from the processing activity or sells personal information.
Would Automated Decision-Making and Artificial Intelligence Be Subject To Special Requirements?
Yes, there would be additional requirements for businesses that either use ADM technology or process personal information to train AI or ADM systems. AI is broadly defined and includes technology designed to operate with varying levels of autonomy and generate outputs (such as predictions, recommendations, or decisions). This definition is supplemented with examples such as generative models and facial or voice detection and recognition. ADM technology uses computation to make or execute a decision or facilitate human decision-making. The draft definition specifically includes profiling as an example of such technology. At the September 8 meeting, Agency board members noted the broad scope of these definitions. The representative from the drafting team commented that these definitions are intentionally broad but suggested that the rules themselves would narrow their application.
Risk assessments of AI or ADM would need to include a wide array of additional elements. For ADM technology subject to the CCPA's opt-out rights, these include plain-language explanations of why the business is using the technology, the outputs of the technology, the logic of the technology, and how the business evaluates the technology for validity, reliability, and fairness (each of which has a specific definition). For training AI or ADM systems, if the business intends to make the technology available to other persons for their own use (e.g., a publicly available generative AI tool), the risk assessment would need to explain how the business plans to provide consumers whose personal information is used for training with an explanation of how the AI or ADM will be used. Similarly, if the business intends to offer the AI or ADM systems to another business (e.g., a facial-recognition security tool), the risk assessment would need to document how the business plans to provide the recipient business with the facts necessary for that business to conduct its own risk assessment.
What Would Be the Process for Documenting and Reporting Risk Assessments?
Risk assessments may need to be updated either once every three years or simply as necessary (a point still to be decided), and in any event whenever there is a material change in the business's personal information-processing activities. A change would require a new assessment if it diminishes the benefits of the processing activity, creates new negative impacts, increases the magnitude or likelihood of already-identified negative impacts, or diminishes the effectiveness of data protection safeguards.
The Draft Risk Assessment Regulations would require businesses to annually submit to the Agency (1) the business's risk assessments in abridged form and (2) a certification by a designated executive that the business has complied with the requirements of the risk assessment regulations. Businesses would also need to make full risk assessments available to the Agency or the California Attorney General upon request.
The draft regulations set forth highly detailed requirements that even sophisticated businesses may not have incorporated into their privacy and cybersecurity compliance programs. Given the status of the regulatory process, there is still an opportunity to influence the ultimate provisions of the regulations. The Agency’s drafting team will next come back to the Agency board with an updated draft after the public comment period. Then there will be another round of public comment and potentially another draft. The Agency board will need to approve the draft, after which the California Office of Administrative Law will need to approve the regulations before they go into effect.
Organizations that have questions about their obligations under the planned regulations, or under the CCPA more generally, may contact any of the authors of this Advisory or their usual Arnold & Porter contact. Our Privacy, Cybersecurity & Data Strategy team would be pleased to assist with any questions about privacy compliance and enforcement.
© Arnold & Porter Kaye Scholer LLP 2023 All Rights Reserved. This Advisory is intended to be a general summary of the law and does not constitute legal advice. You should consult with counsel to determine applicable legal requirements in a specific fact situation.