October 6, 2021

Opportunity to Influence Rulemaking Under the California Privacy Rights Act

Advisory

In a recently issued invitation for preliminary comments, the California Privacy Protection Agency (Agency) has solicited input on issues it should address in upcoming rulemaking pursuant to the California Privacy Rights Act of 2020 (CPRA). The Agency, which was established by the CPRA (adopted by state ballot initiative in November 2020), is tasked with implementing and enforcing the California Consumer Privacy Act (CCPA), as amended by the CPRA. The Agency’s invitation is an important opportunity for businesses regulated by the CCPA—which include, among others, any business with more than $25 million in annual gross revenue that collects personal information from California residents—as well as privacy advocates, to stake out a position on the regulatory framework that will govern interpretation and enforcement of the amended CCPA.

The Agency has suggested topics for comments that it considers particularly ripe for clarification in rulemaking. But it is inviting comments on any aspect of the CCPA/CPRA of significance to interested parties. Among other things, this is an opportunity to seek clarification of the Agency’s own authority as investigator and enforcer. The deadline to submit comments is November 8, 2021.

Topics Outlined in the Invitation

In its invitation, the Agency outlined eight general topic areas for possible comments. Within each area, the Agency posed one or more questions that commenters might wish to address, which generally track the CPRA provisions directing the Agency to undertake rulemaking on specific issues. The eight areas covered are:

  1. Processing that Presents a Significant Risk to Consumers’ Privacy or Security: Cybersecurity Audits and Risk Assessments Performed by Businesses
  2. Automated Decision-making
  3. Audits Performed by the Agency
  4. Consumers’ Rights to Delete, to Correct, and to Know
  5. Consumers’ Rights to Opt-Out of the Selling or Sharing of Their Personal Information
  6. Consumers’ Rights to Limit the Use and Disclosure of Sensitive Personal Information
  7. Information to be Provided in Response to a Consumer Request to Know
  8. Definitions and Categories

Key Questions

Although each of the topic areas merits and likely will receive commentary, the questions raised in certain of those areas stand out as ones that call for public input. How the Agency resolves these questions in regulations will critically affect what covered businesses need to do to steer clear of Agency enforcement actions in the years to come. Among these key questions are the following:

  • What constitutes a “significant risk” triggering cybersecurity audits and risk assessments? Under the CPRA, the Agency may require businesses to undertake cybersecurity audits and risk assessments only where “processing of consumers’ personal information presents significant risk to consumers’ privacy or security.” The Agency is seeking commentary on what constitutes a “significant risk” triggering these obligations. Although some businesses may already perform these activities in the ordinary course, a broad interpretation of “significant risk” could impose expensive audit requirements on businesses where the benefits of the audits could be marginal. Providing the Agency with real-world risk-experience examples and data on the costs and benefits of cybersecurity audits and assessments could help prevent the Agency from imposing unrealistic and/or unduly burdensome mandates.
  • When would the risks to the privacy of the consumer outweigh the benefits of a business processing the consumer’s personal information, such that such processing should be restricted or prohibited? This question stems from the CPRA’s statement that the Agency’s rules should require businesses to include in their risk assessments an analysis “identifying and weighing” the benefits of the processing to the business, the consumer, other stakeholders, and the public, against the potential risks to the rights of the consumer, “with the goal of restricting or prohibiting such processing if the risks to the consumer outweigh the benefits.” (Cal. Civ. Code § 1798.185(a)(19)(C).) The indication is that it is the obligation of each business to determine, based on its risk-benefit assessment, whether there should be any restriction on processing, but presumably the Agency could second-guess such an assessment. The CCPA, even as amended by the CPRA, does not restrict or prohibit processing of personal information; it grants consumers rights to do so in certain limited circumstances. By posing this question, is the Agency suggesting that it might, through its scrutiny of businesses’ risk assessments, effectively create restrictions and prohibitions on its own? This question calls for clarification and perhaps comments on the Agency’s authority under the CPRA.
  • What activities should be deemed to constitute “automated decision-making technology” and/or “profiling”? Many businesses use personal information to drive multiple types of internal and external decision-making. Whether such activities constitute “automated” decision-making and/or profiling will largely determine whether a business may need to honor access and opt-out requests. The Agency’s definition of these terms should be informed by industry understandings, which could be provided in comments from businesses as well as academics or others.
  • What information must businesses provide to consumers in response to access requests in order to provide “meaningful information about the logic” involved in the automated decision-making process? The CPRA and other privacy laws aim to provide consumers with more transparency about how their personal information is used in automated decision-making. Where should the line be drawn between what consumers need and deserve from a privacy perspective and the protection of what often may be trade secret information? Is the CPRA’s reference to “meaningful information” in this context directed at privacy protection or rather fairness in decision-making? Commenters may want to address that point.
  • What should be the scope of the Agency’s audit authority? The Agency’s solicitation of comment on its own audit authority provides an unusual opportunity for regulated businesses to provide information on what government auditing is reasonable and helpful, and when audits may be unjustifiably intrusive and burdensome. Once the Agency takes a position on this in proposed regulations, it is likely to adhere to it going forward.
  • Should the definition of “sensitive personal information” be updated or supplemented? The CPRA gives consumers the right to request that a business limit the use and disclosure of their “sensitive personal information,” but businesses need not honor such requests where the information is “collected or processed without the purpose of inferring characteristics about a consumer.” It could meaningfully assist businesses if the Agency were to clarify, among other things: (1) what constitutes “inferring characteristics;” (2) what types of information in “the contents of a consumer’s mail, email and text messages” (which is a category of “sensitive personal information” as defined in the CPRA) would be “sensitive” personal information; and (3) what it means to “analyze” a person’s sexual orientation such that personal information collected on sexual orientation would be “sensitive personal information.”
  • Should the examples of “personal information” be updated? One source of confusion in the CCPA is that the enumerated examples of types of information that may be identifiable are not always identifiable in practice and therefore may not be per se “personal information.” The Agency could clarify whether certain of the enumerated categories of information are inherently identifiable to a consumer and therefore constitute “personal information.”
  • Are updates to the definition of “deidentified” and/or “unique identifier” merited? Currently, it is unclear whether “deidentified” data is (or should be) limited to data that was previously identifiable, or whether deidentified data is any information that cannot be linked to an individual, regardless of its history. Similarly, there is good reason to update the examples and definitions of “unique identifiers” to exclude, for example, dynamic IP addresses and other temporary identifiers that lose their identification potential after a short period of time.

Again, there is a very short timeframe in which to submit comments: the deadline for submission is November 8, 2021. We are available to advise if you are interested in submitting comments.

© Arnold & Porter Kaye Scholer LLP 2021 All Rights Reserved. This Advisory is intended to be a general summary of the law and does not constitute legal advice. You should consult with counsel to determine applicable legal requirements in a specific fact situation.