California Edges Closer to Regulating Automated Decision-Making Technology Using Personal Information
The board of the California Privacy Protection Agency (the Agency) recently met to discuss the Agency’s draft regulations on automated decision-making technology (ADMT), which will implement provisions of the California Privacy Rights Act amendments to the California Consumer Privacy Act (CCPA). The meeting marked an important step toward adoption of the regulations (Draft ADMT Rules); the next step is the formal rulemaking process, which is expected to commence in early 2024.
The ADMT regulations will place significant limitations on the use of personal information of California residents (Consumers) for automated decision-making by businesses subject to the CCPA (Businesses). “Consumers” in this context include not only individual customers of a Business, but also a Business’ employees, job applicants, and business-to-business contacts. As currently drafted, the rules would, in certain contexts: (1) require Businesses to notify Consumers prior to applying ADMT to their personal information; (2) allow Consumers to opt out of certain uses of ADMT; and (3) allow Consumers to access information about the Business’ use of ADMT. Businesses that use or anticipate using personal information of California residents in ADMT should consider how they might be able to influence the details of the final CCPA ADMT regulations during the formal rulemaking process in the coming year.
The Draft ADMT Rules define ADMT as “any system, software, or process — including one derived from machine learning — that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.” Under this definition, the CCPA regulations would have a markedly broader application than, for example, automated decision-making rules in the European Union, which apply solely to decisions made using automated processing without human intervention. In the Agency board’s recent meeting, one board member raised concerns about the breadth of this definition, observing that it could encompass nearly any piece of software.
ADMT under the Draft ADMT Rules encompasses, but is not limited to, “profiling,” which the CCPA defines as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.” As with the definition of ADMT, concerns were raised at the December 8 meeting about the breadth of activities that could be considered “profiling” under the Draft ADMT Rules.
As noted, the Draft ADMT Rules would require a Business to provide Consumers with specific notice before it uses their personal information in ADMT in any of three circumstances: when the Business uses ADMT to (1) make decisions producing “legal or similarly significant effects” on a Consumer; (2) profile a Consumer who is acting in their capacity as an employee, independent contractor, job applicant, or student; or (3) profile a Consumer in a publicly accessible place.
The Draft ADMT Rules clarify that a decision producing “legal or similarly significant effects” is one “that results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods and services.” The pre-use notice would need to explain how the Business plans to use ADMT, how the Consumer may opt out of such uses, and how the Consumer may access additional information about the Business’ use of ADMT. Importantly, a Business would not need to include a description of opt-out rights if its use of ADMT falls entirely within one or more exemptions, as discussed further below.
With respect to notifying employees, independent contractors, job applicants, and students of the use of ADMT for profiling, the Draft ADMT Rules explain that profiling in this context would include the use of “keystroke loggers, productivity or attention monitors, video or audio recording or live-streaming, facial- or speech-recognition or -detection, automated emotion assessment, location trackers, speed trackers, and web-browsing, mobile-application, or social-media monitoring tools” that would capture such individuals’ personal information. At the Agency’s recent board meeting, one member raised concerns about considering all of these activities to be “profiling,” noting that this obligation could require prior notice of typical safety and productivity activities, such as monitoring transportation workers for alertness, and suggested that a “reasonable expectation” threshold apply to notice and opt-out requirements in the profiling context.
Regarding profiling in a publicly accessible place, of which prior notice also would be required, the Draft ADMT Rules provide that this would include, for example, “using wi-fi or Bluetooth tracking, radio frequency identification, drones, video or audio recording or live-streaming, facial- or speech-recognition or -detection, automated emotion assessment, geofencing, location trackers, or license-plate recognition.” A “publicly accessible place” not only would mean public spaces such as parks or sidewalks, but also privately owned locations such as shopping malls, movie theaters, stadiums, and hospitals.
Under the Draft ADMT Rules, Businesses would be required to provide Consumers a mechanism to opt out of the Business’ use of ADMT in the same three situations where pre-use notice is required, along with one or more of three possible additional circumstances (presented for the Agency board’s consideration): (1) where ADMT is used to profile Consumers for behavioral advertising (not limited to “cross-context behavioral advertising,” which is already subject to opt-out requirements); (2) where ADMT is used to profile a Consumer whom the Business has actual knowledge is under the age of 16; or (3) where a Consumer’s personal information is used to train ADMT. At the Agency board’s recent meeting, the impact on employee data and an employee’s right to opt out of the use of ADMT was a point of contention.
Under these circumstances, Businesses generally would be required to provide Consumers with two or more methods to opt out of the use of ADMT, at least one of which must reflect how the Business generally interacts with Consumers (for example, an online-only Business would be required to accept opt-out requests through an interactive form linked in its pre-use notice). Except where the opt-out request is for behavioral advertising, a Business would be permitted to require verification if it determines that Consumers would be negatively impacted by honoring fraudulent opt-out requests. If a Business believes a Consumer’s request to opt out is fraudulent, it would be permitted to deny the request.
The timing of a Consumer’s opt-out request would impact whether a Business may begin processing the Consumer’s data at all. If a Consumer submits an opt-out request before the Business has begun using their personal information for ADMT, the Business would not be allowed to begin processing that information using ADMT. If the Consumer did not opt out via the pre-use notice and the Business has already begun using their data for ADMT, the Business would be required to (1) stop processing the Consumer’s personal information within 15 business days and not retain the Consumer’s personal information and (2) communicate the Consumer’s opt-out to relevant service providers and third parties to effectuate the opt-out. After a Consumer has opted out, a Business would be required to wait at least one year before asking that Consumer to consent to the use of ADMT again.
A Business would be exempted from the opt-out requirement when its use of ADMT complies with Section 7002 of the CCPA regulations (Restrictions on the Collection and Use of Personal Information) and is necessary to achieve, and is used solely for, (1) preventing and investigating security incidents; (2) resisting malicious, deceptive, fraudulent, or illegal actions; (3) protecting the life and physical safety of Consumers; or (4) providing the good or service specifically requested by the Consumer, provided that the Business has no reasonable alternative means of providing the good or service.
Right to Access Information About ADMT
Under the Draft ADMT Rules, upon the Consumer’s request, a Business would be required to provide (1) a plain language explanation of the purpose for which the Business utilizes ADMT; (2) the output of the ADMT with respect to the Consumer; (3) how the Business used (or plans to use) the output to make a decision with respect to the Consumer; (4) how the ADMT worked regarding that Consumer; (5) how the Consumer can obtain the entire range of possible outputs;[1] (6) how the Consumer can exercise other CCPA rights; and (7) how the Consumer can submit a complaint to the Business regarding its use of ADMT.
If a Business has used ADMT to make a decision regarding a Consumer resulting in the denial of goods or services, it would be required to notify the Consumer (1) that the Business has made a decision about them; (2) that the Consumer has a right to access information about the Business’ use of ADMT; (3) how the Consumer can exercise their right to access additional information about the Business’ use of ADMT; and (4) that the Consumer can file a complaint regarding the Business’ use of ADMT with the Agency and the California Attorney General.
Takeaways for Businesses
The Draft ADMT Rules could have a substantial impact on Businesses processing personal information of California residents, including employees and Business contacts. The scope of activities encompassed by these rules would be much broader than those covered by many other consumer privacy laws, both within and outside the United States. For example, Businesses that monitor employee productivity via ADMT may be required to honor opt-out requests, potentially hamstringing a broad range of safety- and productivity-monitoring tools. And Businesses employing behavioral advertising — even within their own websites — would need to navigate a special set of rules while already being subject to rules regarding opt-outs for sharing personal information for cross-context behavioral advertising.
As noted above, the Draft ADMT Rules will need to undergo a formal rulemaking process prior to finalization and adoption. The Agency board directed the regulatory drafting team to consult with individual board members and present a new draft at a future board meeting. The board may then agree to submit the draft for public comment, after which yet another draft may be published reflecting consideration of the comments received. When the board agrees on a final version of the regulations, they will be submitted to the California Office of Administrative Law for review and will not be formally adopted absent approval by that Office.
Organizations that have an interest in the outcome of the ADMT regulations may wish to consider preparing comments on the proposed version of the regulations when issued for public comment. Organizations with questions about the Draft ADMT Rules, or about the CCPA more generally, should feel free to contact any of the authors of this Advisory or their usual Arnold & Porter contact. The firm’s Privacy, Cybersecurity & Data Strategy team would be pleased to consult about the submission of comments, as well as to assist with any questions about privacy compliance and enforcement more broadly.
* Claire Fahlman contributed to this Advisory. Claire is a graduate of Georgetown University Law Center and is employed at Arnold & Porter's New York office. Claire is not admitted to the practice of law in New York.
© Arnold & Porter Kaye Scholer LLP 2024 All Rights Reserved. This Advisory is intended to be a general summary of the law and does not constitute legal advice. You should consult with counsel to determine applicable legal requirements in a specific fact situation.
[1] In its draft risk assessment regulations, the Agency indicated that outputs may include text, images, audio, or video that makes predictions, recommendations, or decisions that influence physical or virtual environments. For example, ADMT may produce an output consisting of a spreadsheet analyzing employee productivity metrics that a company could use to determine compensation for its employees.