In advance of the California Privacy Protection Agency’s (CPPA) December 8 Board meeting, the Agency has published new draft automated decisionmaking technology (ADMT) regulations, as well as revisions to draft regulations on risk assessments and cybersecurity audits previously discussed at the CPPA’s September Board meeting. Though these draft regulations have not yet entered the formal rulemaking process, they provide a strong indication of how the CPPA may ultimately seek to regulate in these areas.
The new materials offer much to discuss at the December 8 Board meeting. The primary focus of discussion will likely be the ADMT regulations, which were not discussed at the September meeting. Those regulations would impose three broad categories of ADMT-related requirements on businesses: providing consumers with a “Pre-use Notice” explaining the business’s use of ADMT; honoring consumers’ requests to opt out of the business’s use of ADMT in relation to their personal information; and responding to consumer requests to access information about the business’s use of ADMT.
The ADMT regulations are particularly noteworthy for companies that use or develop artificial intelligence (AI). These regulations provide concrete rules for businesses that use automated decisionmaking tools in a manner that leads to decisions producing “legal or similarly significant effects concerning a consumer” (i.e., decisions affecting a consumer in areas such as healthcare services, employment, insurance, and other regulated areas). They are especially significant for companies that use AI tools for direct-to-consumer offerings, as such companies may need to provide an opt-out under the proposed rules (to the extent that a business’s use case falls within the above-referenced definition).
The risk assessment and cybersecurity audit regulations also include several revisions worth noting. The risk assessment regulations, for instance, include new provisions specifically addressing risk assessments of ADMT, as well as revisions to existing provisions regarding assessment frequency and reporting to the CPPA. Meanwhile, the Agency’s revisions to the cybersecurity audit regulations focus on three areas: (1) the applicability thresholds dictating which businesses are subject to the regulations; (2) the cybersecurity threats and harms that these audits should consider; and (3) the scope of these audits in relation to contractors and other non-employee personnel.
In this post, we summarize key elements of the CPPA’s new draft ADMT, risk assessment, and cybersecurity audit regulations. We are happy to answer any questions you may have about how these proposed regulations would affect your privacy and cybersecurity compliance programs.
Automated Decisionmaking Technology Regulations
1. Key Definitions: The regulations’ scope of applicability is dictated largely by the CPPA’s definitions of “automated decisionmaking technology,” “decision that produces legal or similarly significant effects concerning a consumer,” and “profiling.” The regulations define each of these terms broadly, as follows:
- Automated decisionmaking technology: “any system, software, or process—including one derived from machine-learning, statistics, or other data-processing or artificial intelligence—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking … includ[ing] profiling.”
- Decision that produces legal or similarly significant effects concerning a consumer: “a decision that results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services.”
- Profiling: “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”
2. Applicability Thresholds: The draft regulations’ three key requirements — pre-use notice, the opt-out right, and the access right (described below) — only apply in relation to specified uses of ADMT. The draft regulations state that these uses include decisions that produce legal or similarly significant effects concerning a consumer; profiling a consumer acting in their capacity as an employee, independent contractor, job applicant, or student; and profiling a consumer in a publicly accessible place. The draft regulations also present, for Board consideration, several additional applicability thresholds, including profiling consumers for behavioral advertising, profiling consumers under the age of 16, and processing personal information to train ADMT. In its separate overview document, the Agency flags these additional thresholds as key topics for discussion during the upcoming Board meeting.
3. Pre-Use Notice (Sec. 7017): The draft regulations require that businesses provide consumers with a “Pre-use Notice” that includes, among other things, an explanation of the purpose for which the business uses ADMT, a description of the consumer’s right to opt out of the business’s use of ADMT, and a description of the consumer’s right to access information about the business’s use of ADMT. The Pre-use Notice must also allow the consumer to obtain additional information about the business’s use of ADMT, including the ADMT’s logic and outputs, the role that ADMT outputs play in the business’s decisionmaking processes, and the results of any evaluation that the business has conducted of the validity, reliability, or fairness of its use of ADMT.
4. Opt-Out Right (Sec. 7030): The draft regulations establish a right for consumers to opt out of a business’s use of ADMT. Businesses that receive opt-out requests are required to cease processing the relevant consumer’s personal information using ADMT no later than 15 business days from the date the request is received. Further, the business is prohibited from using or retaining personal information previously processed by the ADMT and must flow down the opt-out request to service providers, contractors, and other parties as appropriate.
- Exceptions: The draft regulations state that a business is not required to comply with an opt-out request if its use of ADMT “is necessary to achieve, and is used solely for,” one or more specified purposes, including: (1) preventing, detecting, and investigating security incidents; (2) “resist[ing] malicious, deceptive, fraudulent, or illegal actions directed at the business”; (3) “protect[ing] the life and physical safety of consumers”; and (4) providing a good or service specifically requested by the consumer, provided that the business has “no reasonable alternative method” of conducting the relevant processing. In demonstrating that “no reasonable alternative method” exists, the business must show that developing or using an alternative method would be futile; that such an alternative method would not be as valid, reliable, or fair as the ADMT; or that use of an alternative method would impose “extreme hardship” on the business.
- The Agency has flagged these exceptions as another key topic for discussion during the December 8 Board meeting.
5. Access Right (Sec. 7031): The draft regulations establish a right for consumers to access information about a business’s use of ADMT. Upon receiving a consumer’s request to exercise their access right, a business must provide the consumer with explanations of (1) the purpose for which the business used ADMT; (2) the output of the ADMT with respect to the consumer; (3) how the business used that output in relation to decisionmaking concerning the consumer; (4) how the ADMT worked in relation to the consumer (e.g., logic, key parameters); (5) the ADMT’s range of possible outputs; and (6) how the consumer can exercise their other CCPA rights or file a complaint concerning the business’s use of ADMT.
- Affirmative Obligation to Notify: The draft regulations also provide that, if a business uses ADMT to deny a good or service to a consumer, it must notify the consumer of the decision and provide the consumer with information about how to exercise their access right and file a complaint with the CPPA or California Attorney General.
Risk Assessment Regulations
The new draft risk assessment regulations feature four main areas of revisions:
1. Applicability to Use of Personal Information for ADMT/AI Training: The revised draft regulations include, for Board discussion, clarifying language regarding the regulations’ applicability to the processing of personal information for ADMT/AI training purposes. Specifically, the regulations now include language specifying the types of ADMT/AI for which the use of personal information for training purposes would fall within the scope of the risk assessment regulations, including several types of ADMT within the ambit of the ADMT regulations (see above), as well as ADMT and AI that can be used to “[e]stablish individual identity on the basis of biometric information”; conduct facial, speech, and emotion detection; generate deep fakes; or facilitate the operation of generative models (including large language models).
2. Consultation With External Parties Regarding Uses of ADMT/AI: The revised draft regulations now require that, if a business using ADMT or AI has not consulted with external parties (e.g., service providers, contractors, academics, civil society organizations) in the execution of its risk assessment, it must include in its risk assessment an explanation of why it did not do so, as well as the “safeguards it has implemented to address risks to consumers’ privacy that may arise from the lack of external party consultation.”
3. Timing: The revised draft regulations include new provisions related to how frequently businesses must review and update their risk assessments. Specifically, Section 7156 now states that businesses must review and update their risk assessments at least once every three years. This section also includes a proposal for Board discussion that would require businesses to review and update their ADMT-specific risk assessments on a to-be-determined cadence.
4. Submission of Risk Assessment Materials to Agency: The revised draft regulations include, for Board discussion, more detailed guidance (Section 7158) about the risk assessment materials that businesses must annually submit to the CPPA, including a certification of compliance and an abridged risk assessment.
Cybersecurity Audit Regulations
The new draft cybersecurity audit regulations feature three key areas of revisions:
1. Applicability Thresholds: The cybersecurity audit regulations’ applicability thresholds (Section 7120) were a key area of discussion during the September Board meeting. Unsurprisingly, then, this section was subject to significant revisions in the latest draft regulations. The revised version retains the provision making the regulations applicable to data brokers (i.e., entities deriving at least 50% of their annual revenue from selling or sharing personal information). However, the latest draft narrows the options for how the regulations might apply to non-data brokers. Specifically, the latest revision eliminates options that would have applied the regulations to certain businesses based solely on their gross revenue or number of employees. Instead, the latest draft proposes an applicability framework based on a combination of a revenue threshold and three personal information processing thresholds (for personal information generally, sensitive personal information, and personal information of consumers under the age of 16, respectively). Thus, a business would have to satisfy the revenue threshold and one of the three information processing thresholds in order to be subject to the cybersecurity audit regulations. The revised draft regulations include various options for what, precisely, these thresholds should be. For example, the options for the general personal information processing threshold range from 250,000 to 1,000,000 consumers or households. It is likely that these thresholds will be a topic of discussion during the December 8 meeting.
2. Cybersecurity Threats and Harms Considered: The previous draft of the cybersecurity audit regulations included, in Section 7123(b), two options for a provision regarding the types of cybersecurity threats and harms that audits should account for. One option consisted of relatively lengthy guidance regarding specific types of cybersecurity harms that audits should assess, including, for example, unauthorized access, economic harm, physical harm, and reputational harm. The other option was significantly shorter, requiring only that the audits “assess and document any risks from cybersecurity threats, including as a result of any cybersecurity incidents, that have materially affected or are reasonably likely to materially affect consumers.” The latest revised draft regulations adopt the latter, shorter approach. However, the draft does note that CPPA staff will “propose further revisions to this subsection based on Board feedback,” meaning that this provision is likely to be discussed on December 8.
3. Accounting for Contractors and Other Non-Employee Personnel: The draft regulations include a section (Section 7123(c)(2)) that specifies the types of security safeguards that cybersecurity audits should assess. In its latest revisions, the Agency makes clear that audits should assess these safeguards as they apply not just to the business itself, but also to the business’s contractors and other non-employee personnel. For instance, the latest draft broadens the regulations’ section on assessment of account management and access controls (Section 7123(c)(2)(D)) to apply to “employee[s], independent contractor[s], [and] any other personnel,” rather than just employees. Similarly, in Section 7123(c)(2)(M), the revisions call for audits to assess whether the business provides cybersecurity training for “employee[s], independent contractor[s], and any other personnel.”