Calif. Privacy Board Weighs Automated Decision Rules Next Week
The California Privacy Protection Agency (CPPA) board plans to meet April 4 at 8:30 a.m. PT to discuss and possibly act on proposed regulations on automated decision-making technology (ADMT), cybersecurity audits, insurance and other California Consumer Privacy Act (CCPA) rule updates, the agency said Monday. The CPPA unveiled draft rule revisions -- and plans to discuss bigger possible changes -- in meeting materials released the same day.
“The Agency staff has proposed various operationally significant modifications to the draft regulations, but on balance they wouldn't shift the overall scope and thrust of the regulations,” Future of Privacy Forum Senior Director Keir Lamont said in an email Tuesday. He also highlighted key changes in a LinkedIn post.
Lamont predicted stakeholders will closely watch the board’s discussion of a presentation “on some major potential revisions to the draft regulations, including narrowing the scope of key definitions such as ‘automated decisionmaking technology’ and ‘consequential decision’ as well as walking back several more controversial elements of the proposed regulations, such as extending the new rules to ‘behavioral advertising’ based on first-party data and ‘training’ ADMT systems.”
The agency clarified aspects of proposed cybersecurity audit requirements, including by adding a definition of “cybersecurity audit report” and specifying due dates, according to a document explaining the latest modifications.
For example, the rules now say that a business must complete its first audit no later than Jan. 1, 2028, if its data processing is determined to present significant risk to consumers’ security on or before the rules’ effective date.
Also, the CPPA added neural data to the definition of sensitive personal information. Under another change, the agency included an “exception to ‘physical or biological identification or profiling’ to clarify that processing physical or biological characteristics that do not identify, and cannot reasonably be linked with, a particular consumer is not in scope of the definition.”
Meanwhile, the CPPA’s presentation tees up board discussion on the definition of ADMT and significant decisions; thresholds for behavioral advertising, work or educational profiling, public profiling and training; and risk assessment submissions to the agency.
Under the current draft, ADMT means “any technology that processes personal information and uses computation to execute a decision, replace human decisionmaking or substantially facilitate human decisionmaking.” Under an alternative in the presentation, ADMT would instead mean “a computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues simplified output, including a score, classification, or recommendation, that processes personal information and that is used to assist or replace discretionary human decisionmaking and materially impacts consumers.”
Under a second alternative, ADMT would mean “any technology that processes personal information and uses computation to replace human decisionmaking or substantially replace human decisionmaking,” with the latter phrase meaning that “a business uses the technology’s output to make a decision without human involvement.”
Under alternative three, ADMT would mean “any technology that processes personal information and uses computation to replace human decisionmaking for the purpose of making a solely-automated significant decision about a consumer,” the presentation said.
The presentation also seeks discussion on whether to remove behavioral advertising and profiling and training thresholds from the scope of ADMT and risk-assessment requirements.
Last month, a bipartisan group of 18 California state legislators told the CPPA that the organization lacks authority to regulate AI and should scale back proposed ADMT rules (see 2502200025). Small businesses and industry groups also raised red flags at hearings in January and February (see 2501150017), though labor representatives and consumer privacy advocates testified that the state privacy agency possesses authority and should regulate ADMT.
"The current draft regulations give people some rights if AI is used in the criminal legal system, like being notified that AI is being used and getting an explanation of the output," Jake Snow, technology and civil liberties attorney at the American Civil Liberties Union of Northern California, said in a written statement Tuesday. "But the board is considering removing those protections ... People shouldn’t be profiled, investigated, imprisoned, kept in prison, or deprived while incarcerated on the basis of black-box algorithms designed by companies trying to make a buck off government contracts. The proposed change would leave some of the most marginalized Californians behind, who are disproportionately Black and Latinx."