Calif. Privacy Board Debates Breadth of Automated Decision-Making Rules
California should be cautious about adopting limits from other jurisdictions' privacy laws as it decides how to apply rules to AI, California Privacy Protection Agency Chairperson Jennifer Urban said Friday. But at the board's virtual meeting, board member Alastair Mactaggart raised concerns that the CPPA's proposed definition of automated decision-making technology (ADMT) is too broad. The CPPA board discussed pre-rulemaking proposals on cybersecurity audits, risk assessments and ADMT that privacy experts say could affect many industries, including communications and the internet (see 2312060021).
The board voted 5-0 to direct staff to advance the cybersecurity audits draft to formal rulemaking. Members decided unanimously to further revise the risk assessment and ADMT proposals and possibly advance those at a later meeting.
Urban questioned using the phrase “legal or similarly significant effects” as a threshold for identifying practices where risk assessment and ADMT rules would apply. That phrase appears in the EU’s General Data Protection Regulation and Colorado’s privacy law, but not in California’s law, she said. Urban said she has concerns about putting in "a limitation that doesn't exist in our law, using language that exists in another law that has a different set of defaults." Board member Vinhcent Le said the phrase isn’t “extremely necessary” to keep, but the goal was to have an “easily named right.”
Mactaggart said ADMT's proposed definition seems to cover any software. The CPPA proposal would define it as "any system, software, or process … that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking.” Mactaggart asked, “We're going to be saying to every business, essentially, why are you using your software?" That probably doesn’t enhance privacy or security, he said. He supported limiting the rules to decisions that have legal or similarly significant effects.
A large business using a system that controls access to healthcare or employment needs to know whether that system is “reliable, valid or fair,” said Le. “It’s incumbent on you if you’re making these critical decisions that you know if your system works or not.” It’s better to start with a protective model and pull back later if businesses say it’s not working, added Urban: “It is always much easier to bake privacy in than to bolt it on afterwards.”
It could “break a lot of stuff” if employees are allowed to opt out of workplace monitoring and other HR policies, as the ADMT draft seems to propose, Mactaggart said. Businesses rely on those practices to ensure safe operations and fair treatment of customers, he said, though employees should know what kind of monitoring is taking place.
The board’s Jeffrey Worthe is concerned about creating a loophole with a proposed exemption that would allow companies to prohibit customers from opting out of ADMT if the business needs it to provide a requested service. Earlier, Worthe asked if proposed thresholds for determining whether a business must perform a risk assessment are written too specifically. The proposal would cover the use of biometrics and facial recognition, the generation of deepfakes and the use of generative models. Worthe asked if that’s "broad enough to cover where this language needs to be years out." Because they are rules, the board can update them in a few years, replied the board’s Lydia de la Torre. Also, she said the state’s Office of Administrative Law could object to the rules if they lack specificity.
Board members also debated what thresholds should determine which businesses must conduct cybersecurity audits. Requiring them for businesses with at least $25 million in annual gross revenue, as currently proposed, would cover 20,000 to 30,000 California businesses at most, said CPPA General Counsel Phillip Laird, cautioning that estimates are preliminary. Raising that to at least $50 million would reduce the number of businesses covered to between 10,000 and 20,000, he said. Applying another proposed threshold on how much personal information is processed didn’t affect the ranges, he added.
The CPPA could try carving out small businesses by adding a threshold on employee count, suggested de la Torre. But Urban, Worthe and Mactaggart disagreed. Some companies with few employees have high revenue and handle a lot of personal information, said Urban, and she wouldn’t want to incentivize businesses to have fewer employees.
Worthe said he wants to better understand the cost of the cybersecurity audits that would be required. Also, he favored giving companies 24 months to submit their first risk assessment, saying that would be “helpful to businesses starting something new.” Other board members likewise supported giving companies two years.
The CPPA should ultimately handle the three proposals on ADMT, risk assessments and cybersecurity audits through a single rulemaking package, Laird recommended. The package should also include proposed insurance rules and updates to existing rules, he said. The meeting continued after our deadline.