Latest news with #CaliforniaPrivacyProtectionAgency


Business Journals
30-05-2025
- Business
- Business Journals
Understanding how the latest changes to California privacy law may impact New York companies
Businesses located in and outside of California may be subject to additional obligations under the California Consumer Privacy Act (CCPA), as amended this year. The amendments include steeper fines for violations of the CCPA and its accompanying regulations. They also modify existing rights, while additional proposed regulatory changes would impose new obligations regarding cybersecurity audit record retention, risk assessment deadlines, and procedures for using automated decision-making technology (ADMT), among other things. This article highlights some amendments of interest that took effect on Jan. 1, 2025, as well as regulatory proposals that may take effect as early as Oct. 31, 2025. Covered businesses that meet certain threshold revenue and activity requirements, share common branding with a business subject to the CCPA, or have certain business relationships with other companies subject to the CCPA should pay attention to these amendments, with more on the horizon.

Increased fines

Fines for certain violations increased as follows:
- Unintentional violation: Fine increased from $2,500 to $2,663 per violation.
- Intentional violation: Fine increased from $7,500 to $7,988 per violation.
- Intentional violations involving minors: Fine of $7,988 per violation for violations involving minors under 16 years of age.
- Civil penalties: Statutory damages for each person per incident range from $107 to $799, or actual damages, whichever is greater.

New obligations of covered businesses

The amendments also modify existing rights of, and add obligations imposed on, businesses. Those obligations include:
- Neural data: This information, generated by measuring activity of the nervous system, is now considered 'sensitive personal information.' The same privacy protections afforded to other sensitive personal information (e.g., precise geolocation, citizenship, racial or ethnic origin) extend to neural data, including consent to collect or use it and compliance with requests to delete it or opt out of its sharing.
- Opt-out in mergers: Entities that acquire other businesses through mergers and acquisitions must honor opt-out requests made to the acquired company.

The California Privacy Protection Agency (CPPA), a state agency established to implement and enforce the CCPA, has also proposed regulatory changes that would create new obligations for businesses and may take effect later this year:
- Audit record retention: A covered business, not just its auditor, must keep a record of its annual cybersecurity audits for at least five years.
- Risk assessment: While no deadline previously existed, covered businesses must now update their privacy risk assessments within 45 days of any material change in data processing activities that introduces new risks or may weaken personal data protections.
- ADMT: Covered businesses will be required to provide information about their use of ADMT in significant decision-making (e.g., financial services, employment screening, pricing) upon a resident's request. Businesses must also accommodate a resident's appeal of the business's use of ADMT, or the resident's request to opt out of ADMT.

The proposed regulatory amendments are subject to change based on comments submitted to the CPPA after the time of writing.

Compliance strategy

Businesses need to determine whether they are subject to the CCPA directly or through entities with which they have business relationships. To assist in this analysis and in developing a compliance program, businesses should map their data collection, processing, and transfer activities; evaluate the sufficiency of their risk assessment and audit procedures; and review their opt-out mechanisms. Experts who are well versed in these issues and in your industry can be particularly helpful. Anna Mercado Clark, Partner and Chief Information Security Officer at Phillips Lytle, is the Co-Leader of the firm's Technology Industry Team. She can be reached at aclark@ or 212-508-0466.


Business Mayor
08-05-2025
- Business
- Business Mayor
California regulator weakens AI rules, giving Big Tech more leeway to track you
California's first-in-the-nation privacy agency is retreating from an attempt to regulate artificial intelligence and other forms of computer automation. The California Privacy Protection Agency was under pressure to back away from rules it drafted. Business groups, lawmakers, and Gov. Gavin Newsom said they would be costly to businesses, potentially stifle innovation, and usurp the authority of the legislature, where proposed AI regulations have proliferated.

In a unanimous vote last week, the agency's board watered down the rules, which impose safeguards on AI-like systems. Agency staff estimate that the changes reduce the cost for businesses to comply in the first year of enforcement from $834 million to $143 million, and predict that 90% of businesses initially required to comply will no longer have to do so.

The retreat marks an important turn in an ongoing and heated debate over the board's role. Created following the passage of state privacy legislation by lawmakers in 2018 and voters in 2020, the agency is the only body of its kind in the United States. The draft rules have been in the works for more than three years, but were revisited after a series of changes at the agency in recent months, including the departures of two leaders seen as pro-consumer: Vinhcent Le, a board member who led the AI rules drafting process, and Ashkan Soltani, the agency's executive director. Consumer advocacy groups worry that the recent shifts mean the agency is deferring excessively to businesses, particularly tech giants.

The changes approved last week mean the agency's draft rules no longer regulate behavioral advertising, which targets people based on profiles built up from their online activity and personal information. In a prior draft of the rules, businesses would have had to conduct risk assessments before using or implementing such advertising. Behavioral advertising is used by companies like Google, Meta, and TikTok and their business clients. It can perpetuate inequality, pose a threat to national security, and put children at risk.

The revised draft rules also eliminate use of the phrase 'artificial intelligence' and narrow the range of business activity regulated as 'automated decisionmaking,' which also requires assessments of the risks in processing personal information and the safeguards put in place to mitigate them. Supporters of stronger rules say the narrower definition of 'automated decisionmaking' allows employers and corporations to opt out of the rules by claiming that an algorithmic tool is only advisory to human decision making.

'My one concern is that if we're just calling on industry to identify what a risk assessment looks like in practice, we could reach a position by which they're writing the exam by which they're graded,' board member Brandie Nonnecke said during the meeting.

'The CPPA is charged with protecting the data privacy of Californians, and watering down its proposed rules to benefit Big Tech does nothing to achieve that goal,' Sacha Haworth, executive director of Tech Oversight Project, an advocacy group focused on challenging policy that reinforces Big Tech power, said in a statement to CalMatters. 'By the time these rules are published, what will have been the point?'

The draft rules retain some protections for workers and students in instances when a fully automated system determines outcomes in finance and lending services, housing, and health care without a human in the decisionmaking loop. Businesses and the organizations that represent them made up 90% of comments about the draft rules before the agency held listening sessions across the state last year, Soltani said at a meeting last year.
In April, following pressure from business groups and legislators to weaken the rules, a coalition of nearly 30 unions, digital rights, and privacy groups wrote a joint letter urging the agency to continue work to regulate AI and protect consumers, students, and workers. Roughly a week later, Gov. Newsom intervened, sending the agency a letter stating that he agreed with critics that the rules overstepped the agency's authority and supported a proposal to roll them back. Newsom cited Proposition 24, the 2020 ballot measure that paved the way for the agency. 'The agency can fulfill its obligations to issue the regulations called for by Proposition 24 without venturing into areas beyond its mandate,' the governor wrote.

The original draft rules were great, said Kara Williams, a law fellow at the advocacy group Electronic Privacy Information Center. On a phone call ahead of the vote, she added that 'with each iteration they've gotten weaker and weaker, and that seems to correlate pretty directly with pressure from the tech industry and trade association groups, so that these regulations are less and less protective for consumers.'

The public has until June 2 to comment on the alterations to the draft rules. Companies must comply with the automated decisionmaking rules by 2027. Prior to voting to water down its own regulation last week, at the same meeting the agency board voted to throw its support behind four draft bills in the California Legislature, including one that protects the privacy of people who connect computing devices to their brains and another that prohibits the collection of location data without permission.


Associated Press
07-05-2025
- Business
- Associated Press
California regulator weakens AI rules, giving Big Tech more leeway to track you
California's first-in-the-nation privacy agency is retreating from an attempt to regulate artificial intelligence and other forms of computer automation. The California Privacy Protection Agency was under pressure to back away from rules it drafted. Business groups, lawmakers, and Gov. Gavin Newsom said they would be costly to businesses, potentially stifle innovation, and usurp the authority of the legislature, where proposed AI regulations have proliferated.

In a unanimous vote last week, the agency's board watered down the rules, which impose safeguards on AI-like systems. Agency staff estimate that the changes reduce the cost for businesses to comply in the first year of enforcement from $834 million to $143 million, and predict that 90% of businesses initially required to comply will no longer have to do so.

The retreat marks an important turn in an ongoing and heated debate over the board's role. Created following the passage of state privacy legislation by lawmakers in 2018 and voters in 2020, the agency is the only body of its kind in the United States. The draft rules have been in the works for more than three years, but were revisited after a series of changes at the agency in recent months, including the departures of two leaders seen as pro-consumer: Vinhcent Le, a board member who led the AI rules drafting process, and Ashkan Soltani, the agency's executive director. Consumer advocacy groups worry that the recent shifts mean the agency is deferring excessively to businesses, particularly tech giants.

The changes approved last week mean the agency's draft rules no longer regulate behavioral advertising, which targets people based on profiles built up from their online activity and personal information. In a prior draft of the rules, businesses would have had to conduct risk assessments before using or implementing such advertising. Behavioral advertising is used by companies like Google, Meta, and TikTok and their business clients. It can perpetuate inequality, pose a threat to national security, and put children at risk.

The revised draft rules also eliminate use of the phrase 'artificial intelligence' and narrow the range of business activity regulated as 'automated decisionmaking,' which also requires assessments of the risks in processing personal information and the safeguards put in place to mitigate them. Supporters of stronger rules say the narrower definition of 'automated decisionmaking' allows employers and corporations to opt out of the rules by claiming that an algorithmic tool is only advisory to human decision making.

'My one concern is that if we're just calling on industry to identify what a risk assessment looks like in practice, we could reach a position by which they're writing the exam by which they're graded,' board member Brandie Nonnecke said during the meeting.

'The CPPA is charged with protecting the data privacy of Californians, and watering down its proposed rules to benefit Big Tech does nothing to achieve that goal,' Sacha Haworth, executive director of Tech Oversight Project, an advocacy group focused on challenging policy that reinforces Big Tech power, said in a statement to CalMatters. 'By the time these rules are published, what will have been the point?'

The draft rules retain some protections for workers and students in instances when a fully automated system determines outcomes in finance and lending services, housing, and health care without a human in the decisionmaking loop. Businesses and the organizations that represent them made up 90% of comments about the draft rules before the agency held listening sessions across the state last year, Soltani said at a meeting last year.
In April, following pressure from business groups and legislators to weaken the rules, a coalition of nearly 30 unions, digital rights, and privacy groups wrote a joint letter urging the agency to continue work to regulate AI and protect consumers, students, and workers. Roughly a week later, Gov. Newsom intervened, sending the agency a letter stating that he agreed with critics that the rules overstepped the agency's authority and supported a proposal to roll them back. Newsom cited Proposition 24, the 2020 ballot measure that paved the way for the agency. 'The agency can fulfill its obligations to issue the regulations called for by Proposition 24 without venturing into areas beyond its mandate,' the governor wrote.

The original draft rules were great, said Kara Williams, a law fellow at the advocacy group Electronic Privacy Information Center. On a phone call ahead of the vote, she added that 'with each iteration they've gotten weaker and weaker, and that seems to correlate pretty directly with pressure from the tech industry and trade association groups, so that these regulations are less and less protective for consumers.'

The public has until June 2 to comment on the alterations to the draft rules. Companies must comply with the automated decisionmaking rules by 2027. Prior to voting to water down its own regulation last week, at the same meeting the agency board voted to throw its support behind four draft bills in the California Legislature, including one that protects the privacy of people who connect computing devices to their brains and another that prohibits the collection of location data without permission.

___

This story was originally published by CalMatters and distributed through a partnership with The Associated Press.