First Consumer-Facing AI Governance Rules Enacted in U.S.

In an important development in U.S. AI regulation, California enacted its automated decisionmaking technology (ADMT) rules in September 2025, the first enacted, broadly scoped, consumer-facing AI governance rules in the country. The rules grant consumers opt-out rights and logic disclosures for AI-driven significant decisions. They took effect on October 1, 2025; covered businesses that use ADMT in significant decisions before that date have until January 1, 2027, to comply. Read the full Alert on the Duane Morris website.

Updated Artificial Intelligence Regulations for California Employers

With artificial intelligence developing at breakneck speed, California employment regulation is following close behind. Updated regulations issued by the California Civil Rights Council address the use of artificial intelligence, machine learning, algorithms, statistics and other automated-decision systems (ADS) in employment decisions. The updated rules, which took effect October 1, 2025, amend existing regulations at Cal. Code Regs., tit. 2, and are designed to protect against potential employment discrimination. They apply to all employers with five or more employees, wherever those employees work, so long as at least one is located in California. Read the full Alert on the Duane Morris website.

Artificial Intelligence Errors for Construction Contractors

In a recent Commercial Construction Renovation article, Duane Morris attorneys Robert H. Bell and Michael Ferri write:

Artificial intelligence (“AI”) is rapidly making its way into the construction bidding process. Contractors now use AI-powered estimating software to perform quantity takeoffs and analyze costs with unprecedented speed. According to the drafting and engineering software giant Autodesk, estimating teams are increasingly using AI and automation, particularly for quantity takeoffs, cost forecasting, and speeding up bid creation. Yet as digital tools become routine, legal rules governing bids still rely on traditional principles. This raises a pressing question: if an AI tool makes a costly error in a bid, will the legal system treat that mistake any differently than a human error? Courts are only beginning to grapple with AI-related mishaps, but early indications suggest AI errors will be handled much like any other bidding mistake. In other words, contractors will likely be held responsible for errors made by their AI tools, just as they are responsible for the mistakes of human estimators or means and methods under their control.

“Responsible Use of AI in Healthcare” Guidance

On September 17, 2025, the Joint Commission and Coalition for Health AI issued a joint guidance document entitled “Responsible Use of AI in Healthcare” to help providers implement AI while mitigating the risks of its use. The guidance provides seven elements that constitute responsible AI use in healthcare and discusses how provider organizations can implement them. Read the full Alert on the Duane Morris website.

Managing Compliance Challenges of Artificial Intelligence Pricing Tools

Duane Morris special counsel Justin Donoho authored the Journal of Robotics, Artificial Intelligence & Law article, “Ten Design Guidelines to Mitigate the Risk of AI Pricing Tool Noncompliance with the Federal Trade Commission Act, Sherman Act, and Colorado AI Act.” The article is available here and is a must-read for corporate counsel involved with development or deployment of AI pricing tools.

Northern District of California Allows CIPA Claims Against AI Pizza Ordering Assistant to Proceed

On August 11, 2025, Judge Susan Illston of the Northern District of California denied a motion to dismiss in Taylor v. ConverseNow Technologies, Inc. (Case No. 25-cv-00990-SI), allowing claims under California’s Invasion of Privacy Act (CIPA) Sections 631 and 632 to move forward against an AI voice assistant provider. ConverseNow provides artificial intelligence voice assistant technology that restaurants, including Domino’s, use to answer phone calls, process orders and capture customer information. The plaintiff alleged that when she placed a pizza order by phone, her call was intercepted and routed through ConverseNow’s servers, where her name, address and credit card details were recorded without her knowledge or consent. Read the full Alert on the Duane Morris website.

Third Circuit Clarifies Standing Requirements for Session Replay Privacy Claims

The United States Court of Appeals for the Third Circuit issued a decision on August 7, 2025, in Cook v. GameStop, Inc. that provides important guidance on Article III standing in challenges to session replay technology, affirming dismissal of a putative class action. The ruling offers clarity for companies deploying website analytics tools while establishing clearer pleading requirements for privacy plaintiffs. Read the full Alert on the Duane Morris website.

Court Revives Wiretap and CDAFA Claims Against Retailer Over Use of Embedded Website Tracking Code

A California federal court has allowed privacy claims to proceed against Rack Room Shoes based on its use of embedded tracking tools on its website—signaling that companies may face liability under both state and federal privacy laws, even where data collection is disclosed in a privacy policy. In Smith v. Rack Room Shoes, Inc. (2025 WL 2210002), decided August 4, 2025, Judge Rita Lin of the Northern District of California declined to dismiss claims brought under the federal Wiretap Act and California’s Comprehensive Computer Data Access and Fraud Act (CDAFA). Read the full Alert on the Duane Morris website.

© 2009-2025 Duane Morris LLP. Duane Morris is a registered service mark of Duane Morris LLP.

The opinions expressed on this blog are those of the author and are not to be construed as legal advice.
