In Rodgers v. Christie, a recent non-precedential decision, the United States Court of Appeals for the Third Circuit examined whether traditional strict products liability doctrines apply to artificial intelligence-based software. 2020 WL 1079233 (3d Cir. Mar. 6, 2020). There, plaintiffs asserted claims under the New Jersey Products Liability Act (“PLA”), arising from the State’s Public Safety Assessment (“PSA”). Id. at *1. The PSA is a “data-based” risk assessment algorithm which provides quantitative scores and a “decision-making framework” to assist courts in “assess[ing] the risk that [a] criminal defendant will fail to appear for future court appearances or commit additional crimes and/or violent crimes if released.” See Rodgers v. Laura and John Arnold Foundation, 2019 WL 2429574, at *1 (D.N.J. June 11, 2019), aff’d sub nom. Rodgers v. Christie, 2020 WL 1079233. Plaintiffs’ strict products liability claims put the PSA at issue, alleging the algorithm had assigned an erroneously low score to a convicted felon, who allegedly murdered their son three days after he was released from detention on non-monetary conditions. 2020 WL 1079233, at *1.
The trial court granted defendants’ motion to dismiss those claims on the basis that an algorithm, such as the PSA, cannot be considered a “product” subject to the PLA. In so holding, the trial court looked to the Restatement (Third) of Torts, which articulates two categories of products: (1) “tangible personal property distributed commercially”; and (2) “[o]ther items, such as property and electricity . . . when the context of their distribution and use is sufficiently analogous to . . . tangible personal property.” Id. (citing Restatement (Third) of Torts § 19). Thus, the court reasoned, because the PSA does not fit into either of these categories, it is not a “product” subject to the PLA, and plaintiffs’ claims could not proceed. 2019 WL 2429574, at *2-3.
On appeal, the Third Circuit upheld the dismissal of plaintiffs’ claims, holding that the PSA does not fit the definition of a “product” for purposes of the PLA for two reasons. First, the PSA, as a tool designed to assist courts, is not distributed commercially. Second, “information, guidance, ideas, and recommendations are not products under the Third Restatement, both as a definitional matter and because extending strict liability to the distribution of ideas would raise serious First Amendment concerns.” 2020 WL 1079233, at *2 (internal quotations omitted). Importantly, the Court did not adopt a bright-line rule barring strict products liability claims for all artificial intelligence-based software; it barred only those claims involving software that does not fit the Restatement’s definition.
While the Third Circuit’s decision in Rodgers is non-precedential, it addresses a question many have flagged as central to the development of legal norms around emerging artificial intelligence-based products: whether artificial intelligence software is a product at all. As the Court astutely noted, this is a thorny question, implicating concerns, such as the First Amendment, that reach far beyond standard tort claims. All manner of commercial and consumer products are incorporating artificial intelligence, and courts around the country will be forced to answer this same question to determine how laws can appropriately address injuries arising from such products.