Data Privacy Class Action Alleges Insurers Improperly Collected The Data Of 40 Million Users Through Third-Party Applications

By Gerald L. Maatman, Jr., Justin Donoho, George J. Schaller, Ryan T. Garippo

Duane Morris Takeaways: In Mahoney, et al. v. The Allstate Corp., et al., No. 25-CV-01465 (N.D. Ill. Feb. 11, 2025), Plaintiffs Michael Mahoney and Scott Schultz (collectively, “Plaintiffs”) filed a putative class action lawsuit asserting that Allstate and its subsidiary Arity illegally obtained the personal driving data of 40 million policyholders through third-party mobile application software.  The case is pending in the U.S. District Court for the Northern District of Illinois before Judge Steven C. Seeger.  This is the third in a series of lawsuits asserting class-wide claims based on Allstate’s alleged data collection practices.  See Sims, et al. v. The Allstate Corp., et al., No. 1:25-CV-00407 (N.D. Ill. Jan. 14, 2025) (alleging data collection through the third-party application Sirius XM); see also Arellano, et al. v. The Allstate Corp., et al., No. 1:25-CV-01256 (N.D. Ill. Feb. 5, 2025) (alleging data collection through the third-party applications Life360, GasBuddy, and Fuel Rewards).

Mahoney, Sims, and Arellano represent a triumvirate of data privacy class actions centered on allegations of improper data collection through third-party applications.  Companies will be well-served to monitor these cases for their novel assertions in trending data privacy litigation.

Complaint Allegations

Michael Mahoney resides in San Francisco, California, and he downloaded the GasBuddy application in 2011 to “find competitive gas prices.”  Mahoney, 25-CV-01465, ECF No. 1 § III ¶ 14 (N.D. Ill. Feb. 11, 2025).  Scott Schultz resides in Highland Park, Illinois, and he downloaded the GasBuddy application in 2021 and used it “in his own and other people’s vehicles to find competitive gas prices.”  Id. § III ¶ 15.

Plaintiffs collectively allege that Allstate and its subsidiary Arity (collectively, “Defendants”) “conspired to collect drivers’ geolocation data and movement data from mobile devices, in-car devices, and vehicles.”  Id. § IV ¶ 7.  Plaintiffs allege Defendants designed a software development kit that could be integrated into third-party mobile applications such as “Routely, Life360, GasBuddy, and Fuel Rewards.”  Id. § IV ¶ 8.  Plaintiffs further allege Defendants advertised that they “collect data ‘every 15 seconds or less’ from 40 million ‘active mobile connections’ and ‘derive[] unique insights that help insurers, developers, marketers, and communities understand and predict driving behavior at scale.’”  Id. § IV ¶ 24.

Plaintiffs contend Defendants’ software development kit was “designed to and does collect data” including “Geolocation data and ‘GPS Points,’” “cellphone accelerometer, magnetometer, and gyroscopic data,” “Trip attributes” data (including start and end locations, trip distances, trip duration), “Derived events” data (including acceleration, speeding, distracted driving, crash detection), and “Metadata.”  Id. § IV ¶ 11 (A) – (E).  Plaintiffs further assert that, when they used these third-party applications, “Defendants could collect real-time data on their locations and movements and surreptitiously collect highly sensitive and valuable data directly from Plaintiffs’ mobile phones.”  Id. § IV ¶ 16.
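
For illustration only, the categories of data described in the Complaint can be pictured as a single trip record along the following lines.  This is a hypothetical sketch; the field names and structure are assumptions and do not reflect Arity’s actual software development kit or data schema.

```python
# Hypothetical sketch of the data categories alleged in the Complaint
# (geolocation/GPS points, motion-sensor readings, trip attributes,
# derived events, and metadata). Field names are invented for
# illustration; this is not Arity's SDK or its actual schema.
from dataclasses import dataclass, field

@dataclass
class TripRecord:
    gps_points: list[tuple[float, float]]              # (latitude, longitude) samples
    accelerometer: list[tuple[float, float, float]]    # x, y, z readings
    start_location: tuple[float, float]
    end_location: tuple[float, float]
    distance_miles: float
    duration_seconds: int
    derived_events: list[str] = field(default_factory=list)   # e.g., "speeding", "hard_braking"
    metadata: dict[str, str] = field(default_factory=dict)    # device model, app version, etc.
```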

Plaintiffs also maintain that Defendants used their personal data to “develop, advertise, and sell several products and services to third parties, including insurance companies . . .” and used the purchased consumer data for “[Defendants’] own underwriting purposes.”  Id. § IV ¶ 23.  Plaintiffs assert that Defendants’ real purpose in using this data is their “own financial and commercial benefit” and obtaining “substantial profit.”  Id. § V ¶ 49.  They ultimately assert via their nine-count Complaint that this technology amounts to a wiretapping of their personal information, which entitles them, inter alia, to a sum of “$100 per day per violation or $10,000” per class member, whichever is greater.  Id. § V ¶ 51.

Implications For Companies

Although such data collection lawsuits are no longer a new phenomenon, their scope has become far more aggressive as the plaintiffs’ bar continues to look for ways to monetize lawsuits against corporations using such technologies.

Take, for example, the dilemma presented by Mahoney.  It is likely that Defendants will have strong defenses to this action.  For example, Plaintiffs admit that Defendants’ purpose in using this technology was to earn “substantial profit.”  Id. § V ¶ 49.  Based on similar allegations, many courts have found that a profit motive alone is insufficient for a plaintiff to invoke such wiretapping statutes.  See, e.g., Katz-Lacabe v. Oracle Am., Inc., 668 F. Supp. 3d 928, 945 (N.D. Cal. 2023) (dismissing wiretap claim because defendant’s “purpose has plainly not been to perpetuate torts on millions of Internet users, but to make money.”).

There are, however, enough court rulings that come out in the opposite direction to give a corporate defendant pause.  See, e.g., R.S. v. Prime Healthcare Services, Inc., No. 24-CV-00330, 2025 WL 103488, at *6-7 (C.D. Cal. Jan. 13, 2025) (recognizing the split and siding with the plaintiffs).  And, if Plaintiffs are correct that there are 40 million individuals in the class, and that each class member is entitled to $10,000 at a minimum, then this lawsuit alleges at least $400 billion in liability.  Even if there is a 1% chance of success on these claims, it would suggest that the completely unrealistic figure of $4 billion is on the table.
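
A back-of-the-envelope calculation, using only the figures discussed above, illustrates the scale of the alleged exposure; it is not a damages model or a prediction of any outcome.

```python
# Back-of-the-envelope sketch of the exposure math discussed above.
# The class size and per-member statutory minimum come from the
# allegations; the 1% figure is the illustrative probability used in
# the text. This is not a damages model or a prediction of outcome.

class_size = 40_000_000        # alleged class members
statutory_minimum = 10_000     # greater of $100 per day per violation or $10,000

alleged_exposure = class_size * statutory_minimum
print(f"Alleged aggregate exposure: ${alleged_exposure:,}")      # $400,000,000,000

chance_of_success = 0.01
expected_value = int(alleged_exposure * chance_of_success)
print(f"Expected value at a 1% chance: ${expected_value:,}")     # $4,000,000,000
```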

Corporations in these types of class actions are faced with the difficult choice of settling the claims for an astronomical figure based on the use of technologies that are ubiquitous (like software development kits for mobile applications) or defending a $400 billion lawsuit based on defenses in an area of the law that is not fully developed.  It will be interesting to see how the Mahoney defendants balance these concerns as the case progresses, because many twists and turns lie ahead.

In the meantime, corporate counsel should take the opportunity to evaluate their companies’ data collection and privacy policies to make sure their companies are not easy targets.  If the allegations in Mahoney are any example, the mere threat of one of these lawsuits should be enough to keep corporate counsel up at night.  And, if their companies are ultimately sued in one of these lawsuits, they should ensure that an experienced defense team has its hands on the steering wheel. 

The Class Action Weekly Wire – Episode 89: Key Trends In Privacy Class Actions

Duane Morris Takeaway: This week’s episode of the Class Action Weekly Wire features Duane Morris partner Jerry Maatman, special counsel Justin Donoho, and senior associate Tyler Zmick with their discussion of the key trends analyzed in the 2025 edition of the Duane Morris Privacy Class Action Review, including the major settlements and cutting-edge litigation theories percolating in a variety of privacy-related areas, such as the Biometric Information Privacy Act (“BIPA”), advertising technologies (“adtech”), and artificial intelligence tools.

Bookmark or download the Privacy Class Action Review e-book here, which is fully searchable and accessible from any device.

Check out today’s episode and subscribe to our show from your preferred podcast platform: Spotify, Amazon Music, Apple Podcasts, Samsung Podcasts, Podcast Index, Tune In, Listen Notes, iHeartRadio, Deezer, and YouTube.

Episode Transcript

Jerry Maatman: Welcome, loyal listeners, to the next installment of the Class Action Weekly Wire. My name is Jerry Maatman, I’m a partner at Duane Morris, and joining me today are my colleagues, Justin and Tyler.

Justin Donoho: Thank you, Jerry, happy to be part of the podcast.

Tyler Zmick: Thanks, Jerry. I’m glad to be here.

Jerry: Today on our podcast we’re discussing the recent publication of this year’s edition of the Duane Morris Privacy Class Action Review. Our loyal listeners can download the desk reference from our blog, the Duane Morris Class Action Defense Blog. Justin, can you tell our listeners a little bit about our desk reference?

Justin: Yes, and thank you. Last year saw a continued explosion in privacy class action litigation. As a result, it is imperative that companies beef up their efforts to comply with privacy laws in the many ways that companies interact with employees, customers, and others. To that end, the class action team at Duane Morris is pleased to present the Privacy Class Action Review – 2025. This publication analyzes the key privacy-related rulings and developments in 2024, and the significant legal decisions and trends impacting privacy class action litigation for 2025 in a variety of different privacy-related subject areas. We hope that companies and employers will benefit from this resource in their compliance with these evolving laws and standards.

Jerry: Well, just on this podcast I know the assembled speakers have over 60 years of experience in dealing with these issues. But I’d have to say 2024 was a year of incredible change and flux. Tyler, what are some of the key guideposts out there in the case law over the past 12 months?

Tyler: So, there’s been an explosion of class action lawsuits in recent years, including 2024, involving adtech technologies. And of course, biometric data. I think the biggest driver is the fact that we are operating in an environment where technology is evolving so quickly that it has far outpaced the law, especially when it comes to new tools like Meta Pixel, Google Analytics, and other adtech technologies. While these tools are innovative in many ways that benefit businesses, they’re also collecting massive amounts of sensitive data – data that consumers may have never explicitly agreed to share. The courts are now grappling with outdated statutes, such as old wiretapping and eavesdropping laws, and trying to apply them to modern technologies.

Justin: Absolutely. Businesses that rely on these technologies have often done so without thinking through the ways they can mitigate the risk of noncompliance, or the risk of facing a class action lawsuit in the first place, by modernizing their terms of service and data privacy policies. The rise in class actions is directly related to increased public awareness about data privacy and, of course, the increased aggressiveness of plaintiffs’ attorneys trying to expand the application of the Illinois Biometric Information Privacy Act, for example, with high-profile cases alleging violations involving various AI technologies that perform functions other than facial recognition or any kind of person recognition.

Jerry: Speaking of BIPA – 2024 certainly saw a mixed bag of rulings related to biometric data collection, particularly on the issue of facial analysis technologies. So, how does one make sense, if you’re a corporate decision-maker, of what businesses are facing and the risks that are out there, given these murky waters with the case law developments?

Tyler: That’s a great question. The mixed rulings obviously create an atmosphere of uncertainty, and that’s what I think is driving so much of the litigation. Companies are basically being forced to decide whether to settle or to litigate these cases and risk very high damage awards, because there are often substantial penalties for violations, courts release decisions on issues where there’s no clear-cut answer, and those decisions are often conflicting, such as on the issue you mentioned about whether certain types of data count as biologically unique. It leaves businesses with many gray areas to navigate, and this is only compounded by the reality that these technologies evolve faster than courts can keep up.

Justin: Yes, and from the business side, companies are being forced to take a much more cautious approach when it comes to how they collect and process biometric data. For example, they’re revisiting their privacy policies and terms of service and taking a closer look at the technologies they use, too. Some companies, especially larger ones like Google, Meta, and Oracle, have already settled for significant amounts, which sends a clear signal to others that ignoring these issues is simply too costly.

Jerry: Let’s talk about settlements. So, the plaintiffs’ mantra is file the case, certify the case, then monetize the case. Certainly, in the last 12 months we saw some eye-popping settlements, particularly the $1.4 billion deal between Meta and the State of Texas. What does this tell us about the broader implications of these settlements and what it means for companies operating in this sort of environment?

Justin: Yeah, the size of these settlements is indicative of the stakes involved, for sure. As you mentioned, the Meta settlement alone was huge, and it’s reflective of the kind of high-dollar cases we are now seeing across the board. Privacy class action litigation has outpaced other areas of law in terms of growth. And as companies continue to allegedly violate privacy laws, there’s real financial risk involved. Statutory damages under some of these privacy laws can reach up to $5,000 per violation, which to a plaintiff can mean per website visit across millions of visitors. And with class actions, these violations multiply quickly. This creates significant potential liability for companies.

Tyler: I think that’s exactly right. And it’s not just the monetary cost. These cases also damage a company’s reputation in the world we live in. Consumers are more aware than ever of how their data is used. And if you’re a company in a settlement like that, it’s not just about paying a fine – you’ve also potentially lost consumer trust, and that can have long-term business implications.

Jerry: Well, we’ve certainly seen a rise in filings of privacy-related class actions, but we’re also seeing an increase in the skill and ability of the plaintiffs’ bar to secure certification in these class actions. Do you expect this trend to continue during 2025?

Justin: Well, at least the rise in privacy class actions I expect to continue. I mean, it’s been going like this, and it’s going to keep going. We’ll see about the certification decisions. As more consumers become aware of their rights, and as data privacy laws continue to evolve, I think we’ll continue to see an uptick in class action filings for sure. Privacy law is still in its infancy in many respects, and many of the current legal frameworks just don’t fully cover the realities of all the new technologies, how data is being used today, and how data science is evolving. The ambiguity is creating fertile ground for litigation, and I expect that to keep growing.

Tyler: And from a litigation standpoint – yes, we’ll likely continue to see class actions. However, I do think that courts will eventually have to provide more clarity on some of these unsettled issues. We’ve got one of the first federal appeals brewing soon, for example, regarding whether online advertising technology violates the Federal Wiretap Act. As things currently stand, though, the litigation landscape in this area and many other areas of privacy law remains in flux, and there’s still a lot of uncertainty about certain privacy laws and how they will be applied.

Jerry: Well, I guess the bottom line is we’ve reached a pivot point, certainly a pivotal moment in the intersection of technology and privacy law. Well, thank you, Justin and Tyler, for being here today, and thank you to our loyal listeners for participating in this week’s Class Action Weekly Wire. Please stop by and visit our blog for a free copy that you can download of the Privacy Class Action Review e-book.

Tyler: Thank you for having me, Jerry, and thank you, listeners.

Justin: Thank you so much, everybody.

It’s Here! The Duane Morris Privacy Class Action Review – 2025

By Gerald L. Maatman, Jr., Jennifer A. Riley, Alex W. Karasik, Gregory Tsonis, Justin Donoho, and Tyler Zmick

Duane Morris Takeaways: The last year saw a virtual explosion in privacy class action litigation. As a result, compliance with privacy laws in the myriad of ways that companies interact with employees, customers, and third parties is a corporate imperative. To that end, the class action team at Duane Morris is pleased to present the second edition of the Privacy Class Action Review – 2025. This publication analyzes the key privacy-related rulings and developments in 2024 and the significant legal decisions and trends impacting privacy class action litigation for 2025. We hope that companies and employers will benefit from this resource in their compliance with these evolving laws and standards.

Click here to bookmark or download a copy of the Privacy Class Action Review – 2025 e-book. Look forward to an episode on the Review coming soon on the Class Action Weekly Wire!

Tennessee Federal Court Rejects Certification Of Breach Of Contract Class Action

By Gerald L. Maatman, Jr., Justin R. Donoho, and George Schaller

Duane Morris Takeaways:  On February 10, 2025, Judge Aleta A. Trauger of the U.S. District Court for the Middle District of Tennessee denied class certification in Hall v. Warner Music Group Corp., No. 22-CV-0047 (M.D. Tenn. Feb. 10, 2025), a breach of contract case involving a disputed element of mutual assent, a/k/a a meeting of the minds.  The ruling is significant because it shows that plaintiffs who file class action complaints alleging breach of contract cannot satisfy Rule 23’s commonality requirement where the issue of whether the parties agreed to a material term of the contract requires an individualized inquiry into the parties’ minds and whether they in fact met.

Background

This case involving lack of mutual assent is one of many since the famous case of Raffles v. Wichelhaus, 159 Eng. Rep. 375 (1864), in which the defendant agreed to purchase cotton arriving on a ship named “Peerless” that it expected to arrive while cotton prices were low, whereas the plaintiff seller had in mind a different ship by the same name arriving while cotton prices were high.  (The English court found no binding contract.)

In Hall, the plaintiffs, two musical artists, sued a record label for breach of implied contract.  The parties had entered into a written recording agreement providing for the payment of 8% royalties at a time before the invention of digital streaming, and the agreement did not expressly cover distribution through digital streaming.  Hall, slip op. at 2.  In 2005, when the label started streaming plaintiffs’ music digitally both domestically and internationally, it began to pay the plaintiffs at the higher 50% rate appearing on their royalty statements.  Id. at 3, 14.  For foreign digital streaming, the 50% rate was applied after the deduction of a payment to the foreign distributor.  Id. at 12-13.  It was common in the industry, and a consistent course of dealing of the defendant, to apply royalty rates to digital streaming revenues received only after payment to the foreign distributor.  Id.  The plaintiffs accepted these digital streaming royalty payments for years without viewing the royalty statements or “attempting to identify the revenue base against which a royalty rate for foreign streaming was applied . . . until [one of the plaintiff’s] first discussion with one of his attorneys in this case.”  Id. at 15.
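
To make the royalty-base dispute concrete, the following sketch uses invented dollar figures; only the 50% rate and the foreign-distributor deduction come from the court’s description of the parties’ course of dealing.

```python
# Hypothetical illustration of the royalty-base dispute. Dollar figures
# are invented; the 50% rate and the foreign-distributor deduction are
# taken from the court's description of the course of dealing.

gross_foreign_streaming_revenue = 1.00    # hypothetical $1.00 earned abroad
foreign_distributor_share = 0.30          # hypothetical cut retained abroad
royalty_rate = 0.50

# The label's practice: apply the 50% rate only to the revenue it
# receives after the foreign distributor has been paid.
label_receipts = gross_foreign_streaming_revenue * (1 - foreign_distributor_share)
royalty_as_paid = royalty_rate * label_receipts                      # $0.35

# The competing "at the source" reading of an unqualified 50%: apply
# the rate to the gross foreign streaming revenue.
royalty_at_source = royalty_rate * gross_foreign_streaming_revenue   # $0.50

print(royalty_as_paid, royalty_at_source)
```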

The plaintiffs moved for class certification under Rule 23.  The plaintiffs maintained that they met the commonality requirement because they and other artists with legacy contracts received royalty payments for foreign streaming sales with statements indicating an unqualified 50% royalty.  Id. at 10-11.  In contrast, the record label maintained that a claim for breach of implied contract requires the plaintiffs to prove that a valid and enforceable contract was formed between the label and “each class member, which will require an individualized inquiry into the knowledge, understanding, and intent of the artists, including whether the artist even looked at the royalty statements, whether the artists construed them to offer an implied amendment, what exactly the artist believed those implied terms to be, whether the artist had a good-faith belief about a possible rescission claim, whether the artist would have rescinded unless paid at the source, whether the artist intended to forbear, and when (if ever) these events occurred.”  Id. at 11 (emphasis in original).  In other words, according to the record label, the common question, “was an implied contract formed?” could not be answered by a simple yes or no without such an individualized inquiry.  Id.

The Court’s Decision

The Court agreed with defendants and held that plaintiffs did not carry their burden of showing commonality.

Central to the Court’s holding was the “problematic question of mutual assent.”  Id. at 18.  As the Court explained, “even if the court presumes that other putative class members’ royalty statements look like the plaintiffs’ and that there are common questions regarding the defendants’ conduct that may yield common answers (i.e., that the royalty statements do not expressly reflect that the royalties are calculated based [after paying the foreign distributor]), it is clear that the threshold question of whether an implied contract between [the label] and each putative class member was formed does not yield a common answer but, instead, will depend entirely on the particularized circumstances of each artist whose contract, like the plaintiffs’, does not expressly provide for royalties on foreign digital streaming.”  Id.

In short, the Court reasoned that “the named plaintiffs’ particularized circumstances show that they simply never thought about whether an implied contract had been formed or its terms until approached by lawyers.  Other artists may have paid closer attention to their business arrangements.”  Id.

In conclusion, the Court noted that, “to the extent there are questions of fact or law common to the plaintiffs and all putative class members, the relative importance of these common questions pales in comparison to the importance of those that do not yield a common answer — primarily the question of whether implied contracts were formed at all.”  Id. at 23.

Implications For Companies

The Hall decision is a win for defendants in breach of contract class actions involving the issue of whether the parties had a meeting of the minds on a material term of the contract.  In such cases, the Hall decision can be cited as useful precedent for showing that the commonality requirement is not met because individualized inquiries predominate when it comes to analyzing evidence regarding a meeting of the minds.

The Court’s reasoning in Hall applies not only in cases involving (1) commercial form contracts, as in Hall, but also in cases involving (2) alleged employment contracts, see Cutler v. Wal-Mart Stores, Inc., 927 A.2d 1 (Md. Ct. App. 2007) (affirming denial of motion for class certification, stating, “Any determination concerning a ‘meeting of the minds’ necessarily requires an individual inquiry into what each class member, as well as the [employer’s] employee who allegedly made the offer, said and did”); In re Wal-Mart Wage & Hour Emp. Pracs. Litig., 2008 WL 3179315, at *19 (D. Nev. June 20, 2008) (denying motion for class certification, stating, “Plaintiffs’ breach of contract claims would involve particularized inquiry into contract formation, including such issues as meeting of the minds”); (3) form real estate contracts, see Haines v. Fid. Nat’l Title of Fla., Inc., 2022 WL 1095961, at *17 (M.D. Fla. Feb. 17, 2022) (denying motion for class certification, stating, “If a buyer and seller interpreted [the agreement] the way [seller] interprets the provision, their meeting of the minds would have a significant impact upon any potential liability for [seller]. In that regard, the buyer’s and seller’s state of mind for each transaction are relevant . . . individualized discovery and factfinding regarding each buyer’s and seller’s intent and understanding would be required”); and (4) alleged contracts regarding the use of AI, see Lokken v. UnitedHealth Group, Inc., 2025 WL 491148, at *8 (D. Minn. Feb. 13, 2025) (finding insureds’ claim against health insurer for breach of contract regarding insurer’s use of AI-based automated decision making technologies not preempted by the Medicare Act and therefore allowed to proceed to discovery, raising the question of whether parties’ minds met via the insurer’s explicit descriptions of its “claim decisions as being made by ‘clinical services staff’ and ‘physicians,’ without mention of any artificial intelligence”).

The Class Action Weekly Wire – Episode 88: Key Trends In Data Breach Class Actions

Duane Morris Takeaway: This week’s episode of the Class Action Weekly Wire features Duane Morris partners Jerry Maatman and Jennifer Riley, special counsel Justin Donoho, and associate Ryan Garippo with their discussion of the key trends analyzed in the 2025 edition of the Duane Morris Data Breach Class Action Review, including the contributing factors in the exponential growth of data breach class action filings, the sophistication of the plaintiffs’ bar litigation theories, and the chart-topping settlements in this area.  

Bookmark or download the Data Breach Class Action Review e-book here, which is fully searchable and accessible from any device.

Check out today’s episode and subscribe to our show from your preferred podcast platform: Spotify, Amazon Music, Apple Podcasts, Samsung Podcasts, Podcast Index, Tune In, Listen Notes, iHeartRadio, Deezer, and YouTube.

Episode Transcript

Jerry Maatman: Welcome all our loyal listeners and blog readers. Thank you for being here on our weekly podcast, the Class Action Weekly Wire. I’m Jerry Maatman of Duane Morris, and joining me today are my colleagues, Jen, Justin, and Ryan. Thanks so much for being on this particular podcast.

Jennifer Riley: Thank you, Jerry. Happy to be part of the podcast today.

Justin Donoho: Thanks, Jerry. Glad to be here.

Ryan Garippo: Thanks for having me, Jerry.

Jerry: Today in the podcast we’re discussing the publication of this year’s Duane Morris Data Breach Class Action Review and desk reference, designed for our clients to give them the latest, greatest information on the cutting-edge issues in the world of data breach class action litigation. Listeners can find the e-book publication on our blog, the Duane Morris Class Action Defense Blog. Jen, can you share with our listeners a bit about this desk reference and publication?

Jennifer: Absolutely, Jerry. The volume of data breach class actions exploded in 2024. Data breach has emerged as one of the fastest growing areas of class action litigation. The Review contains an overview of these filing numbers, the settlements, and some of the key decisions in this area. In sum, courts continue to reach inconsistent outcomes on issues such as standing and uninjured class members, issues that are uniquely challenging in the data breach space. The Review has dozens of contributors, and it really reflects the collective experience and expertise of our class action defense group.

Jerry: I think it used to be, people thought whenever there was a drop in the stock following a company announcement, as sure as the sun rises in the east and sets in the west every day, there’d be a securities fraud class action lawsuit being filed. That seems to be the case now, when there’s a data breach incident, a data breach class action follows in its wake. Justin, can you shed some light on why this particular cause of action in this particular space has been growing incrementally over the last 36 months?

Justin: Absolutely. I mean, the frequency of data breaches has been increasing, which is a huge part of it, and of course, with that comes heightened attention from both consumers and the plaintiffs’ bar. High-profile cases, such as the multidistrict litigation arising from the Marriott International breach that affected over 133 million people, and the MOVEIt MDL, which is another big one that got going last year, have put companies on notice that failure to secure personal data can lead to costly litigation. These lawsuits are not just about the breach itself – they’re also about the aftermath. Consumers are now more aware of the risks and more inclined to seek legal recourse when their data is compromised.

Jerry: I think this is a great area where the notion that the law is trailing behind technology and can’t keep up with it – may well explain some of the developments in this particular space from a cybersecurity perspective. How do you think the increasing frequency of these sorts of events, and the sophistication of cyber criminals, is playing out in the class action space?

Ryan: Well, the rise in cyberattacks is definitely a huge factor. We’re seeing more sophisticated tactics from cybercriminals. Ransomware is one prime example – hackers demand payments in exchange for not publishing or further exploiting stolen data. The issue is that paying the ransom doesn’t necessarily guarantee the safe return or the deletion of the data, which makes these incidents devastating for companies. Additionally, as there’s been a shift to remote work and cloud-based infrastructure, more vulnerabilities are exposed, which ultimately increases the frequency of breaches. As a result, we’re seeing more lawsuits following these incidents, and plaintiffs’ attorneys are more eager to capitalize on the growing number of affected individuals.

Jerry: In the last two weeks, the U.S. Supreme Court has accepted a case for review on the issue of uninjured class members, and whether or not their presence is something that can be used by a defendant to stop class certification. And one of the things we’ve seen in the last few years in the data breach area is the lack of injury or no injury-in-fact, as the Supreme Court has articulated that in TransUnion v. Ramirez. Jen, what do you see in terms of what plaintiffs are doing to try and come up with theories, at least from a financial damage or injury standpoint, that companies are now facing in what I would call data breach litigation 2.0?

Jennifer: Well, Jerry, I think several factors are contributing to the rising popularity of these lawsuits. First, the sheer volume of people affected by these breaches has ballooned, especially with breaches impacting millions of consumers or employees. As the size of these cases increases, it naturally leads to higher settlement amounts, which in turn are attracting more plaintiffs’ lawyers to this area. Additionally, the type of data being compromised is becoming more sensitive – financial and healthcare information, for example – which is leading to additional claims and higher potential damages, and is leading plaintiffs’ attorneys to become more creative in looking for ways to monetize and capitalize on these breaches by converting them into settlement dollars.

Justin: Yes, absolutely. And some courts are also becoming more sympathetic to plaintiffs in these cases, and to the potential long-term consequences of data breaches for plaintiffs, even where immediate harm is not apparent. So, it’ll be interesting to see how that Supreme Court case plays out. And let’s not forget about the legal fees and the expert fees also contributing to some of these large settlement dollars. As these cases become more complex, with issues like class certification, determining damages, and the reasonableness of the cybersecurity, the costs involved in litigating these lawsuits are skyrocketing.

Jerry: You mentioned class certification – certainly the plaintiffs’ bar’s theory is file the case, certify the case, then monetize the case, and the statistical study within the desk reference talks about the rise in the class certification rate to 40%. Still a low number, but significantly up from 16% in calendar year 2023. To what do you attribute the trend that’s showing an upward number and more of a chance for the plaintiffs’ bar to certify their data breach class actions?

Ryan: Well, like we mentioned before, I think it’s reflective of the fact that plaintiffs’ counsel has gotten more sophisticated in this space, and courts are getting more sympathetic to the plaintiffs at issue. But that said, class certification is still a major hurdle in any class action, and it’s particularly challenging in data breach cases. The increased success rate for class certification in the data breach space, 40% in 2024, reflects that evolving legal precedent. Courts are now more inclined to accept the argument that consumers have suffered harm even if their data hasn’t been directly misused, and that the mere recognition of an indirect harm, such as the increased risk of identity theft or emotional distress, is enough to allow plaintiffs to get into court and overcome this obstacle.

Jerry: Jen, what were some of the major data breach litigation markers in the federal courts this year, by your way of thinking?

Jennifer: Well, Jerry, great question. We discuss in the Review some of the largest ones. Certainly, one of the prime examples is the ongoing MOVEIt Customer Data Breach Litigation. That litigation began back in 2023, continued throughout 2024, and is ongoing. In that one, the Judicial Panel on Multidistrict Litigation consolidated more than 200 class action lawsuits. Those lawsuits resulted from a Russian cybergang hacking the file transfer software MOVEIt. The Judicial Panel on Multidistrict Litigation transferred those proceedings, after consolidating them, to the U.S. District Court for the District of Massachusetts. The plaintiffs in that case, as I mentioned, alleged that a vulnerability in MOVEIt, file transfer software made by a Massachusetts-based company, was exploited. That data breach is considered to be the largest hack of 2023. According to the Panel’s initial transfer order, it exposed personally identifiable information of more than 55 million people. So, as I mentioned, that proceeding is ongoing. In July 2024, the transferee court issued an order adopting a modified bellwether structure in which it ordered the plaintiffs to file up to six consolidated amended complaints, and it ordered the parties to meet and confer on the defendants to be named in each of those. The plaintiffs are going to file their motions for class certification, according to the schedule at least, in the summer of 2025. So, lots to be done in those cases yet.

Jerry: Well, it seems to me that data breach litigation, especially in the class action arena, is a problem or a fear that keeps corporate counsel up at night, and some of the top settlements in this space in 2024 may fuel that fear. What were some of the key and highest class action settlements in the data breach space, despite the fact that certification hovered around 40%?

The largest data breach class action settlement in 2024 was $350 million in In Re Alphabet Inc. Securities Litigation, Case No. 18-CV-6245 (N.D. Cal. Sept. 30, 2024), in which the court granted final settlement approval in a class action alleging that a software glitch led to a data breach in which Google+ users’ personal data was exposed for three years.

Justin: Yes, Jerry. Plaintiffs did very well in securing high-dollar settlements last year, with the top 10 settlements totaling $593.2 million. This was a significant increase over 2023, when the top 10 totaled $515 million – so they keep going up, too.

Jerry: Well, my prognostication is the 2025 numbers are going to go up and even exceed those chart-toppers in the next 12 months. In terms of final parting thoughts for our loyal listeners, what are some of the takeaways and key points that our listeners and readers should keep in mind for data breach issues in 2025?

Ryan: Invest in strong cybersecurity measures – it’s essential to stay ahead of the game in this space and constantly evolve your cybersecurity infrastructure against these emerging threats. But beyond that, companies should also have a well-designed incident response plan in place and make sure that it’s regularly tested. This helps ensure not only quicker recovery, but also a stronger defense in court if a breach ever occurs. This legal landscape is evolving, and data breaches are no longer niche; they’re becoming an expected part of the litigation landscape. And so, having a proactive and comprehensive approach can help mitigate the immediate and long-term costs, and help keep you out of those $500 million numbers that Jerry and Justin mentioned before.

Jerry: Well, thanks, Jen, Justin, and Ryan, for your thought leadership and your analysis of this particular area. Loyal listeners, please stop by our blog and website to download for free our e-book, Data Breach Class Action Review – 2025. Thanks so much everyone for lending your expertise today on our Class Action Weekly Wire podcast.

Ryan: Thanks, Jerry.

Justin: Thanks for having me and thank you, listeners.

Jennifer: Thanks so much, everyone. See you next week.

Illinois Supreme Court Affirms Dismissal Of Data Breach Class Action For Lack Of Standing

By Gerald L. Maatman, Jr., Justin Donoho, and George J. Schaller

Duane Morris Takeaways: On January 24, 2025, in Petta v. Christie Bus. Holdings Co., P.C., 2025 IL 130337, the Illinois Supreme Court ruled that a plaintiff lacked standing under Illinois law to bring her class action complaint alleging that her social security number and insurance information may have been accessed in connection with a data incident where a medical provider discovered unauthorized access to one of its business email accounts.  The ruling is significant because it shows that data breach claims cannot be brought in Illinois court without specifying actual injury that is fairly traceable to the breach.

Case Background

This case is one of the thousands of data breach class actions filed in the last three years.  In Petta, Plaintiff brought suit against a medical provider.  According to Plaintiff, she received a letter from the provider titled “Notice of Data Incident” explaining that an unknown third party gained unauthorized access to one of its business email accounts for about a month, in an attempt to intercept a business transaction between the provider and a third-party vendor.  Id. ¶¶ 1, 6.  The letter also stated that “the impacted account MAY have contained certain information related” to Plaintiff’s social security number and medical insurance information but “[t]he unauthorized actor did not have access to [the provider’s] electronic medical record” and there was no “evidence of identity theft or misuse of [Plaintiff’s] personal information.”  Id. ¶ 6 (emphasis in letter).  The letter concluded by offering Plaintiff 12 months of credit monitoring and identity protection services at no cost if she wished to enroll.  Id. ¶ 7.

Plaintiff also alleged her “phone number, city, and state [were] used in connection with a loan application … in someone else’s name” and she received multiple calls regarding “loan applications she did not initiate.”  Id., ¶ 9.   

Based on these allegations, Plaintiff alleged claims for negligence and violation of Illinois’ Personal Information Protection Act. 

The trial court dismissed the complaint for lack of a viable legal theory and as barred by the economic loss doctrine.  The Illinois Appellate Court affirmed, but on the basis that the Plaintiff lacked standing to bring the action on behalf of herself and the putative class.

Plaintiff thereafter appealed to the Illinois Supreme Court. 

The Illinois Supreme Court’s Opinion

The Illinois Supreme Court ruled that Plaintiff lacked standing and affirmed the dismissal of her complaint on that basis.  Id., ¶ 25.

In Illinois, standing requires an injury in-fact. As a result, the Illinois Supreme Court reasoned that a plaintiff alleging only “a ‘purely speculative’ future injury” and “no ‘immediate danger of sustaining a direct injury’ lacks sufficient interest to have standing.”  Id. ¶ 18 (quoting Chi. Teachers Union, Local 1 v. Bd. of Ed. of Chi., 189 Ill. 2d 200, 206-07 (2000)). 

The Illinois Supreme Court affirmed Plaintiff’s lack of standing, reasoning that she, and the putative class, faced “only an increased risk that their private personal data was accessed by an unauthorized third party” and that “an increased risk of harm is insufficient to confer standing” in a complaint seeking money damages.  Id., ¶ 21.  The Illinois Supreme Court opined that nothing “in the letter suggest[ed] that it is likely the third party did, in fact, take the [private personal] data” and that the provider’s investigation revealed that the unauthorized third party was “attempting to intercept a financial transaction, not steal patients’ private personal information.”  Id., ¶ 20.

The Illinois Supreme Court also noted that Plaintiff’s unauthorized loan application related solely to Plaintiff and her complaint did not present any allegations that putative class members had a similar experience regarding a loan application.  Id., ¶ 23.  However, the Illinois Supreme Court declined to answer the question of whether standing must be shown at the outset for the entire putative class and instead focused “solely on [Plaintiff] individually,” finding that “Plaintiff’s allegation regarding the loan application is insufficient to confer standing.”  Id. 

In short, the Illinois Supreme Court concluded that the unsuccessful loan application allegations were not “fairly traceable” to any of the provider’s alleged misconduct and instead were “purely speculative” given there was “no apparent connection between the purported fraudulent loan attempt and the data breach at issue,” as the phone number and city information used in the loan application was “readily available” to the public.  Id., ¶ 25 (citing 2023 IL App (5th) 220742, ¶ 23).  Therefore, Plaintiff lacked standing to bring her claims.

Implications For Companies

The Illinois Supreme Court’s decision in Petta is a win for companies that suffered a data breach only possibly affecting customers, informed the customers of the breach, and offered to pay for their credit monitoring.  Petta shows that to confer standing under Illinois law, more is required.  Specifically, data breach plaintiffs need to identify actual injury fairly traceable to the breach.

Ninth Circuit Dismisses Adtech Class Action For Lack Of Standing

By Gerald L. Maatman, Jr. and Justin Donoho

Duane Morris Takeaways:  On December 17, 2024, in Daghaly, et al. v. Bloomingdales.com, LLC, No. 23-4122, 2024 WL 5134350 (9th Cir. Dec. 17, 2024), the Ninth Circuit ruled that a plaintiff lacked Article III standing to bring her class action complaint alleging that an online retailer’s use of website advertising technology disclosed website visitors’ browsing activities in violation of the California Invasion of Privacy Act and other statutes.  The ruling is significant because it shows that adtech claims cannot be brought in federal court without specifying the plaintiffs’ web browsing activities allegedly disclosed. 

Background

This case is one of the hundreds of class actions that plaintiffs have filed nationwide alleging that Meta Pixel, Google Analytics, and other similar software embedded in defendants’ websites secretly captured plaintiffs’ web browsing data and sent it to Meta, Google, and other online advertising agencies.  This software, often called website advertising technology or “adtech,” is a common feature on many websites in operation today.

In Daghaly, Plaintiff brought suit against an online retailer.  According to Plaintiff, the retailer installed the Meta Pixel and other adtech on its public-facing website and thereby transmitted web-browsing information entered by visitors such as which products the visitor clicked on and whether the visitor added the product to his or her shopping cart or wish list.  Id., No. 23-CV-129, ECF No. 1 ¶¶ 44-45.  As for Plaintiff herself, she did not allege what she clicked on or what her web browsing activities entailed upon visiting the website, only that she accessed the website via the web browser on her phone and computer.  Id. ¶ 40.

Based on these allegations, Plaintiff alleged claims for violation of the California Invasion of Privacy Act (CIPA) and other statutes.  The district court dismissed the complaint for lack of personal jurisdiction.  Id., 697 F. Supp. 3d 996 (S.D. Cal. 2023).  Plaintiff appealed and, in its appellate response brief, the retailer argued for the first time that Plaintiff lacked Article III standing.

The Ninth Circuit’s Opinion

The Ninth Circuit agreed with the retailer, found that Plaintiff lacked standing, and remanded for further proceedings.

To allege Article III standing, as is required to bring suit in federal court, the Ninth Circuit opined that a plaintiff must “clearly allege facts demonstrating” that she “suffered an injury in fact that is concrete, particularized, and actual or imminent.”  Id., 2024 WL 5134350, at *2 (citing, e.g., TransUnion LLC v. Ramirez, 594 U.S. 413, 423 (2021)). 

Plaintiff argued that she sufficiently alleged standing via her allegations that she “visited” and “accessed” the website and was “subjected to the interception of her Website Communications.”  Id. at *1.  Moreover, Plaintiff argued, the retailer’s alleged disclosure to adtech companies of the fact of her visiting the retailer’s website sufficiently alleged an invasion of her privacy and thereby invoked Article III standing because the adtech companies could use this fact to stitch together a broader, composite picture of Plaintiffs’ online activities.  See oral argument, here.

The Ninth Circuit rejected these arguments.  It found that Plaintiff “does not allege that she herself actually made any communications that could have been intercepted once she had accessed the website. She does not assert, for example, that she made a purchase, entered text, or took any actions other than simply opening the webpage and then closing it.”  Id., 2024 WL 5134350, at *1.  As the Ninth Circuit explained during oral argument by way of example, it is not as though Plaintiff had alleged that she was shopping for underwear and that the retailer transmitted information about her underwear purchases.  Moreover, the Ninth Circuit found “no authority suggesting that the fact that she visited [the retailer’s website] (as opposed to information she might have entered while using the website) constitutes ‘contents’ of a communication within the meaning of CIPA Section 631.”  Id.

In short, the Ninth Circuit concluded that Plaintiff lacked Article III standing, and that this conclusion followed from Plaintiff’s failure to sufficiently allege the nature of her web browsing activities giving rise to all of her statutory claims.  Id. at *2.  The Ninth Circuit remanded with instructions that the district court grant leave to amend if properly requested.

Implications For Companies

The holding of Daghaly is a win for adtech class action defendants and should be instructive for courts around the country.  Other courts already have found that an adtech plaintiff’s failure to identify what allegedly private information was disclosed via the adtech warrants dismissal under Rule 12(b)(6) for failure to plausibly plead various statutory and common-law claims.  See, e.g., our blog post about such a decision here.  Daghaly shows that adtech plaintiffs also need to identify what allegedly private information, beyond the fact of a visit to an online retailer’s website, was disclosed via the adtech in order to have Article III standing to bring their federal lawsuit in the first place.

The FTC Issues Three New Orders Showing Its Increased 2024 Enforcement Activities Regarding AI And Adtech

By Gerald L. Maatman, Jr. and Justin R. Donoho

Duane Morris Takeaways: On December 3, 2024, the Federal Trade Commission (FTC) issued an order in In Re Intellivision Technologies Corp. (FTC Dec. 3, 2024) prohibiting an AI software developer from making misrepresentations that its AI-powered facial recognition software was free from gender and racial bias, and two orders in In Re Mobilewalla, Inc. (FTC Dec. 3, 2024) and In Re Gravy Analytics, Inc. (FTC Dec. 3, 2024) requiring data brokers to improve their advertising technology (adtech) privacy and security practices.  These three orders are significant in that they highlight that, in 2024, the FTC significantly increased its enforcement activities in the areas of AI and adtech.

Background

In 2024, the FTC brought and litigated at least 10 enforcement actions involving alleged deception about AI, alleged AI-powered fraud, and allegedly biased AI.  See the FTC’s AI case webpage located here.  This is a fivefold increase from the at least two AI-related actions brought by the FTC last year.  See id.  Just as private class actions involving AI are on the rise, so are the FTC’s AI-related enforcement actions.

This year the FTC also brought and litigated at least 21 enforcement actions categorized by the FTC as involving privacy and security.  See the FTC’s privacy and security webpage located here.  This is about twice the case activity by the FTC in privacy and data security cases compared with 2023.  See id.  Most of these new cases involve alleged unfair use of adtech, an area of recently increased litigation activity in private class actions, as well.

In short, this year the FTC officially achieved its “paradigm shift” of focusing enforcement activities on modern technologies and data privacy, as forecasted in 2022 by the FTC’s Director, Bureau of Consumer Protection, Samuel Levine, here.

All these complaints were brought by the FTC under the FTC Act, under which there is no private right of action.

The FTC’s December 3, 2024 Orders

In Intellivision, the FTC brought an enforcement action against a developer of AI-based facial recognition software embedded in home security products to enable consumers to gain access to their home security systems.  According to the complaint, the developer described its facial recognition software publicly as being entirely free of any gender or racial bias as shown by rigorous testing when, in fact, testing by the U.S. Department of Commerce’s National Institute of Standards and Technology (NIST) showed that the software was not among the top 100 best performing algorithms tested by NIST in terms of error rates across different demographics, including region of birth and sex.  (Compl. ¶ 11.)  Moreover, according to the FTC, the developer did not possess any of its own testing to support its claims of lack of bias.  Based on these allegations, the FTC brought misrepresentation claims under the FTC Act.  The parties agreed to a consent order, in which the developer agreed to refrain from making any representations about the accuracy, efficacy, or lack of bias of its facial recognition technology, unless it could first substantiate such claims with reliable testing and documentation as set forth in the consent order.  The consent order also requires the developer to communicate the order to any of its managers and affiliated companies in the next 20 years, to make timely compliance reports and notices, and to create and maintain various detailed records, including regarding the company’s accounting, personnel, consumer complaints, compliance, marketing, and testing.

In Mobilewalla and Gravy Analytics, the FTC brought enforcement actions against data brokers who allegedly obtained consumer location data from other data suppliers and mobile applications and sold access to this data for purposes of online advertising without consumers’ consent.  According to the FTC’s complaints, the data brokers engaged in unfair collection, sale, use, and retention of sensitive location information, all in alleged violation of the FTC Act.  The parties agreed to consent orders, in which the data brokers agreed to refrain from collecting, selling, using, and retaining sensitive location information; to establish a Sensitive Location Data Program, Supplier Assessment Program, and a comprehensive privacy program, as detailed in the orders; provide consumers clear and conspicuous notice; provide consumers a means to request data deletion; delete location data as set forth in the order; and perform compliance, recordkeeping, and other activities, as set forth in the order.

Implications For Companies

The FTC’s increased enforcement activities in the areas of adtech and AI serve as a cautionary tale for companies using adtech and AI. 

As the FTC’s recent rulings and its 2024 dockets show, the FTC is increasingly using the FTC Act as a sword against alleged unfair use of adtech and AI.  Moreover, although the December 3 orders do not expressly impose any monetary penalties, the injunctive relief they impose may be costly and, in other FTC consent orders, harsher penalties have included express penalties of millions of dollars and, further, algorithmic disgorgement.  As adtech and AI continue to proliferate, organizations should consider in light of the FTC’s increased enforcement activities in these areas—and in light of the plaintiffs’ class action bar’s and EEOC’s increased activities in these areas, as well, as we blogged about here, here, here, here, and here—whether to modify their website terms of use, data privacy policies, and all other notices to the organizations’ website visitors and customers to describe the organization’s use of AI and adtech in additional detail.  Doing so could deter or help defend a future enforcement action or class action similar to the many that are being filed today, alleging omission of such additional details, and seeking a wide range of injunctive and monetary relief.

Illinois Federal Court Dismisses Class Action Privacy Claims Involving Use Of Samsung’s “Gallery” App

By Tyler Zmick, Justin Donoho, and Gerald L. Maatman, Jr.

Duane Morris Takeaways:  In G.T., et al. v. Samsung Electronics America, Inc., et al., No. 21-CV-4976, 2024 WL 3520026 (N.D. Ill. July 24, 2024), Judge Lindsay C. Jenkins of the U.S. District Court for the Northern District of Illinois dismissed claims brought under the Illinois Biometric Information Privacy Act (“BIPA”).  In doing so, Judge Jenkins acknowledged limitations on the types of conduct (and types of data) that can subject a company to liability under the statute.  The decision is welcome news for businesses that design, sell, or license technology yet do not control or store any “biometric” data that may be generated when customers use the technology.  The case also reflects the common sense notion that a data point does not qualify as a “biometric identifier” under the BIPA if it cannot be used to identify a specific person.  G.T. v. Samsung is required reading for corporate counsel facing privacy class action litigation.

Background

Plaintiffs — a group of Illinois residents who used Samsung smartphones and tablets — alleged that their respective devices came pre-installed with a “Gallery application” (the “App”) that can be used to organize users’ photos.  According to Plaintiffs, whenever an image is created on a Samsung device, the App automatically: (1) scans the image to search for faces using Samsung’s “proprietary facial recognition technology”; and (2) if it detects a face, the App analyzes the face’s “unique facial geometry” to create a “face template” (i.e., “a unique digital representation of the face”).  Id. at *2.  The App then organizes photos based on images with similar face templates, resulting in “pictures with a certain individual’s face [being] ‘stacked’ together on the App.”  Id.
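
As a purely conceptual illustration of the grouping behavior Plaintiffs describe (detect a face, reduce it to a numeric “template,” then stack photos whose templates are similar), a minimal sketch might look like the following.  The similarity function, threshold, and grouping logic are all assumptions for illustration and are not Samsung’s actual technology.

```python
# Minimal, hypothetical sketch of the photo "stacking" behavior alleged
# in the Complaint: each detected face is reduced to a numeric template,
# and photos with sufficiently similar templates are grouped together.
# The similarity measure and threshold are assumptions; this is not
# Samsung's code or its actual algorithm.
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0

def stack_photos(face_templates: dict[str, list[float]], threshold: float = 0.9) -> list[list[str]]:
    """Group photo IDs whose face templates exceed a similarity threshold."""
    groups: list[tuple[list[float], list[str]]] = []
    for photo_id, template in face_templates.items():
        for representative, members in groups:
            if cosine_similarity(template, representative) >= threshold:
                members.append(photo_id)
                break
        else:
            groups.append((template, [photo_id]))
    return [members for _, members in groups]
```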

Based on their use of the devices, Plaintiffs alleged that Samsung violated §§ 15(a) and 15(b) of the BIPA by: (1) failing to develop a written policy made available to the public establishing a retention policy and guidelines for destroying biometric data, and (2) collecting Plaintiffs’ biometric data without providing them with the requisite notice and obtaining their written consent.

Samsung moved to dismiss on two grounds, arguing that: (1) Plaintiffs did not allege that Samsung “possessed” or “collected” their biometric data because they did not claim the data ever left their devices; and (2) Plaintiffs failed to allege that data generated by the App qualifies as “biometric identifiers” or “biometric information” under the BIPA, because Samsung cannot use the data to identify Plaintiffs or others appearing in uploaded photos.

The Court’s Decision

The Court granted Samsung’s motion to dismiss on both grounds.

“Possession” And “Collection” Of Biometric Data

Regarding Samsung’s first argument, the Court began by explaining what it means for an entity to be “in possession of” biometric data under § 15(a) and to “collect” biometric data under § 15(b).  The Court observed that “possession” occurs when an entity exercises control over data or holds it at its disposal.  Regarding “collection,” the Court noted that the term “collect,” and the other verbs used in § 15(b) (“capture, purchase, receive through trade, or otherwise obtain”), all refer to an entity taking an “active step” to gain control of biometric data.

The Court proceeded to consider Plaintiffs’ contention that Samsung was “in possession of” their biometrics because Samsung controls the proprietary software used to operate the App.  The Court sided with Samsung, however, concluding that Plaintiffs failed to allege “possession” (and thus failed to state a § 15(a) claim) because they did not allege that Samsung can access the data (as opposed to the technology Samsung employs).  Id. at *9 (“Samsung controls the App and its technology, but it does not follow that this control gives Samsung dominion over the Biometrics generated from the App, and plaintiffs have not alleged Samsung receives (or can receive) such data.”).

As for § 15(b), the Court rejected Plaintiffs’ argument that Samsung took an “active step” to “collect” their biometrics by designing the App to “automatically harvest[] biometric data from every photo stored on the Device.”  Id. at *11.  The Court determined that Plaintiffs’ argument failed for the same reason their § 15(a) “possession” argument failed.  Id. at *11-12 (“Plaintiffs’ argument again conflates technology with Biometrics. . . . Plaintiffs do not argue that Samsung possesses the Data or took any active steps to collect it.  Rather, the active step according to Plaintiffs is the creation of the technology.”).

“Biometric Identifiers” And “Biometric Information”

The Court next turned to Samsung’s second argument for dismissal – namely, that Plaintiffs failed to allege that data generated by the App is “biometric” under the BIPA because Samsung could not use it to identify Plaintiffs (or others appearing in uploaded photos).

In opposing this argument, Plaintiffs asserted that: (1) the “App scans facial geometry, which is an explicitly enumerated biometric identifier”; and (2) the “mathematical representations of face templates” stored through the App constitute “biometric information” (i.e., information “based on” scans of Plaintiffs’ “facial geometry”).  Id. at *13.

The Court ruled that “Samsung has the better argument,” holding that Plaintiffs’ claims failed because Plaintiffs did not allege that Samsung can use data generated through the App to identify specific people.  Id. at *15.  The Court acknowledged that cases are split “on whether a plaintiff must allege a biometric identifier can identify a particular individual, or if it is sufficient to allege the defendant merely scanned, for example, the plaintiff’s face or retina.”  Id. at *13.  After employing relevant principles of statutory interpretation, the Court sided with the cases in the former category and opined that “the plain meaning of ‘identifier,’ combined with the BIPA’s purpose, demonstrates that only those scans that can identify an individual qualify.”  Id. at *15.

Turning to the facts alleged in the Complaint, the Court concluded that Plaintiffs failed to state claims under the BIPA because the data generated by the App does not amount to “biometric identifiers” or “biometric information” simply because the data can be used to identify and group the unique faces of unnamed people.  In other words, biometric information must be capable of recognizing an individual’s identity – “not simply an individual’s feature.”  Id. at *17; see also id. at *18 (noting that Plaintiffs claimed only that the App groups unidentified faces together, and that it is the device user who can add names or other identifying information to the faces).

Implications Of The Decision

G.T. v. Samsung is one of several recent decisions grappling with key questions surrounding the BIPA, including questions as to: (1) when an entity engages in conduct that rises to the level of “possession” or “collection” of biometrics; and (2) what data points qualify (and do not qualify) as “biometric identifiers” and “biometric information” such that they are subject to regulation under the statute.

Regarding the first question, the Samsung case reflects the developing majority position among courts – i.e., a company is not “in possession of,” and has not “collected,” data that it does not actually receive or access, even if it created and controlled the technology that generated the allegedly biometric data.

As for the second question, the Court’s decision in Samsung complements the Ninth Circuit’s recent decision in Zellmer v. Meta Platforms, Inc., where it held that a “biometric identifier” must be capable of identifying a specific person.  See Zellmer v. Meta Platforms, Inc., 104 F.4th 1117, 1124 (9th Cir. 2024) (“Reading the statute as a whole, it makes sense to impose a similar requirement on ‘biometric identifier,’ particularly because the ability to identify did not need to be spelled out in that term — it was readily apparent from the use of ‘identifier.’”).  Courts have not uniformly endorsed this reading, however, and parties will likely continue litigating the issue unless and until the Illinois Supreme Court provides the final word on what counts as a “biometric identifier” and “biometric information.”
