SB 2979 – Illinois Biometric Privacy Act Legislation Passes The Illinois Senate

By Gerald L. Maatman, Jr., Alex W. Karasik, and George J. Schaller

Duane Morris Takeaways: On April 11, 2024, the Illinois Senate passed Senate Bill 2979 (the “Bill”) by a vote of 46 to 13. The Bill introduces legislation that would amend the Biometric Information Privacy Act (“BIPA”) to limit claim accrual to a single violation of the BIPA, in stark contrast to the statute’s current “per-collection” basis. The Bill’s proposed revisions are accessible here and the status of the Bill can be tracked here. For any companies involved in privacy class action litigation, the proposed legislation is exceedingly important.

Background On The BIPA

The BIPA currently provides for “a violation for every scan,” based on the Illinois Supreme Court’s decision in Cothron v. White Castle Sys., 2023 IL 128004 (Feb. 17, 2023).  In Cothron, the Illinois Supreme Court held that “the plain language of §§ 15(b) and 15(d) shows that a claim accrues under the Act with every scan or transmission of biometric identifiers or biometric information without prior informed consent.” Id. at ¶ 45.

The majority of the Illinois Supreme Court opined that any policy-based concerns “about potentially excessive damage awards under the Act are best addressed by the legislature.” Id. at ¶ 43.

On January 31, 2024, Senator Bill Cunningham introduced SB 2979 to the Illinois Senate.

The Proposed Revisions To The BIPA Under SB 2979

The Bill’s proposed revisions articulate two key amendments regarding: (1) the “every scan” violation under §§ 15(b) and 15(d); and (2) an additional definition for “electronic signature” that augments the BIPA’s current “Written release” definition.

For violations under §§ 15(b) and 15(d), the Bill endeavors to limit alleged violations of the BIPA to a “single violation” for these respective sections.

The Bill narrows an aggrieved person’s entitled recovery to “at most, one recovery under this section,” provided that the biometric identifier or biometric information was obtained from the same person using the same method of collection.  See SB 2979, 740 ILCS 14/20(b).  Similar single-violation language is proposed under sub-section (d) of § 15, the BIPA’s dissemination provision.  See SB 2979, 740 ILCS 14/20(c).

Also included in the Bill is a new definition for “electronic signature” as “an electronic sound, symbol, or process attached to or logically associated with a record and executed or adopted by a person with the intent to sign the record.” See SB 2979, 740 ILCS 14/10.  This definition is then incorporated into the BIPA’s existing definition of “Written release.”  See id.

As of April 25, 2024, the Bill has advanced to the Illinois General Assembly’s House of Representatives and has been assigned to the Judiciary – Civil Committee.

Implications For Employers

Employers should monitor SB 2979 closely as it progresses through the Illinois House of Representatives.  The currently unfettered potential damages from BIPA claims may be limited to a single violation if the Bill passes.  This would be a major and much-needed legislative coup for businesses with operations in Illinois that utilize biometric technology.

Pennsylvania Federal Court Dismisses Data Privacy Class Action Based On Lack Of Standing

By Gerald L. Maatman, Jr., Jesse S. Stavis, and Ryan T. Garippo

Duane Morris Takeaways: On April 5, 2024, Judge Marilyn J. Horan of the U.S. District Court for the Western District of Pennsylvania granted defendant Spirit Airlines’ motion to dismiss in Smidga, et al. v. Spirit Airlines, Inc., No. 2:22-CV-1578 (W.D. Pa. Apr. 5, 2024). Plaintiffs alleged that Spirit had invaded their privacy and violated state wiretapping laws by recording data regarding visits to Spirit’s website, but the Court held that they failed to plead a concrete injury sufficient to establish Article III standing. The ruling should serve as a reminder of the importance of considering challenges to standing, particularly in data privacy class actions where alleged injuries are often abstract and speculative.

Case Background

Like many companies, Spirit Airlines uses session replay code to track users’ activity on its website in order to optimize user experience. Session replay code allows a website operator to track mouse movements, clicks, text entries, and other data concerning a visitor’s activity on a website. According to Spirit, all data that is collected is thoroughly anonymized.

The plaintiffs in this putative class action alleged that Spirit violated numerous state wiretapping and invasion of privacy laws by recording their identities, travel plans, and contact information. One of the plaintiffs also alleged that she had entered credit card information into the website. All three plaintiffs claimed that the invasion of privacy had caused them mental anguish and suffering as well as lost economic value in their information.

Spirit moved to dismiss based on a lack of standing under Rule 12(b)(1) and failure to state a claim under Rule 12(b)(6).

The Court’s Ruling

The Court dismissed all claims without prejudice. It held that the plaintiffs had failed to establish standing. Under Article III of the U.S. Constitution, a plaintiff must establish that he or she has standing to sue in order to proceed with a lawsuit. The standing analysis asks whether: “(1) the plaintiff suffered an injury in fact, (2) that is fairly traceable to the challenged conduct of the defendant, and (3) that is likely to be redressed by a favorable judicial decision.” Spokeo, Inc. v. Robins, 136 S. Ct. 1540, 1547 (2016).

Spirit argued that the plaintiffs had failed to identify an injury in fact because they did not suffer any concrete injury from the recording of session data. The Court accepted this argument, noting that absent a concrete injury, a violation of a statute alone is insufficient to establish standing: “Congress [or a state legislature] may not simply enact an injury into existence, using its lawmaking power to transform something that is not remotely harmful into something that is.” Smidga, et al. v. Spirit Airlines, Inc., No. 2:22-CV-1578, 2024 WL 1485853, at *3 (W.D. Pa. Apr. 5, 2024) (internal citations and quotation marks omitted).

Judge Horan cited over fifteen recent cases where federal courts denied standing in similar circumstances to demonstrate that the mere recording of anonymized data does not satisfy the constitutional standing requirement. Further, the Court reasoned that a website’s “collection of basic contact information” is also insufficient. Id. at *4. However, the Court did note that recording credit card data without a user’s authorization might be sufficient to establish standing. Id. at *5. In Smidga, one plaintiff alleged that she had entered her credit card information, but Spirit insisted that no personally identifying information had been stored. Because plaintiffs bear the burden to prove standing, the Court found that the mere assertion that a plaintiff entered her credit card information into a website was — absent allegations that her personalized data was tied to that information — insufficient to confer Article III standing.

Having dismissed the case for lack of standing, the Court did not analyze Spirit’s arguments under Rule 12(b)(6) for failure to state a claim. The court did, however, grant the plaintiffs leave to amend their complaint.

Implications For Companies

The success or failure of a class action often comes down to whether the putative class can achieve certification under Rule 23. Nonetheless, Rule 23 challenges are not the only weapon in a defendant’s arsenal. Indeed, a Rule 12(b)(1) challenge to standing is often an effective and efficient way to quickly dispose of a claim. This strategy is a particularly potent defense in the data privacy space, as the harms that are alleged in these cases are often abstract and speculative. The ruling in Smidga shows that even if a defendant allegedly violated a state privacy or wiretapping law, a plaintiff must still demonstrate that he or she has actually been harmed.

The Class Action Weekly Wire – Episode 46: 2024 Preview: Privacy Class Action Litigation


Duane Morris Takeaway:
This week’s episode of the Class Action Weekly Wire features Duane Morris partner Jennifer Riley, special counsel Brandon Spurlock, and associate Jeff Zohn with their discussion of 2023 developments and trends in privacy class action litigation as detailed in the recently published Duane Morris Privacy Class Action Review – 2024.

Check out today’s episode and subscribe to our show from your preferred podcast platform: Spotify, Amazon Music, Apple Podcasts, Google Podcasts, the Samsung Podcasts app, Podcast Index, Tune In, Listen Notes, iHeartRadio, Deezer, YouTube or our RSS feed.

Episode Transcript

Jennifer Riley: Welcome to our listeners, thank you for being here for our weekly podcast, the Class Action Weekly Wire. I’m Jennifer Riley, partner at Duane Morris, and joining me today is special counsel Brandon Spurlock and associate Jeffrey Zohn. Thank you for being on the podcast, guys.

Brandon Spurlock: Thank you, Jen, happy to be part of the podcast.

Jeff Zohn: Thanks, Jen, I am glad to be here.

Jennifer: Today on the podcast we are discussing the recent publication of this year’s edition of the Duane Morris Privacy Class Action Review. Listeners can find the eBook publication on our blog, the Duane Morris Class Action Defense Blog. Brandon, can you tell our listeners a little bit about our new publication?

Brandon: Yeah, sure, Jen. The last year saw a virtual explosion of privacy class action litigation. As a result, compliance with privacy laws in the myriad ways that companies interact with employees, customers, and third parties is a corporate imperative. To that end, the class action team at Duane Morris is pleased to present the Privacy Class Action Review – 2024. This publication analyzes the key privacy-related rulings and developments in 2023, and the significant legal decisions and trends impacting privacy class action litigation for 2024. We hope that companies and employers will benefit from this resource in their compliance with these evolving laws and standards.

Jennifer: In the rapidly evolving privacy litigation landscape, it is crucial for businesses to understand how courts are interpreting these often ambiguous privacy statutes. In 2023, courts across the country issued a mixed bag of results leading to major victories for both plaintiffs and defendants. Jeff, what were some of the takeaways from the publication with regard to litigation in this area in 2023?

Jeff: Yeah, you’re absolutely right that there was a mixed bag of results – both defendants and plaintiffs can point to major BIPA victories in 2023. This past year will definitely be remembered for some of the landmark pro-plaintiff rulings that will provide the plaintiffs’ bar with more than enough ammunition to keep BIPA litigation in the headlines for the foreseeable future. Specifically in 2023, the Illinois Supreme Court issued two seminal decisions that increase the opportunity for recovery of damages under BIPA, including Tims, et al. v. Black Horse Carriers, which held a five-year statute of limitations applies to claims under BIPA, and Cothron, et al. v. White Castle System, Inc., which held that a claim accrues under the BIPA each time a company collects or discloses biometric information.

Jennifer: Two major rulings indeed. Brandon, what do you anticipate these rulings will mean for privacy class actions in 2024?

Brandon: Sure, Jen. Together, these rulings have far-reaching implications. They have the potential to increase monetary damages in BIPA class actions in an exponential manner, especially in the employment context, where employees may scan in and out of work multiple times per day across more than 200 workdays per year. In 2023, in the wake of these rulings, class action filings more than doubled. We anticipate that the high volume of case filings will continue in 2024.

Jeff: I think it’s important to add that even though BIPA is an Illinois state statute, various other states are continuing to consider proposed copycat statutes that follow the lead of Illinois. The federal government likewise continues to consider proposals for a national statute. These factors have transformed biometric privacy compliance into a top priority for businesses nationwide and have promoted privacy class actions to the top of the list of litigation risks facing businesses today. If other states succeed in enacting similar statutes, businesses can expect similar surges in those states as the filing numbers in Illinois continue their upward trend.

Jennifer: Thanks so much for that information – all very important for companies navigating the privacy class action regulations and statutes. The Review also talks about the top privacy settlements in 2023. How did plaintiffs do in securing settlement funds last year?

Brandon: Plaintiffs did very well in securing high-dollar settlements. In 2023, the top 10 privacy settlements totaled $1.32 billion. This was a significant increase over 2022, when the top 10 privacy class action settlements totaled a still-high figure of almost $900 million. Specific to BIPA litigation, the top 10 BIPA class action settlements totaled almost $150 million in 2023.


Jennifer: Thank you. We will continue to track those settlement numbers in 2024 as record breaking settlement amounts have been a huge trend that we have tracked over the past two years. Thank you to Brandon and Jeff for being here today, and thank you to the loyal listeners for tuning in. Listeners, please stop by the blog for a free copy of the Privacy Class Action Review eBook.

Jeff: Thank you for having me, Jen, and thank you to all of our listeners.

Brandon: Thanks so much, everyone.

It’s Here! The Duane Morris Privacy Class Action Review – 2024


By Gerald L. Maatman, Jr., Jennifer A. Riley, and Alex W. Karasik

Duane Morris Takeaways: The last year saw a virtual explosion in privacy class action litigation. As a result, compliance with privacy laws in the myriad of ways that companies interact with employees, customers, and third parties is a corporate imperative. To that end, the class action team at Duane Morris is pleased to present the Privacy Class Action Review – 2024. This publication analyzes the key privacy-related rulings and developments in 2023 and the significant legal decisions and trends impacting privacy class action litigation for 2024. We hope that companies and employers will benefit from this resource in their compliance with these evolving laws and standards.

Click here to download a copy of the Privacy Class Action Review – 2024 eBook. Look forward to an episode on the Review coming soon on the Class Action Weekly Wire!

Spygate 2.0? New England Patriots Sued In VPPA Privacy Class Action

By Alex W. Karasik and Gerald L. Maatman, Jr.

Duane Morris Takeaways:  On February 1, 2024, a football fan filed a class action lawsuit against the New England Patriots in a Massachusetts federal court, alleging that the football team’s mobile app (the “App”) knowingly disclosed users’ location data and personal information to third parties in alleged violation of the Video Privacy Protection Act (“VPPA”). This lawsuit marks the latest in a series of high-profile VPPA class action filings, which have significantly spiked in the last two years.

Although the recent tide of VPPA class action court rulings has generally tipped in favor of defendants, the plaintiffs’ class action bar is still exploring novel theories to bring these high-stakes cases. Companies must therefore pay close attention to privacy-related issues involving mobile applications, including what data is collected and to whom it is transmitted.

The VPPA

Congress passed the VPPA in 1988.  The statute imposes liability on “[a] video tape service provider who knowingly discloses, to any person, personally identifiable information concerning any consumer of such provider.”  18 U.S.C. § 2710(b)(1).  A “video tape service provider” is defined as “any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials.”  Id. § 2710(a)(4).  “Personally identifiable information” (“PII”) is defined as “information which identifies a person as having requested or obtained specific video materials or services from a video tape service provider.”  Id. § 2710(a)(3).  In essence, the statute purports to account for advancements in video-delivery technology by defining a “video tape service provider” broadly to include any business engaged in the “rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials.”  Id.

The New VPPA Class Action Lawsuit

Plaintiff alleges that he downloaded and installed the App to his mobile phone and regularly used it to access video content.  Id. at 2.  When downloading the App, users are presented with an option to sign into an existing account, create a new account, or continue without signing in by selecting “MAYBE LATER.”  Id. at 4-5.  Plaintiff alleges that consumers who select “MAYBE LATER” are not presented with the App’s Terms of Use or Privacy Policy.  And even if users select “JOIN NOW”, they are redirected to a login screen where they have the option to log in, but are not required to view or assent to any terms of use or privacy policy unless they take additional steps to create an account.  Id. at 5.

In terms of data collection, the lawsuit alleges that when a user opens a video on the App, the App sends the content type, video title, and a persistent identifier to the user’s device. The App then transmits the user’s information to third parties, including location (in geographical coordinates and altitude), advertising ID, and video content consumption. Id. at 6. According to the complaint, the New England Patriots allegedly leverage users’ geolocation so they can maximize advertising revenue and, to that end, uniquely identify their users. For Android software users, the complaint alleges that the Patriots share a unique advertising ID, called an Android Advertising ID (“AAID”), for each of their users with third parties, which enables a third party to track the user’s movements, habits, and activity on mobile applications.  Id. at 10.

Accordingly, the lawsuit alleges that through the New England Patriots’ dissemination of consumers’ PII, third parties such as Google can collect and store billions of metrics and events and make it easier for clients to make data-driven decisions, and these reports are continuously updated and metrics are reported as they occur.  Id. at 16.  Plaintiff seeks to represent a class defined as “All persons in the United States who used the Patriots App to watch videos and had their personally identifiable information — including but not limited to the videos they watched, their geolocation, and their unique advertising IDs — transmitted to one or more third parties.”  Id.  On behalf of the class, Plaintiff seeks an award of damages, including, but not limited to, actual, consequential, punitive, statutory, and nominal damages.

Implications For Businesses

This lawsuit represents another example of class action plaintiffs’ lawyers using traditional state and federal laws – including the long dormant VPPA – to seek relief for alleged privacy violations.  In applying modern technologies to older laws like the VPPA (passed in 1988), courts have grappled with issues such as the determination of who qualifies as a “video tape service provider” or a “consumer” under the statute. It will be interesting to follow this lawsuit to see whether the Court follows the recent trend of courts dismissing VPPA class actions.

That said, this high-profile filing also suggests that companies should regularly update their online consent provisions as needed to specifically address the VPPA. Businesses that proactively implement compliance mechanisms will thank themselves later in terms of preventing class action litigation.

Texas Federal Court Dismisses Video Privacy Protection Act Class Action Concerning Email Newsletter From University Of Texas

By Gerald L. Maatman, Jr., Jennifer A. Riley, and Emilee N. Crowther

Duane Morris Takeaways: In Brown v. Learfield Communications, LLC, et al., No. 1:23-CV-00374, 2024 U.S. Dist. LEXIS 15587 (W.D. Tex. Jan. 29, 2024), Judge David A. Ezra of the U.S. District Court for the Western District of Texas granted Defendants Learfield Communications, LLC and Sidearm Sports, LLC’s Rule 12(b)(6) motion to dismiss Plaintiff’s Video Privacy Protection Act (VPPA) class claim.  The Court held that Plaintiff failed to plead facts to support his claim under the VPPA because he did not allege that he was a subscriber to audio-visual goods or services themselves, just a newsletter that contained links to publicly-available content on The University of Texas’s website.  Defendants in VPPA class actions can utilize this decision as a roadmap when preparing motions to dismiss.

Case Background

Defendants Learfield Communications, LLC and Sidearm Sports, LLC (collectively, “Defendants”) operated the University of Texas at Austin’s (“UT”) website (the “UT Website”).  Id. at 2.  The UT Website contains software that enables Facebook to track the activity of UT Website users on other websites.  Id.  Defendants invite UT Website visitors to subscribe to emailed newsletters.  Id. at 3.  The newsletters provide links to various videos, clips, and other content on the UT Website related to UT Athletics.  Id.  Plaintiff Adam Brown subscribed to UT’s emailed newsletter.  Id.

In April 2023, Plaintiff filed a class action against Defendants UT, UT Athletics, Learfield, and Sidearm alleging that they violated the VPPA by purportedly exposing the subscribers’ personal identification information and gathering marketing data without consent.  Id. at 4.  In June 2023, UT and UT Athletics filed a motion to dismiss based on sovereign immunity.  Id.  at 2.  The motion was granted in July.  Id.  In September, Defendants Learfield and Sidearm filed a motion to dismiss under 12(b)(1), 12(b)(6), and 12(b)(7).  Id.

The Court’s Decision

The Court denied Defendants’ Rule 12(b)(1) and 12(b)(7) motions to dismiss. It held that neither Learfield nor Sidearm was entitled to immunity as an “arm of the state,” and that neither UT nor UT Athletics was an indispensable party to the lawsuit.  Id. at 7-10.

The Court, however, granted Defendants’ Rule 12(b)(6) motion to dismiss on the basis that Plaintiff was not a “consumer” under the VPPA because he failed to allege a factual nexus between the subscription and Defendants’ allegedly actionable video content.  Id. at 2, 19, 26.

To state a claim under the VPPA, the Court noted that a plaintiff must allege that a defendant “(1) is a video tape service provider; (2) who knowingly disclosed to any person; (3) personally identifiable information; (4) concerning any consumer.”  Id. at 10-11; 18 U.S.C. § 2710(b)(1).  Under the VPPA, a “consumer” is “any renter, purchaser, or subscriber of goods or services from a video tape service provider.”  18 U.S.C. § 2710(a)(1).

The Court reasoned that the VPPA “only applies to consumers (including subscribers) of audio video services” because, when reading the term “consumer” in the full context of the VPPA, “a reasonable reader would understand the definition of ‘consumer’ to apply to a renter, purchaser or subscriber of audio-visual goods or services, and not goods or services writ large.”  Id. at *19 (emphasis original) (quoting Carter v. Scripps Networks, LLC, 2023 WL 3061858, at *6 (S.D.N.Y. Apr. 24, 2023)).

The Court concluded that Plaintiff was not a “consumer” under the VPPA because (i) the newsletter did not contain videos, just links to videos on the UT Website; and (ii) the linked videos were available for any member of the public to see on the UT Website, not just those who subscribed to the newsletter.  Id. at 26-28.  Accordingly, the Court ruled that Plaintiff was not a subscriber to audio-visual goods or services, just a newsletter.  Id. at 28-29.  Ultimately, because Plaintiff failed to allege facts to support a claim under the VPPA, the Court granted Defendants’ Rule 12(b)(6) motion to dismiss.  Id. at 29.

Implications For Companies

The decision in Brown v. Learfield serves as a roadmap for defendants in VPPA class actions to utilize when preparing motions to dismiss. This case is also important as it adds the Western District of Texas to a growing number of federal courts that strictly construe the VPPA as limited to audio-visual materials, not links to publicly-available videos in newsletters.  See, e.g., Carter v. Scripps Networks, LLC, No. 22-CV-2031, 2023 WL 3061858, at *6 (S.D.N.Y. Apr. 24, 2023); Jefferson v. Healthline Media, Inc., No. 3:22-CV-05059, 2023 WL 3668522, at *3 (N.D. Cal. May 24, 2023); Gardener v. MeTV, No. 22-CV-5963, 2023 WL 4365901, at *4 (N.D. Ill. July 6, 2023).

Illinois Federal Court Partially Dismisses Class Action Privacy Claims Involving “Eufy” Security Cameras

By Gerald L. Maatman, Jr., Alex W. Karasik, and Tyler Zmick

Duane Morris Takeaways:  In Sloan, et al. v. Anker Innovations Ltd., No. 22-CV-7174 (N.D. Ill. Jan. 9, 2024), Judge Sara Ellis of the U.S. District Court for the Northern District of Illinois granted in part a motion to dismiss privacy claims brought against the companies that manufacture and sell “eufy” security products.  The Court dismissed the claims asserted under the federal Wiretap Act because Defendants were “parties” to the communication by which the eufy products sent security recordings to Plaintiffs’ mobile devices (notwithstanding that the products also sent the data to a server owned by Defendants).  In addition, the Court partially dismissed Plaintiffs’ claims under the Illinois Biometric Information Privacy Act and under four state consumer protection statutes, thereby allowing Plaintiffs to proceed with their case only with respect to some of their claims.

For businesses who are embroiled in facial recognition software and related privacy class actions, this ruling provides a helpful roadmap for fracturing such claims at the outset of the lawsuit.

Case Background

Plaintiffs were individuals from various states who purchased and used Defendants’ “eufy” branded home security cameras and video doorbells.  The eufy products can, among other things, detect motion outside a person’s home and apply a facial recognition program to differentiate “between known individuals and strangers by recognizing biometric identifiers and comparing the face template against those stored in a database.”  Id. at 3.  Eufy products sync to a user’s phone through eufy’s Security app, which notifies a user of motion around the camera by sending the user a recorded thumbnail image or text message.

Defendants advertised that the video recordings and facial recognition data obtained through eufy cameras are stored locally on user-owned equipment and that the data would be encrypted so that only the user could access it.  Media reports later revealed, however, that the eufy products uploaded thumbnail images used to notify users of movement to Defendants’ cloud storage without encryption, and that users could stream content from their videos through unencrypted websites.

Claiming they relied to their detriment on Defendants’ (allegedly false) privacy-related representations when purchasing the eufy products, the eight named Plaintiffs filed a putative class action against corporate Defendants involved in the manufacture and sale of “eufy” products.  In their complaint, Plaintiffs asserted that Defendants violated: (1) the Federal Wiretap Act; (2) the Biometric Information Privacy Act (the “BIPA”); and (3) the consumer protection statutes of Illinois, New York, Massachusetts, and Florida.  Defendants moved to dismiss Plaintiffs’ claims under Federal Rule of Civil Procedure 12(b)(6).

The Court’s Decision

The Court granted in part and denied in part Defendants’ motion, holding that: (1) the Wiretap Act claim should be dismissed because Defendants were a party to the relevant communication (i.e., the transmission of data from eufy products to Plaintiffs via the eufy Security app); (2) the BIPA claims should be dismissed as to non-Illinois resident Plaintiffs; and (3) the claims brought under the relevant consumer protection statutes should be dismissed only to the extent they were premised on certain of Defendants’ public-facing privacy statements.

Wiretap Act Claims

The Court first addressed Plaintiffs’ Wiretap Act claims, explaining that the statute “empowers a private citizen to bring a civil claim against someone who ‘intentionally intercepts [or] endeavors to intercept . . . any wire, oral, or electronic communication.’”  Id. at 8 (quoting 18 U.S.C. § 2511(1)(a)).

Defendants argued that Plaintiffs failed to state a claim under the Wiretap Act because the statute does not apply to a party to the relevant communication.  Specifically, the Wiretap Act exempts a person who intercepts an electronic communication “where such person is a party to the communication or where one of the parties to the communication has given prior consent to such interception.”  18 U.S.C. § 2511(2)(d).

The Court agreed with Defendants and thus dismissed Plaintiffs’ Wiretap Act claim.  The Court described the relevant “communication” as the transmission of data from eufy products to Plaintiffs’ devices and explained that the transmission “is not between the eufy product and Plaintiffs, but rather between the eufy product and the eufy Security app, which Defendants own and operate.  As such, the communication necessarily requires Defendants’ participation, even if Plaintiffs did not intend to share their information with Defendants.”  Id. at 8-9 (emphasis added).  The Court thus held that Defendants were parties to the communication, and Defendants also uploading the data to their own server (without Plaintiffs’ knowledge) did not change that conclusion.

BIPA Claims

Regarding Plaintiffs’ BIPA claims, Defendants argued that Plaintiffs failed to allege that the relevant data (which Defendants described as “thumbnail images”) qualifies for protection under the BIPA because photographs are not biometric data under the statute.  The Court rejected this argument since Plaintiffs alleged that Defendants uploaded thumbnail information and facial recognition data (namely, “scans of face geometry”) to their server.

The Court agreed with Defendants’ second argument, however, which asserted that Plaintiffs’ BIPA claim failed to the extent it was brought by or on behalf of Plaintiffs who are not Illinois residents.  The BIPA applies only where the underlying conduct occurs “primarily and substantially” in Illinois.  The Court determined that the relevant communications between Plaintiffs and Defendants “occurred primarily and substantially in the state of residency for each Plaintiff.”  Id. at 12-13.  And the End User License Agreement for eufy Camera Products and the Security App stating that the agreement is governed by Illinois law did not change the result that the BIPA claim brought by non-Illinois residents must be dismissed.

Statutory Consumer Protection Claims

Finally, the Court turned to Defendants’ contentions relative to the alleged violations of the four state consumer protection statutes.  In beginning its analysis, the Court explained that “[t]o state a claim for deceptive practices under any of the alleged state consumer fraud statutes, Plaintiffs must allege a deceptive statement or act that caused their harm.”  Id. at 14.  Moreover, “a statement is deceptive if it creates a likelihood of deception or has the capacity to deceive.”  Id. at 15 (citation omitted); see also id. (noting that “the allegedly deceptive act must be looked upon in light of the totality of the information made available to the plaintiff”) (citation omitted).  Defendants argued in their motion to dismiss that Plaintiffs did not allege cognizable deceptive statements because the statements at issue constitute either puffery or are not false.

The Court dismissed Plaintiffs’ statutory fraud claims in part.  Specifically, the Court held that Defendants’ advertising in the form of certain “statements relating to privacy” (e.g., “your privacy is something that we value as much as you do”) constituted nonactionable “puffery.”  Id. at 16.  The Court therefore dismissed Plaintiffs’ statutory fraud claims insofar as they were premised on the similarly vague “statements relating to privacy.”

However, the Court denied Defendants’ attempt to dismiss the claims premised on their more specific statements about (1) end-user data being stored only on a user’s local device, (2) the use of alleged facial recognition, and (3) end-user data being encrypted.  Defendants argued that these were “accurate statements” and thus could not serve as the basis for consumer fraud claims.  The Court disagreed, ruling that Plaintiffs sufficiently alleged that the storage, encryption, and facial recognition statements may have misled a reasonable consumer.  Accordingly, the Court granted in part and denied in part Defendants’ motion to dismiss.

Implications For Corporate Counsel

The most significant aspect of Sloan v. Anker Innovations Limited is the Court’s analysis of Plaintiffs’ Wiretap Act claims, given the rapidly emerging trend among the plaintiff class action bar of using traditional state and federal laws – including the Wiretap Act – to seek relief for alleged privacy violations.  In applying modern technologies to older laws like the Wiretap Act (passed in 1986), courts have grappled with issues such as the determination of who is a “party to the communication” such that an entity is exempt from the statute’s scope.  As data exchanges and data storage become more complex, the “party to the communication” determination reciprocally becomes more nebulous.

In Sloan, the “communication” was the eufy products transmitting data to Plaintiffs’ device and “contemporaneously intercept[ing] and sen[ding] [the data] to [Defendant’s] server.”  Id. at 8 (citation omitted).  Because Plaintiffs had to use the eufy Security app to access the data, and because Defendants owned and operated the app, the Court determined that Defendants necessarily participated in the communication.  But the result may have been different if, for instance, Plaintiffs could use a different app (one not owned by Defendants) to access the data, or if, unbeknownst to Plaintiffs, the eufy Security app was actually owned and operated by a third-party entity.  The upshot is that corporate counsel should keep these principles in mind with respect to any data-flow processes regarding end-user or employee data.

Illinois Trial Court Grants Class-Wide Summary Judgment In BIPA Privacy Lawsuit

By Gerald L. Maatman, Jr., Alex W. Karasik, and Christian J. Palacios

Duane Morris Takeaways:  In Thompson, et al., v. Matcor Metal Fabrication (Illinois), Inc., Case No. 2020-CH-00132 (Ill. Cir. Ct. 10th Dist. Dec. 8, 2023), a class of metal fabricators prevailed on a motion for summary judgment against their employer in what is believed to be the first summary judgment ruling for a certified class under the Illinois Biometric Information Privacy Act (BIPA). An Illinois state court, determining there was no dispute of material fact, entered the pre-trial liability judgment against the defendant employer for collecting employee biometric data through its timekeeping system in violation of BIPA.

This decision highlights the danger that companies face under state privacy “strict liability” statutes, and should serve as a warning for employers that lack robust policies governing the way they collect biometric data from their employees.

Background

In September of 2019, Matcor Fabrication rolled out a new timekeeping policy whereby it collected its employees’ fingerprints using “biometric scanners” for the purposes of determining when employees clocked in and out of work. Id. at 3. The scanners that collected this information were connected to Matcor’s timekeeping vendor – ADP – and the company sent finger-scan data to ADP every time an employee scanned their fingertips. The named Plaintiff and class representative William Thompson subsequently brought the lawsuit in May of 2020, alleging the company’s timekeeping policy violated the Illinois BIPA. Nearly one year after the lawsuit had commenced, Matcor implemented BIPA-compliant policies, which included distributing a “Biometric Consent Form” to employees that stated that the company’s vendors “may collect, retain, and use biometric data for the purposes of verifying employee identity and recording time” as well as describing Matcor’s policies for retaining and destroying employee data. Id. at 4. The Court previously had certified a class of Matcor employees who enrolled in the company’s finger-scan timekeeping system between May 13, 2015 and June 16, 2021, prior to the policy update. After a lengthy discovery period, both parties filed motions for summary judgment.

The Court’s Ruling

The Court held that there was no genuine dispute of material fact that Matcor’s timekeeping policies during the class-wide time period violated the BIPA. In its ruling, the Court dismissed a series of defenses offered by the company, including that in order for the BIPA to apply, Matcor’s timeclocks needed to “collect” and store its employees’ fingerprints, rather than just transmit them to a third-party vendor. The Court was unconvinced. It opined that the BIPA also applied when timeclocks collected biometric information “based on” a fingerprint. Id. at 7. Matcor further argued that there was a difference between the “fingertip” scans it took and the “fingerprint” scans covered by the BIPA, but it was unable to cite authority that showed a meaningful difference between the two. Finally, Matcor argued that the Court needed “expert testimony” to assess the type of information the company’s timeclocks collected. The Court rejected this contention. It observed that collecting employees’ fingertip information clearly fell under the BIPA’s definition of biometric information.

Based on the facts, the Court determined that it was undisputed that Matcor began using biometric timeclocks to collect employees’ fingerprints in 2019, and the company did not implement a BIPA-compliant policy until one year after the Plaintiff commenced his suit. The record also clearly showed that Matcor failed to obtain its employees’ consent before collecting their fingerprints, and only obtained BIPA releases two years after the suit was initiated. Accordingly, the Court granted the Plaintiff’s motion for summary judgment, and the lawsuit will now proceed to the damages stage.

Implications

As this ruling emphasizes, employers can be held strictly liable for any period of time in which they collect their employees’ biometric data without having a corresponding BIPA-compliant policy. State privacy statutes like the BIPA pose unique dangers for unwary employers who do not keep up-to-date with evolving legal requirements relating to the collection, retention, and use of biometric data. Although Illinois was one of the first early adopters of such stringent privacy laws, it will certainly not be the last, and companies should begin taking preventative measures to limit liability associated with such statutes.

Illinois Supreme Court Endorses Broad Interpretation Of The BIPA’s “Health Care Exception”

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  In the latest ruling in the biometric privacy class action space, the Illinois Supreme Court embraced a broad reading of the “health care exception” in the Illinois Biometric Information Privacy Act (“BIPA”) in Mosby v. Ingalls Memorial Hospital, 2023 IL 129081 (Ill. Nov. 30, 2023).  The Illinois Supreme Court held that the statute excludes from its scope data collected in two separate and distinct scenarios: (1) “information captured from a patient in a health care setting”; and (2) information collected “for health care treatment, payment, or operations under the federal Health Insurance Portability and Accountability Act of 1996 (HIPAA).”  Unlike clause (1), the Supreme Court held that the exception in clause (2) is not limited to data obtained from patients and serves to exclude information that originates from any source.

The Mosby ruling is welcome news to BIPA defendants and companies operating in the health care space.  In the wake of the decision, courts likely will be asked to define the exact contours of the BIPA’s broadened “health care exception” in cases presenting facts that are less obviously tied to health care treatment, payment, or operations compared to the facts at issue in Mosby.

Case Background

The Plaintiffs in Mosby were nurses who claimed that their hospital-employers required them to use a fingerprint-based medication-dispensing system to verify their identities.  Plaintiffs sued their employers and the company that distributed the medication-dispensing system, alleging that Defendants violated §§ 15(a), 15(b), and 15(d) of the BIPA by using the medical-station scanning device to collect, use, and/or store their “finger-scan data” without complying with the BIPA’s notice-and-consent requirements and by disclosing their purported biometric data to third parties without first obtaining their consent.

Defendants moved to dismiss in the trial court, arguing that the claims failed because Plaintiffs’ data was specifically excluded from the BIPA’s scope under § 10 of the statute, which states that “[b]iometric identifiers do not include information captured from a patient in a health care setting or information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA].”  740 ILCS 14/10.  Defendants argued that the latter clause applied in that Plaintiffs’ fingerprints had been used in connection with Plaintiffs providing medicine to patients, meaning their fingerprints were “collected, used, or stored for health care treatment, payment, or operations under [the HIPAA].”  Id.

The trial court denied Defendants’ motions. It ruled that § 10’s “health care exception” was limited to patient information protected under the HIPAA and that the exclusion does not extend to information collected from health care workers.

On appeal, the First District of the Illinois Appellate Court affirmed the denial of Defendants’ motions to dismiss.  Echoing the trial court, the Appellate Court determined that the biometric data of health care workers is not excluded from the BIPA’s scope and that the relevant provision of § 10 excluded from the BIPA’s protections “only patient biometric information.”  Mosby, 2023 IL 129081, ¶ 16; see id. ¶ 17 (“[T]he appellate court held that ‘the plain language of the statute does not exclude employee information from the [BIPA’s] protections because they are neither (1) patients nor (2) protected under HIPAA.’”) (citation omitted).

Appellate Court Judge Mikva dissented from the majority’s opinion.  Judge Mikva opined that the legislature meant to exclude from the BIPA’s scope the biometric data of health care workers “where that information is collected, used, or stored for health care treatment, payment, or operations, as those functions are defined by the HIPAA.”  Id. ¶ 19 (citation omitted).  Judge Mikva expressed the view that the first part of § 10’s “health care exception” excludes from the BIPA’s coverage information from a particular source (i.e., patients in a health care setting) and that the second part excludes information used for particular purposes (i.e., health care treatment, payment, or operations), regardless of the source of that information.

The Illinois Supreme Court’s Decision

On further appeal, the Illinois Supreme Court agreed with Appellate Court Judge Mikva’s dissent, unanimously holding that the BIPA’s exclusion for “information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA]” can apply to the biometric data of health care workers (not only patients).

The Supreme Court determined that the relevant sentence of § 10 excludes from the definition of “biometric identifier” data that may be collected in two distinct (rather than overlapping) scenarios – namely, biometric identifiers do not include (i) information captured from a patient in a health care setting or (ii) information collected, used, or stored for health care treatment, payment, or operations under HIPAA.  Id. ¶ 37 (“[T]he phrase prior to the ‘or’ and the phrase following the ‘or’ connotes two different alternatives.  The Illinois legislature used the disjunctive ‘or’ to separate the [BIPA’s] reference to ‘information captured from a patient in a health care setting’ from ‘information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA].’  Pursuant to its plain language, information is exempt from the [BIPA] if it satisfies either statutory criterion.”) (internal citations omitted).

The Supreme Court agreed with Defendants that the two categories of information are different because information excluded under the first clause originates from the patient, whereas information excluded under the second clause may originate from any source.  Regarding the second clause, the Supreme Court observed that the Illinois legislature borrowed the phrase “health care treatment, payment, and operations” from the federal HIPAA regulations.  Accordingly, the Supreme Court determined that “the legislature was directing readers to the HIPAA to discern the meaning of those terms,” which meanings “relate to activities performed by the health care provider – not by the patient.”  Id. ¶ 52.

Thus, the Supreme Court held that a health care worker’s data used to permit access to medication-dispensing stations for patient care qualifies as “information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA]” and is exempt from the statute’s scope.

Implications Of The Decision

After the recent slew of plaintiff-friendly BIPA decisions issued by both state and federal courts, the Illinois Supreme Court’s decision in Mosby comes as welcome news for companies facing privacy-related class actions – particularly those operating in the health care space.

Relying on Mosby, defendants will likely add the BIPA’s “health care exception” to their arsenal of defenses in a wider array of cases moving forward.  Importantly, for purposes of the second “HIPAA prong” of the statute’s “health care exception,” federal HIPAA regulations govern the definitions of the terms “health care treatment,” “payment,” and “operations.”  Given that the regulatory definitions of those terms are broad, see 45 C.F.R. § 160.103; id. § 164.501, defendants will likely test the breadth of the exception in future cases presenting facts that may be less obviously tied to health care treatment, health care payment, and/or health care operations compared to the facts at issue in Mosby.

Illinois Federal Court Allows Amazon “Alexa” Privacy Class Action To Proceed

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  In Wilcosky, et al. v. Amazon.com, Inc., et al., No. 19-CV-5061 (N.D. Ill. Nov. 1, 2023), the U.S. District Court for the Northern District of Illinois issued a decision embracing a strict interpretation of the notice a private entity must provide before collecting a person’s biometric data in compliance with the Illinois Biometric Information Privacy Act (“BIPA”).  The decision underscores the importance of not only obtaining written consent before collecting a person’s biometric data, but also of the need to be as specific as possible in drafting privacy notices to inform end users that the company is collecting biometric data and to describe the “specific purpose and length of term for which” biometric data is being collected. 

In light of the potentially monumental exposure faced by companies defending putative BIPA class actions, companies that operate in Illinois and collect data that could potentially be characterized as “biometric” should review and, if necessary, update their public-facing privacy notices to ensure compliance with the BIPA. 

Background

Plaintiffs’ BIPA claims in Wilcosky were premised on their respective interactions with Amazon’s “Alexa” device – a digital assistant that provides voice-based access to Amazon’s shopping application and other services.  According to Plaintiffs, Alexa devices identify individuals who speak within the vicinity of an active device by collecting and analyzing the speaker’s “biometric identifiers” (specifically, “voiceprints”).

In their complaint, Plaintiffs claimed that Amazon identifies people from the sound of their voices after they enroll in Amazon’s “Voice ID” feature on the Alexa Application.  To enroll in Voice ID, a user is taken to a screen notifying him or her that the Voice ID feature “enables Alexa to learn your voice, recognize you when you speak to any of your Alexa devices, and provide enhanced personalization.”  Order at 3.  A hyperlink to the Alexa Terms of Use is located at the bottom of the enrollment screen, which Terms state that Voice ID “uses recordings of your voice to create an acoustic model of your voice characteristics.”  Id. at 8.  Before completing the Voice ID enrollment process, a user must agree to the Alexa Terms of Use and authorize “the creation, use, improvement, and storage” of his or her Voice ID by tapping an “Agree and Continue” button.  Id. at 3.

Among the four named Plaintiffs, three had enrolled in Voice ID using their respective Alexa devices (the “Voice ID Plaintiffs”).  One Plaintiff, Julia Bloom Stebbins, did not enroll in Voice ID; rather, she alleged that she spoke in the vicinity of Plaintiff Jason Stebbins’s Alexa device, resulting in Alexa collecting her “voiceprint” to determine whether her voice “matched” the Voice ID of Plaintiff Jason Stebbins.

Based on their alleged interactions with Alexa, Plaintiffs claimed that Amazon violated Sections 15(b), 15(c), and 15(d) of the BIPA by (i) collecting their biometric data without providing them with the requisite notice and obtaining their written consent, (ii) impermissibly “profiting from” their biometric data, and (iii) disclosing their biometric data without consent.

Amazon moved to dismiss Plaintiffs’ complaint on the basis that: (1) the Voice ID Plaintiffs received the required notice and provided their written consent by completing the Voice ID enrollment process; and (2) Plaintiff Bloom Stebbins never enrolled in Voice ID – meaning she was a “total stranger” to Amazon such that Amazon could not possibly identify her based on the sound of her voice.

The Court’s Decision

The Court denied Amazon’s motion to dismiss in a 15-page order that focused primarily on Amazon’s arguments relating to Plaintiffs’ Section 15(b) claim.

Sufficiency Of Notice Provided To Voice ID Plaintiffs

Regarding the requirements of Section 15(b), the Court noted that a company collecting biometric data must first: (1) inform the individual that biometric data is being collected or stored; (2) inform the individual of the specific purpose and length of term for which the biometric data is being collected, stored, and used; and (3) receive a written release signed by the individual.

In moving to dismiss the Voice ID Plaintiffs’ Section 15(b) claim, Amazon argued that those three Plaintiffs received all legally required notices during the Voice ID enrollment process.  During that process, Amazon explained how Voice ID works and informed users that the technology creates an acoustic model of a user’s voice characteristics.  Amazon maintained that notice language need not track the exact language set forth in Section 15(b) because the BIPA does not require that any particular statutory language be provided to obtain a person’s informed consent.  Id. at 6 (noting Amazon’s argument that “Voice ID Plaintiffs’ voiceprints were collected in circumstances under which any reasonable consumer should have known that his or her biometric information was being collected”).

The Court adopted Plaintiffs’ stricter reading of Section 15(b). It held that the complaint plausibly alleged that Amazon’s disclosures did not fully satisfy Section 15(b)’s notice requirements.  While Amazon may have informed users that Voice ID enables Alexa to learn their voices and recognize them when they speak, Amazon did not specifically inform users that it is “collecting and capturing the enrollee’s voiceprint, a biometric identifier.”  Id. at 8.  As a result, and acknowledging that it was “a close call,” the Court denied Amazon’s motion to dismiss the Section 15(b) claim asserted by the Voice ID Plaintiffs.

Application Of The BIPA To “Non-User” Plaintiff Julia Bloom Stebbins

The Court next turned to Plaintiff Bloom Stebbins, who did not create an Alexa Voice ID but alleged that Amazon collected her “voiceprint” when she spoke in the vicinity of Plaintiff Jason Stebbins’s Alexa device.  Amazon argued that her Section 15(b) claim failed because the BIPA was not meant to apply to someone in her shoes – that is, a stranger to Amazon and “who Amazon has no means of identifying.”  Id. at 11.

The Court rejected Amazon’s argument.  In doing so, the Court refused to read Section 15(b)’s requirements as applying only where a company has some relationship with an individual.  According to the Court, that interpretation would amount to “read[ing] a requirement into the statute that does not appear in the statute itself.”  Id. at 12; see also id. (“[C]ourts in this Circuit have rejected the notion that to state a claim for a Section 15(b) violation, there must be a relationship between the collector of the biometric information and the individual.”).

Conclusion

Wilcosky is required reading for corporate counsel of companies that are facing privacy-related class actions and/or want to ensure their consumer or employee-facing privacy disclosures contain all notices required under applicable law.

The Wilcosky decision endorses a strict view regarding the notice a company must provide to individuals to fully comply with Section 15(b) of the BIPA.  To ensure compliance, companies should provide end users with language that is as specific as possible regarding the type(s) of data being collected (including the fact that the data may be “biometric”), the purpose for which the data is being collected, and the time period during which the data will be stored.  The notice should closely track the BIPA’s statutory text, and companies should also require individuals to affirmatively express that they have received the notice and agree to the collection of their biometric data.  (Despite a footnote stating that the Court’s order in Wilcosky should not “be interpreted to mean that . . . a disclosure must parrot the exact language of BIPA in order to satisfy Section 15(b),” id. at 8 n.3, the Court does not explain how a disclosure could satisfy Section 15(b) without tracking the statute’s language verbatim.)

Moreover, Wilcosky raises the question whether a company should characterize data it collects as “biometric” data in its privacy notice – even if the company maintains (perhaps for good reason) that the data does not constitute biometric data subject to regulation under the BIPA.  Further complicating this question is the fact that the precise contours of the types of data that qualify as “biometric” under the BIPA are unclear and are currently being litigated in many cases.  Companies may wish to err on the “safe side” and refer to the data being collected as “biometric” data in their privacy notices.

© 2009- Duane Morris LLP. Duane Morris is a registered service mark of Duane Morris LLP.

The opinions expressed on this blog are those of the author and are not to be construed as legal advice.
