Illinois Federal Court Dismisses Class Action Privacy Claims Involving Use Of Samsung’s “Gallery” App

By Tyler Zmick, Justin Donoho, and Gerald L. Maatman, Jr.

Duane Morris Takeaways:  In G.T., et al. v. Samsung Electronics America, Inc., et al., No. 21-CV-4976, 2024 WL 3520026 (N.D. Ill. July 24, 2024), Judge Lindsay C. Jenkins of the U.S. District Court for the Northern District of Illinois dismissed claims brought under the Illinois Biometric Information Privacy Act (“BIPA”).  In doing so, Judge Jenkins acknowledged limitations on the types of conduct (and types of data) that can subject a company to liability under the statute.  The decision is welcome news for businesses that design, sell, or license technology yet do not control or store any “biometric” data that may be generated when customers use the technology.  The case also reflects the common sense notion that a data point does not qualify as a “biometric identifier” under the BIPA if it cannot be used to identify a specific person.  G.T. v. Samsung is required reading for corporate counsel facing privacy class action litigation.

Background

Plaintiffs — a group of Illinois residents who used Samsung smartphones and tablets — alleged that their respective devices came pre-installed with a “Gallery application” (the “App”) that can be used to organize users’ photos.  According to Plaintiffs, whenever an image is created on a Samsung device, the App automatically: (1) scans the image to search for faces using Samsung’s “proprietary facial recognition technology”; and (2) if it detects a face, the App analyzes the face’s “unique facial geometry” to create a “face template” (i.e., “a unique digital representation of the face”).  Id. at *2.  The App then organizes photos based on images with similar face templates, resulting in “pictures with a certain individual’s face [being] ‘stacked’ together on the App.”  Id.

Based on their use of the devices, Plaintiffs alleged that Samsung violated §§ 15(a) and 15(b) of the BIPA by: (1) failing to develop a written policy made available to the public establishing a retention policy and guidelines for destroying biometric data, and (2) collecting Plaintiffs’ biometric data without providing them with the requisite notice and obtaining their written consent.

Samsung moved to dismiss on two grounds, arguing that: (1) Plaintiffs did not allege that Samsung “possessed” or “collected” their biometric data because they did not claim the data ever left their devices; and (2) Plaintiffs failed to allege that data generated by the App qualifies as “biometric identifiers” or “biometric information” under the BIPA, because Samsung cannot use the data to identify Plaintiffs or others appearing in uploaded photos.

The Court’s Decision

The Court granted Samsung’s motion to dismiss on both grounds.

“Possession” And “Collection” Of Biometric Data

Regarding Samsung’s first argument, the Court began by explaining what it means for an entity to be “in possession of” biometric data under § 15(a) and to “collect” biometric data under § 15(b).  The Court observed that “possession” occurs when an entity exercises control over data or holds it at its disposal.  Regarding “collection,” the Court noted that the term “collect,” and the other verbs used in § 15(b) (“capture, purchase, receive through trade, or otherwise obtain”), all refer to an entity taking an “active step” to gain control of biometric data.

The Court proceeded to consider Plaintiffs’ contention that Samsung was “in possession of” their biometrics because Samsung controls the proprietary software used to operate the App.  The Court sided with Samsung, however, concluding that Plaintiffs failed to allege “possession” (and thus failed to state a § 15(a) claim) because they did not allege that Samsung can access the data (as opposed to the technology Samsung employs).  Id. at *9 (“Samsung controls the App and its technology, but it does not follow that this control gives Samsung dominion over the Biometrics generated from the App, and plaintiffs have not alleged Samsung receives (or can receive) such data.”).

As for § 15(b), the Court rejected Plaintiffs’ argument that Samsung took an “active step” to “collect” their biometrics by designing the App to “automatically harvest[] biometric data from every photo stored on the Device.”  Id. at *11.  The Court determined that Plaintiffs’ argument failed for the same reason their § 15(a) “possession” argument failed.  Id. at *11-12 (“Plaintiffs’ argument again conflates technology with Biometrics. . . . Plaintiffs do not argue that Samsung possesses the Data or took any active steps to collect it.  Rather, the active step according to Plaintiffs is the creation of the technology.”).

“Biometric Identifiers” And “Biometric Information”

The Court next turned to Samsung’s second argument for dismissal – namely, that Plaintiffs failed to allege that data generated by the App is “biometric” under the BIPA because Samsung could not use it to identify Plaintiffs (or others appearing in uploaded photos).

In opposing this argument, Plaintiffs asserted that: (1) the “App scans facial geometry, which is an explicitly enumerated biometric identifier”; and (2) the “mathematical representations of face templates” stored through the App constitute “biometric information” (i.e., information “based on” scans of Plaintiffs’ “facial geometry”).  Id. at *13.

The Court ruled that “Samsung has the better argument,” holding that Plaintiffs’ claims failed because Plaintiffs did not allege that Samsung can use data generated through the App to identify specific people.  Id. at *15.  The Court acknowledged that cases are split “on whether a plaintiff must allege a biometric identifier can identify a particular individual, or if it is sufficient to allege the defendant merely scanned, for example, the plaintiff’s face or retina.”  Id. at *13.  After employing relevant principles of statutory interpretation, the Court sided with the cases in the former category and opined that “the plain meaning of ‘identifier,’ combined with the BIPA’s purpose, demonstrates that only those scans that can identify an individual qualify.”  Id. at *15.

Turning to the facts alleged in the Complaint, the Court concluded that Plaintiffs failed to state claims under the BIPA because the data generated by the App does not amount to “biometric identifiers” or “biometric information” merely because it can be used to detect and group the unique faces of unnamed people.  In other words, biometric information must be capable of recognizing an individual’s identity – “not simply an individual’s feature.”  Id. at *17; see also id. at *18 (noting that Plaintiffs claimed only that the App groups unidentified faces together, and that it is the device user who can add names or other identifying information to the faces).

Implications Of The Decision

G.T. v. Samsung is one of several recent decisions grappling with key questions surrounding the BIPA, including questions as to: (1) when an entity engages in conduct that rises to the level of “possession” or “collection” of biometrics; and (2) what data points qualify (and do not qualify) as “biometric identifiers” and “biometric information” such that they are subject to regulation under the statute.

Regarding the first question, the Samsung case reflects the developing majority position among courts – i.e., a company is not “in possession of,” and has not “collected,” data that it does not actually receive or access, even if it created and controlled the technology that generated the allegedly biometric data.

As for the second question, the Court’s decision in Samsung complements the Ninth Circuit’s recent decision in Zellmer v. Meta Platforms, Inc., where it held that a “biometric identifier” must be capable of identifying a specific person.  See Zellmer v. Meta Platforms, Inc., 104 F.4th 1117, 1124 (9th Cir. 2024) (“Reading the statute as a whole, it makes sense to impose a similar requirement on ‘biometric identifier,’ particularly because the ability to identify did not need to be spelled out in that term — it was readily apparent from the use of ‘identifier.’”).  Courts have not uniformly endorsed this reading, however, and parties will likely continue litigating the issue unless and until the Illinois Supreme Court provides the final word on what counts as a “biometric identifier” and “biometric information.”

BIPA Plaintiffs Ring The Bell In Class Certification Victory Over Amazon

By Gerald L. Maatman, Jr., Tyler Zmick, and Christian J. Palacios

Duane Morris Takeaways:  In Svoboda v. Amazon, Case No. 21-C-5336, 2024 U.S. Dist. LEXIS 58867 (N.D. Ill. Mar. 30, 2024), Judge Jorge L. Alonso of the U.S. District Court for the Northern District of Illinois granted class certification to a class of plaintiffs who alleged that Amazon’s “virtual try-on” (“VTO”) technology violated the Illinois Biometric Information Privacy Act (“BIPA”).  In doing so, Judge Alonso dealt Amazon a significant blow in its efforts to block the certification of a purported class of claimants that might number in the millions (with pre-certification discovery already establishing that over 160,000 persons used Amazon’s VTO technology during the relevant class period).  This decision is the most recent in a string of victories for the plaintiffs’ bar in class action privacy lawsuits across Illinois and illustrates that even the largest and most sophisticated companies in the world face legal exposure in connection with their biometric retention and application practices.

Background

Amazon sells products to consumers on its mobile website and shopping application. Its “virtual try-on” technology allows users to virtually try on makeup and eyewear, and is exclusively available on mobile devices. VTOs are software programs that use augmented reality to overlay makeup and eyewear products on an image or video of a face, allowing shoppers to see what a product might look like before deciding whether to purchase it. During the relevant class period, there were two VTOs at issue: one developed by a third party (ModiFace), and another developed in-house by Amazon, which later replaced the ModiFace version. Id. at 4. Amazon’s VTOs come in two forms: (1) VTO technology available for lip products; and (2) VTOs available for glasses. The ModiFace VTO is available for lip liner, eye shadow, eye liner, and hair color. Amazon licenses, rather than owns, the ModiFace VTO. Id. at 5.

Both the Amazon and ModiFace VTOs work essentially the same way for every user. To access Amazon’s virtual try-on technology, the user begins by clicking a “try on” button on an Amazon product page (use of the try-on feature is entirely optional and does not serve as a barrier to the customer actually purchasing the product). The first time the customer uses Amazon VTO, she is shown a pop-up screen that states, “Amazon uses your camera to virtually place products such as sunglasses and lipstick on your face using Augmented Reality. All information remains on your device and is not otherwise stored, processed or shared by Amazon.”  Only after granting permission can the customer use the VTO technology to virtually try on the product. Users may select “live mode” or “photo upload mode” to superimpose the try-on product on an image of their own face. In both modes, the VTOs use software to detect users’ facial features and use that facial data to determine where to overlay the virtual products. Id. at 5.

Based on these facts, plaintiffs brought a class action lawsuit against Amazon, alleging that the online retailer violated the BIPA’s requirements by collecting, capturing, storing, or otherwise obtaining the facial geometry and associated personal identifying information of thousands (if not millions) of Illinois residents who used Amazon’s VTO applications from computers and other devices without first providing notice and the required information, obtaining informed written consent, or creating written publicly-available data retention and destruction guidelines. Id. Plaintiffs sought to certify a class of all individuals who used a virtual try-on feature on Amazon’s mobile website or app while in Illinois on or after September 7, 2016. Significantly, pre-certification discovery established that at least 163,738 people used virtual try-on technology on Amazon’s platforms while in Illinois during the class period. Id. at 6.

The Court’s Ruling

In ruling in favor of the plaintiffs, Judge Alonso systematically rejected each of Amazon’s arguments as to why plaintiffs failed to satisfy Rule 23’s requirements for class certification. Given the size of the purported class, Amazon did not attempt to contest the numerosity requirement. With respect to the adequacy requirement, Amazon argued that the named plaintiffs were inadequate and atypical because they alleged they used VTO for lipstick, and not eyewear or eye makeup (while the majority of the proposed class was comprised of individuals who used VTO for eyewear). Amazon further argued that the named plaintiffs had an interest conflicting with that of the class members who used VTO for eyewear, because the BIPA’s healthcare exception bars claims arising from the virtual try-on of eyewear. Id. at 12.  The Court rejected this argument. It found no evidence that the named plaintiffs’ interests were antagonistic to, or in direct conflict with, those of the class members who used VTO to try on eyewear, and it deemed merely speculative Amazon’s concern that plaintiffs lacked an incentive to vigorously contest the healthcare exception defense. Id. The Court was also satisfied that the plaintiffs’ claims arose from the same course of conduct that gave rise to other class members’ claims (such as Amazon’s purported collection, capture, possession, and use of facial templates via its VTOs), and thus the typicality requirement was satisfied. Id. at 15.

Similarly, the Court reasoned that common questions of law or fact predominated over any questions affecting individual members, and that a class action was superior to other available methods for fairly and efficiently adjudicating the controversy.  Id. at 17. Amazon asserted that there was no reliable way to identify class members who used VTO in Illinois during the class period, and thus that individualized inquiries predominated, rendering the case unmanageable. Id. at 21. The Court rejected this argument. It agreed with the plaintiffs that Illinois billing addresses, IP addresses from which VTO was used, and geo-location data all served as ways of identifying class members. Amazon raised other arguments about the difficulty of identifying potential class members, but Judge Alonso rejected all of them, observing that plaintiffs did not need to identify every member of the class at the certification stage.  Id. at 23.

Finally, the Court dispensed with Amazon’s various affirmative defenses. In particular, Amazon argued that it had unique defenses as to every member of the putative class, since damages are discretionary under the BIPA. The Court disagreed. It noted that the Illinois Supreme Court had determined that a trial court could fashion a damages award in a BIPA lawsuit that fairly compensated class members while deterring future violations, without destroying a defendant’s business. Id. at 25.

Implications for Employers

Employers should be well aware of the dangers associated with collecting or retaining individuals’ biometric information without BIPA-compliant policies in place. This case serves as a reminder that the larger the company, the larger the potential class in a class or collective action. Although the actual size of the class in Svoboda v. Amazon remains to be seen, it will likely number in the hundreds of thousands, and this case should serve as a wake-up call for companies, regardless of scale, about the dangers of running afoul of the Prairie State’s privacy laws.

Illinois Federal Court Partially Dismisses Class Action Privacy Claims Involving “Eufy” Security Cameras

By Gerald L. Maatman, Jr., Alex W. Karasik, and Tyler Zmick

Duane Morris Takeaways:  In Sloan, et al. v. Anker Innovations Ltd., No. 22-CV-7174 (N.D. Ill. Jan. 9, 2024), Judge Sara L. Ellis of the U.S. District Court for the Northern District of Illinois granted in part a motion to dismiss privacy claims brought against the companies that manufacture and sell “eufy” security products.  The Court dismissed the claims asserted under the federal Wiretap Act because Defendants were “parties” to the communication during which the eufy products sent security recordings to Plaintiffs’ mobile devices (notwithstanding that the products also sent the data to a server owned by Defendants).  In addition, the Court partially dismissed Plaintiffs’ claims under the Illinois Biometric Information Privacy Act and under four state consumer protection statutes, thereby allowing Plaintiffs to proceed with their case only with respect to some of their claims.

For businesses embroiled in facial recognition and related privacy class actions, this ruling provides a helpful roadmap for fracturing such claims at the outset of the lawsuit.

Case Background

Plaintiffs were individuals from various states who purchased and used Defendants’ “eufy” branded home security cameras and video doorbells.  The eufy products can, among other things, detect motion outside a person’s home and apply a facial recognition program to differentiate “between known individuals and strangers by recognizing biometric identifiers and comparing the face template against those stored in a database.”  Id. at 3.  Eufy products sync to a user’s phone through eufy’s Security app, which notifies the user of motion around the camera by sending the user a recorded thumbnail image or text message.

Defendants advertised that the video recordings and facial recognition data obtained through eufy cameras are stored locally on user-owned equipment and that the data would be encrypted so that only the user could access it.  Media reports later revealed, however, that the eufy products uploaded the thumbnail images used to notify users of movement to Defendants’ cloud storage without encryption, and that users could stream content from their videos through unencrypted websites.

Claiming they relied to their detriment on Defendants’ (allegedly false) privacy-related representations when purchasing the eufy products, the eight named Plaintiffs filed a putative class action against corporate Defendants involved in the manufacture and sale of “eufy” products.  In their complaint, Plaintiffs asserted that Defendants violated: (1) the Federal Wiretap Act; (2) the Biometric Information Privacy Act (the “BIPA”); and (3) the consumer protection statutes of Illinois, New York, Massachusetts, and Florida.  Defendants moved to dismiss Plaintiffs’ claims under Federal Rule of Civil Procedure 12(b)(6).

The Court’s Decision

The Court granted in part and denied in part Defendants’ motion, holding that: (1) the Wiretap Act claim should be dismissed because Defendants were a party to the relevant communication (i.e., the transmission of data from eufy products to Plaintiffs via the eufy Security app); (2) the BIPA claims should be dismissed as to non-Illinois resident Plaintiffs; and (3) the claims brought under the relevant consumer protection statutes should be dismissed only to the extent they were premised on certain of Defendants’ public-facing privacy statements.

Wiretap Act Claims

The Court first addressed Plaintiffs’ Wiretap Act claims, explaining that the statute “empowers a private citizen to bring a civil claim against someone who ‘intentionally intercepts [or] endeavors to intercept . . . any wire, oral, or electronic communication.’”  Id. at 8 (quoting 18 U.S.C. § 2511(1)(a)).

Defendants argued that Plaintiffs failed to state a claim under the Wiretap Act because the statute does not apply to a party to the relevant communication.  Specifically, the Wiretap Act exempts a person who intercepts an electronic communication “where such person is a party to the communication or where one of the parties to the communication has given prior consent to such interception.”  18 U.S.C. § 2511(2)(d).

The Court agreed with Defendants and thus dismissed Plaintiffs’ Wiretap Act claim.  The Court described the relevant “communication” as the transmission of data from eufy products to Plaintiffs’ devices and explained that the transmission “is not between the eufy product and Plaintiffs, but rather between the eufy product and the eufy Security app, which Defendants own and operate.  As such, the communication necessarily requires Defendants’ participation, even if Plaintiffs did not intend to share their information with Defendants.”  Id. at 8-9 (emphasis added).  The Court thus held that Defendants were parties to the communication, and the fact that Defendants also uploaded the data to their own server (without Plaintiffs’ knowledge) did not change that conclusion.

BIPA Claims

Regarding Plaintiffs’ BIPA claims, Defendants argued that Plaintiffs failed to allege that the relevant data (which Defendants described as “thumbnail images”) qualifies for protection under the BIPA because photographs are not biometric data under the statute.  The Court rejected this argument since Plaintiffs alleged that Defendants uploaded thumbnail information and facial recognition data (namely, “scans of face geometry”) to their server.

The Court agreed with Defendants’ second argument, however, which asserted that Plaintiffs’ BIPA claim failed to the extent it was brought by or on behalf of Plaintiffs who are not Illinois residents.  The BIPA applies only where the underlying conduct occurs “primarily and substantially” in Illinois.  The Court determined that the relevant communications between Plaintiffs and Defendants “occurred primarily and substantially in the state of residency for each Plaintiff.”  Id. at 12-13.  Nor did the fact that the End User License Agreement for eufy Camera Products and the Security App states that the agreement is governed by Illinois law change the result; the BIPA claims brought by non-Illinois residents had to be dismissed.

Statutory Consumer Protection Claims

Finally, the Court turned to Defendants’ contentions regarding the alleged violations of the four state consumer protection statutes.  In beginning its analysis, the Court explained that “[t]o state a claim for deceptive practices under any of the alleged state consumer fraud statutes, Plaintiffs must allege a deceptive statement or act that caused their harm.”  Id. at 14.  Moreover, “a statement is deceptive if it creates a likelihood of deception or has the capacity to deceive.”  Id. at 15 (citation omitted); see also id. (noting that “the allegedly deceptive act must be looked upon in light of the totality of the information made available to the plaintiff”) (citation omitted).  Defendants argued in their motion to dismiss that Plaintiffs did not allege cognizable deceptive statements because the statements at issue either constitute puffery or are not false.

The Court dismissed Plaintiffs’ statutory fraud claims in part.  Specifically, the Court held that Defendants’ advertising in the form of certain “statements relating to privacy” (e.g., “your privacy is something that we value as much as you do”) constituted nonactionable “puffery.”  Id. at 16.  The Court therefore dismissed Plaintiffs’ statutory fraud claims insofar as they were premised on the similarly vague “statements relating to privacy.”

However, the Court denied Defendants’ attempt to dismiss the claims premised on their more specific statements about (1) end-user data being stored only on a user’s local device, (2) the use of alleged facial recognition, and (3) end-user data being encrypted.  Defendants argued that these were “accurate statements” and thus could not serve as the basis for consumer fraud claims.  The Court disagreed, ruling that Plaintiffs sufficiently alleged that the storage, encryption, and facial recognition statements may have misled a reasonable consumer.  Accordingly, the Court granted in part and denied in part Defendants’ motion to dismiss.

Implications For Corporate Counsel

The most significant aspect of Sloan v. Anker Innovations Ltd. is the Court’s analysis of Plaintiffs’ Wiretap Act claims, given the rapidly emerging trend among the plaintiffs’ class action bar of using traditional state and federal laws – including the Wiretap Act – to seek relief for alleged privacy violations.  In applying older laws like the Wiretap Act (passed in 1986) to modern technologies, courts have grappled with issues such as determining who is a “party to the communication” such that an entity is exempt from the statute’s scope.  As data exchanges and data storage become more complex, the “party to the communication” determination correspondingly becomes more nebulous.

In Sloan, the “communication” was the eufy products transmitting data to Plaintiffs’ devices and “contemporaneously intercept[ing] and sen[ding] [the data] to [Defendant’s] server.”  Id. at 8 (citation omitted).  Because Plaintiffs had to use the eufy Security app to access the data, and because Defendants owned and operated the app, the Court determined that Defendants necessarily participated in the communication.  But the result may have been different if, for instance, Plaintiffs could have used a different app (one not owned by Defendants) to access the data, or if, unbeknownst to Plaintiffs, the eufy Security app was actually owned and operated by a third-party entity.  The upshot is that corporate counsel should keep these principles in mind with respect to any data-flow processes involving end-user or employee data.

Illinois Supreme Court Endorses Broad Interpretation Of The BIPA’s “Health Care Exception”

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  In the latest ruling in the biometric privacy class action space, the Illinois Supreme Court embraced a broad reading of the “health care exception” in the Illinois Biometric Information Privacy Act (“BIPA”) in Mosby v. Ingalls Memorial Hospital, 2023 IL 129081 (Ill. Nov. 30, 2023).  The Illinois Supreme Court held that the statute excludes from its scope data collected in two separate and distinct scenarios: (1) “information captured from a patient in a health care setting”; and (2) information collected “for health care treatment, payment, or operations under the federal Health Insurance Portability and Accountability Act of 1996 (HIPAA).”  The Supreme Court held that, unlike the exception in clause (1), the exception in clause (2) is not limited to data obtained from patients and serves to exclude information that originates from any source.

The Mosby ruling is welcome news to BIPA defendants and companies operating in the health care space.  In the wake of the decision, courts likely will be asked to define the exact contours of the BIPA’s broadened “health care exception” in cases presenting facts that are less obviously tied to health care treatment, payment, or operations compared to the facts at issue in Mosby.

Case Background

The Plaintiffs in Mosby were nurses who claimed that their hospital-employers required them to use a fingerprint-based medication-dispensing system to verify their identities.  Plaintiffs sued their employers and the company that distributed the medication-dispensing system, alleging that Defendants violated §§ 15(a), 15(b), and 15(d) of the BIPA by using the medical-station scanning device to collect, use, and/or store their “finger-scan data” without complying with the BIPA’s notice-and-consent requirements and by disclosing their purported biometric data to third parties without first obtaining their consent.

Defendants moved to dismiss in the trial court, arguing that the claims failed because Plaintiffs’ data was specifically excluded from the BIPA’s scope under § 10 of the statute, which states that “[b]iometric identifiers do not include information captured from a patient in a health care setting or information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA].”  740 ILCS 14/10.  Defendants argued that the latter clause applied in that Plaintiffs’ fingerprints had been used in connection with Plaintiffs providing medicine to patients, meaning their fingerprints were “collected, used, or stored for health care treatment, payment, or operations under [the HIPAA].”  Id.

The trial court denied Defendants’ motions. It ruled that § 10’s “health care exception” was limited to patient information protected under the HIPAA and that the exclusion does not extend to information collected from health care workers.

On appeal, the First District of the Illinois Appellate Court affirmed the denial of Defendants’ motions to dismiss.  Echoing the trial court, the Appellate Court determined that the biometric data of health care workers is not excluded from the BIPA’s scope and that the relevant provision of § 10 excluded from the BIPA’s protections “only patient biometric information.”  Mosby, 2023 IL 129081, ¶ 16; see id. ¶ 17 (“[T]he appellate court held that ‘the plain language of the statute does not exclude employee information from the [BIPA’s] protections because they are neither (1) patients nor (2) protected under HIPAA.’”) (citation omitted).

Appellate Court Justice Mikva dissented from the majority’s opinion.  Justice Mikva opined that the legislature meant to exclude from the BIPA’s scope the biometric data of health care workers “where that information is collected, used, or stored for health care treatment, payment, or operations, as those functions are defined by the HIPAA.”  Id. ¶ 19 (citation omitted).  Justice Mikva expressed the view that the first part of § 10’s “health care exception” excludes from the BIPA’s coverage information from a particular source (i.e., patients in a health care setting) and that the second part excludes information used for particular purposes (i.e., health care treatment, payment, or operations), regardless of the source of that information.

The Illinois Supreme Court’s Decision

On further appeal, the Illinois Supreme Court agreed with Appellate Court Justice Mikva’s dissent, unanimously holding that the BIPA’s exclusion for “information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA]” can apply to the biometric data of health care workers (not only patients).

The Supreme Court determined that the relevant sentence of § 10 excludes from the definition of “biometric identifier” data that may be collected in two distinct (rather than overlapping) scenarios – namely, biometric identifiers do not include (i) information captured from a patient in a health care setting or (ii) information collected, used, or stored for health care treatment, payment, or operations under HIPAA.  Id. ¶ 37 (“[T]he phrase prior to the ‘or’ and the phrase following the ‘or’ connotes two different alternatives.  The Illinois legislature used the disjunctive ‘or’ to separate the [BIPA’s] reference to ‘information captured from a patient in a health care setting’ from ‘information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA].’  Pursuant to its plain language, information is exempt from the [BIPA] if it satisfies either statutory criterion.”) (internal citations omitted).

The Supreme Court agreed with Defendants that the two categories of information are different because information excluded under the first clause originates from the patient, whereas information excluded under the second clause may originate from any source.  Regarding the second clause, the Supreme Court observed that the Illinois legislature borrowed the phrase “health care treatment, payment, and operations” from the federal HIPAA regulations.  Accordingly, the Supreme Court determined that “the legislature was directing readers to the HIPAA to discern the meaning of those terms,” which meanings “relate to activities performed by the health care provider – not by the patient.”  Id. ¶ 52.

Thus, the Supreme Court held that a health care worker’s data used to permit access to medication-dispensing stations for patient care qualifies as “information collected, used, or stored for health care treatment, payment, or operations under [the HIPAA]” and is exempt from the statute’s scope.

Implications Of The Decision

After the recent slew of plaintiff-friendly BIPA decisions issued by both state and federal courts, the Illinois Supreme Court’s decision in Mosby comes as welcome news for companies facing privacy-related class actions – particularly those operating in the health care space.

Relying on Mosby, defendants will likely add the BIPA’s “health care exception” to their arsenal of defenses in a wider array of cases moving forward.  Importantly, for purposes of the second “HIPAA prong” of the statute’s “health care exception,” federal HIPAA regulations govern the definitions of the terms “health care treatment,” “payment,” and “operations.”  Given that the regulatory definitions of those terms are broad, see 45 C.F.R. § 160.103; id. § 164.501, defendants will likely test the breadth of the exception in future cases presenting facts that may be less obviously tied to health care treatment, health care payment, and/or health care operations compared to the facts at issue in Mosby.

Illinois Appellate Court Denies Cell Phone Retailer’s Second Attempt To Arbitrate Class Action Privacy Claims

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  In Ipina v. TCC Wireless, 2023 IL App (1st) 220547-U (Nov. 9, 2023), the First District of the Illinois Appellate Court held that T-Mobile retailer TCC Wireless was barred from enforcing an arbitration clause in the plaintiff’s employment agreement based on TCC’s actions in an earlier-filed privacy class action it settled.  The Court determined that TCC was collaterally estopped from compelling the plaintiff’s claims to arbitration because TCC had unsuccessfully moved to send nearly identical claims to arbitration in the earlier-filed case.  In doing so, the Illinois Appellate Court embraced a broad view of the circumstances in which “offensive” collateral estoppel is warranted in the class action context – that is, when a party may be prohibited from making an argument that was already raised and rejected in an earlier case.

Background

Plaintiff Stephanie Ipina alleged that while employed by Defendant TCC Wireless, she used a fingerprint-based timekeeping device to clock in and out of work.  According to Plaintiff, her use of the timekeeping device resulted in TCC collecting her biometric data.  Plaintiff claimed that TCC did not give her prior notice that it would be collecting her biometric data or obtain her prior written consent, and that TCC disclosed her data to TCC’s “payroll provider” without Plaintiff’s consent.  Based on these allegations, Plaintiff asserted that TCC violated §§ 15(b) and 15(d) of the Illinois Biometric Information Privacy Act (the “BIPA”).

Plaintiff’s complaint also described a prior BIPA class action entitled Garcia v. TCC Wireless, which had been brought against TCC based on the same timekeeping device used by the Plaintiff in Ipina.  In Garcia, TCC responded to the complaint by moving to compel arbitration pursuant to the plaintiff’s employment agreement, which stated that “[a]ny dispute arising out of or relating in any [way] to Employee’s employment with [TCC] . . . shall be resolved by binding arbitration . . . . except for (i) the institution of a civil action seeking equitable relief, or (ii) the institution of a civil action of a summary nature where the relief sought is predicated on there being no dispute with respect to any fact.”  Id. ¶ 7.

The trial court in Garcia denied TCC’s motion to compel because TCC did not dispute that it collected employees’ biometric data without consent, and therefore the plaintiff’s claims were subject to the arbitration clause’s “carve-out” for claims “of a summary nature where no facts are in dispute.”  Id. ¶ 23.  The parties in Garcia later reached a class-wide settlement, after which TCC produced a list of 899 employees to include in the settlement class.  Due to TCC “compil[ing] the class incorrectly,” however, Plaintiff Stephanie Ipina and other TCC employees were omitted from the list of class members eligible to receive payments in connection with the Garcia settlement.

In response to the complaint filed in the Ipina case (brought on behalf of Plaintiff and other individuals who were wrongly omitted from the settlement class in Garcia), TCC moved to compel Plaintiff’s BIPA claims to arbitration based on the same employment agreement provision at issue in Garcia.  In opposing the motion, Plaintiff argued that TCC was collaterally estopped from compelling arbitration because TCC’s motion to compel arbitration had been denied in the Garcia action.  The trial court granted TCC’s motion, however, reasoning that collateral estoppel did not apply because, unlike in Garcia, TCC denied the factual allegations set forth in the complaint in the present case.

The Illinois Appellate Court’s Decision

On appeal, the Illinois Appellate Court reversed the trial court and held that TCC was collaterally estopped from enforcing the arbitration provision in Plaintiff’s employment agreement.

The Court noted that collateral estoppel is an equitable doctrine that “promotes fairness and judicial economy by preventing the relitigation of issues that have already been resolved in earlier actions.”  Id. ¶ 21 (internal quotation marks and citation omitted).  A party seeking to collaterally estop its opponent from raising a particular argument must show that (i) the current issue is identical to one that was resolved in a prior action; (ii) the court in the previous matter entered a final judgment on the merits; and (iii) the party against whom estoppel is being asserted was a party, or in privity with a party, to the prior litigation.

The Appellate Court summarized TCC’s litigation conduct in Garcia as follows: TCC did not dispute that it collected employees’ biometric data without consent; in light of that concession, the Garcia court denied TCC’s motion to compel arbitration based on the arbitration provision’s exception for claims of a summary nature where no facts are in dispute; the court also denied TCC’s motion to reconsider that order, and TCC did not appeal the denial; and the parties subsequently settled the case on a class-wide basis.

Based on these facts, and contrary to the trial court’s order, the Appellate Court ruled that Plaintiff had shown that the collateral estoppel elements were established, and that the trial court erred in not applying the doctrine.

First, the Appellate Court rejected TCC’s attempt to distinguish the present case from Garcia on the basis that unlike Garcia, in this case TCC had denied the allegations in Plaintiff’s complaint.  According to the Appellate Court, this argument was contradicted by the position TCC had taken throughout the litigation, which is that Plaintiff should have been included in the Garcia settlement because TCC collected her biometric data before she signed a consent form.  Because “TCC is bound by these admissions,” the Appellate Court ruled that the issue in the present case was identical to the issue resolved in Garcia because TCC had effectively conceded the plaintiffs’ factual allegations in both cases. Id. ¶ 25.

Second, the Appellate Court found that the trial court in Garcia entered a “final judgment on the merits” when it issued an order granting final settlement approval and dismissing the case with prejudice.  Acknowledging the split in authority as to whether a settlement agreement qualifies as a “final order on the merits,” the Appellate Court sided with those decisions reflecting the proposition that “policy reasons counsel in favor of applying the doctrine of collateral estoppel to interlocutory judgments after settlement and dismissal with prejudice.”  Id. ¶ 28 (citation omitted).  As stated by the Appellate Court, “[c]ollateral estoppel exists to prevent litigants from doing exactly what TCC attempts.  The doctrine’s purpose is to prevent a party from losing an issue on the merits, but then relitigating it before a different judge to procure the desired result.”  Id. ¶ 29.  Thus, the Appellate Court found that Plaintiff satisfied the second element.

Third, the Appellate Court held that the last collateral estoppel element was satisfied because TCC was the defendant in Garcia and was the same party against whom estoppel was being asserted in the present case.  See id. ¶ 30 (“TCC was a party in Garcia, where it had the same incentive to fully litigate the enforcement of the arbitration clause (and in fact did so).”).  However, the Appellate Court also noted that while both parties argued the issue of Plaintiff’s privity on appeal, that issue was “irrelevant” because “the privity requirement only applies to the party against whom estoppel is asserted.”  Id.

Implications For Corporations

Ipina is an important reminder that a litigation decision made in one case can have potentially significant consequences for that party in an entirely separate action.  As illustrated in the Ipina case, a party’s position in one matter (e.g., a defendant conceding the truth of certain factual allegations in a complaint) can be used to limit (or entirely foreclose) that party’s ability to raise a defense in another matter – regardless of how strong the defense might be on the merits.

Thus, corporate defendants should always think about the “big picture” when deciding on a course of action to take in defending a lawsuit.  They should consider not only how a defense position may impact that particular litigation, but also how the position could affect separate and seemingly unrelated actions involving the same (or a related) party, whether in cases that are currently pending or that may be filed in the future.

Illinois Federal Court Allows Amazon “Alexa” Privacy Class Action To Proceed

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  In Wilcosky, et al. v. Amazon.com, Inc., et al., No. 19-CV-5061 (N.D. Ill. Nov. 1, 2023), the U.S. District Court for the Northern District of Illinois issued a decision embracing a strict interpretation of the notice a private entity must provide before collecting a person’s biometric data in compliance with the Illinois Biometric Information Privacy Act (“BIPA”).  The decision underscores the importance of not only obtaining written consent before collecting a person’s biometric data, but also of the need to be as specific as possible in drafting privacy notices to inform end users that the company is collecting biometric data and to describe the “specific purpose and length of term for which” biometric data is being collected. 

In light of the potentially monumental exposure faced by companies defending putative BIPA class actions, companies that operate in Illinois and collect data that could potentially be characterized as “biometric” should review and, if necessary, update their public-facing privacy notices to ensure compliance with the BIPA. 

Background

Plaintiffs’ BIPA claims in Wilcosky were premised on their respective interactions with Amazon’s “Alexa” device – a digital assistant that provides voice-based access to Amazon’s shopping application and other services.  According to Plaintiffs, Alexa devices identify individuals who speak within the vicinity of an active device by collecting and analyzing the speaker’s “biometric identifiers” (specifically, “voiceprints”).

In their complaint, Plaintiffs claimed that Amazon identifies people from the sound of their voices after they enroll in Amazon’s “Voice ID” feature on the Alexa Application.  To enroll in Voice ID, a user is taken to a screen notifying him or her that the Voice ID feature “enables Alexa to learn your voice, recognize you when you speak to any of your Alexa devices, and provide enhanced personalization.”  Order at 3.  A hyperlink to the Alexa Terms of Use is located at the bottom of the enrollment screen, which Terms state that Voice ID “uses recordings of your voice to create an acoustic model of your voice characteristics.”  Id. at 8.  Before completing the Voice ID enrollment process, a user must agree to the Alexa Terms of Use and authorize “the creation, use, improvement, and storage” of his or her Voice ID by tapping an “Agree and Continue” button.  Id. at 3.

Among the four named Plaintiffs, three had enrolled in Voice ID using their respective Alexa devices (the “Voice ID Plaintiffs”).  One Plaintiff, Julia Bloom Stebbins, did not enroll in Voice ID; rather, she alleged that she spoke in the vicinity of Plaintiff Jason Stebbins’s Alexa device, resulting in Alexa collecting her “voiceprint” to determine whether her voice “matched” the Voice ID of Plaintiff Jason Stebbins.

Based on their alleged interactions with Alexa, Plaintiffs claimed that Amazon violated Sections 15(b), 15(c), and 15(d) of the BIPA by (i) collecting their biometric data without providing them with the requisite notice and obtaining their written consent, (ii) impermissibly “profiting from” their biometric data, and (iii) disclosing their biometric data without consent.

Amazon moved to dismiss Plaintiffs’ complaint on the basis that: (1) the Voice ID Plaintiffs received the required notice and provided their written consent by completing the Voice ID enrollment process; and (2) Plaintiff Bloom Stebbins never enrolled in Voice ID – meaning she was a “total stranger” to Amazon such that Amazon could not possibly identify her based on the sound of her voice.

The Court’s Decision

The Court denied Amazon’s motion to dismiss in a 15-page order that focused primarily on Amazon’s arguments relating to Plaintiffs’ Section 15(b) claim.

Sufficiency Of Notice Provided To Voice ID Plaintiffs

Regarding the requirements of Section 15(b), the Court noted that a company collecting biometric data must first: (1) inform the individual that biometric data is being collected or stored; (2) inform the individual of the specific purpose and length of term for which the biometric data is being collected, stored, and used; and (3) receive a written release signed by the individual.

In moving to dismiss the Voice ID Plaintiffs’ Section 15(b) claim, Amazon argued that those three Plaintiffs received all legally required notices during the Voice ID enrollment process.  During that process, Amazon explained how Voice ID works and informed users that the technology creates an acoustic model of a user’s voice characteristics.  Amazon maintained that notice language need not track the exact language set forth in Section 15(b) because the BIPA does not require that any particular statutory language be provided to obtain a person’s informed consent.  Id. at 6 (noting Amazon’s argument that “Voice ID Plaintiffs’ voiceprints were collected in circumstances under which any reasonable consumer should have known that his or her biometric information was being collected”).

The Court adopted Plaintiffs’ stricter reading of Section 15(b).  It held that the complaint plausibly alleged that Amazon’s disclosures did not fully satisfy Section 15(b)’s notice requirements.  While Amazon may have informed users that Voice ID enables Alexa to learn their voices and recognize them when they speak, Amazon did not specifically inform users that it is “collecting and capturing the enrollee’s voiceprint, a biometric identifier.”  Id. at 8.  As a result, and acknowledging that it was “a close call,” the Court denied Amazon’s motion to dismiss the Section 15(b) claim asserted by the Voice ID Plaintiffs.

Application Of The BIPA To “Non-User” Plaintiff Julia Bloom Stebbins

The Court next turned to Plaintiff Bloom Stebbins, who did not create an Alexa Voice ID but alleged that Amazon collected her “voiceprint” when she spoke in the vicinity of Plaintiff Jason Stebbins’s Alexa device.  Amazon argued that her Section 15(b) claim failed because the BIPA was not meant to apply to someone in her shoes – that is, a stranger to Amazon “who Amazon has no means of identifying.”  Id. at 11.

The Court rejected Amazon’s argument.  In doing so, the Court refused to read Section 15(b)’s requirements as applying only where a company has some relationship with an individual.  According to the Court, that interpretation would amount to “read[ing] a requirement into the statute that does not appear in the statute itself.”  Id. at 12; see also id. (“[C]ourts in this Circuit have rejected the notion that to state a claim for a Section 15(b) violation, there must be a relationship between the collector of the biometric information and the individual.”).

Conclusion

Wilcosky is required reading for corporate counsel of companies that are facing privacy-related class actions and/or want to ensure their consumer or employee-facing privacy disclosures contain all notices required under applicable law.

The Wilcosky decision endorses a strict view regarding the notice a company must provide to individuals to fully comply with Section 15(b) of the BIPA.  To ensure compliance, companies should provide end users with language that is as specific as possible regarding the type(s) of data being collected (including the fact that the data may be “biometric”), the purpose for which the data is being collected, and the time period during which the data will be stored.  The notice should closely track the BIPA’s statutory text, and companies should also require individuals to affirmatively express that they have received the notice and agree to the collection of their biometric data.  (Despite a footnote stating that the Court’s order in Wilcosky should not “be interpreted to mean that . . . a disclosure must parrot the exact language of BIPA in order to satisfy Section 15(b),” id. at 8 n.3, the Court does not explain how a disclosure could satisfy Section 15(b) without tracking the statute’s language verbatim.)

Moreover, Wilcosky raises the question whether a company should characterize data it collects as “biometric” data in its privacy notice – even if the company maintains (perhaps for good reason) that the data does not constitute biometric data subject to regulation under the BIPA.  Further complicating this question is the fact that the precise contours of the types of data that qualify as “biometric” under the BIPA are unclear and are currently being litigated in many cases.  Companies may wish to err on the “safe side” and refer to the data being collected as “biometric” data in their privacy notices.

Court Dismisses VPPA Class Claim Alleging That General Mills Shared Consumer Data With Facebook And Google

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  In Carroll v. General Mills, Inc., No. 23-CV-1746 (C.D. Cal. Sept. 1, 2023), Judge Dale Fischer of the U.S. District Court for the Central District of California issued a decision dismissing (for a second time) a class claim brought against General Mills under the Video Privacy Protection Act (“VPPA”).  In its decision, the Court ruled that General Mills – a company that manufactures and sells cereals and other food products – did not qualify as a “video tape service provider” under the VPPA, and that even if it did, Plaintiffs’ claim would still fail because they did not show they were “consumers” covered by the statute’s privacy protections.  Carroll v. General Mills is the latest decision involving the VPPA – a long-dormant statute that class action plaintiffs have recently turned to in attempting to seek redress for alleged privacy violations.

Case Background

Plaintiffs Keith Carroll and Rebeka Rodriguez alleged that they watched videos on General Mills’ website and that General Mills subsequently disclosed their “video viewing behavior” to Facebook and Google.  Specifically, Carroll claimed that General Mills sent Facebook the video he watched online and his identifying information in connection with General Mills’ use of a Facebook advertising feature.  Similarly, Rodriguez claimed that General Mills disclosed her “video viewing behavior” and other website analytics data to Google through General Mills’ use of the Google Marketing Platform.

Based on these allegations, Plaintiffs filed a class action that alleged General Mills violated the Video Privacy Protection Act (“VPPA”) by knowingly disclosing their personally identifiable information (“PII”) to Facebook and Google.  See 18 U.S.C. § 2710(b)(1).

The District Court’s Decision

The Court granted General Mills’ motion to dismiss Plaintiffs’ VPPA claim. It held that Plaintiffs failed to satisfy the first two prongs of the four-step pleading test applicable to VPPA claims.

In analyzing the allegations, the Court explained that to state a VPPA claim, a plaintiff must allege that: (1) a defendant is a “video tape service provider”; (2) the defendant disclosed PII concerning a consumer to another person; (3) the disclosure was made knowingly; and (4) the disclosure was not authorized by the “safe harbor” provision set forth in 18 U.S.C. § 2710(b)(2).

As with the claim asserted in the previous version of their complaint, the Court determined that Plaintiffs’ VPPA claim failed at step (1) because Plaintiffs did not adequately allege that General Mills is a “video tape service provider,” and that even if the Court were to proceed to step (2), Plaintiffs would fail at that step as well, given their inability to show that they qualify as “consumers” under the statute.

“Video Tape Service Provider”

Regarding step (1), the VPPA defines a “video tape service provider” as “any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials.”  18 U.S.C. § 2710(a)(4).  Importantly, the Court noted that the statute does not apply to every company that “delivers audio visual materials ancillary to its business” but only to companies “specifically in the business of providing audio visual materials.”  See Order at 6.

Based on the allegations at hand, the Court held that Plaintiffs failed to allege that General Mills – which manufactures and sells cereals, yogurts, dog food, and other products – is “engaged in the business of delivering, selling, or renting audiovisual material.”  Id.  The Court rejected Plaintiffs’ attempt to satisfy step (1) by adding allegations in their amended complaint regarding General Mills posting on its website links to professionally made videos.  In the Court’s words, these “allegations do no more than show that videos are part of General Mills’ marketing and brand awareness,” which does not suggest “that the videos are profitable in and of themselves” or that the videos “are the business that General Mills is engaged in.”  Id. at 6-7.

“Consumer”

The Court next held that even if Plaintiffs had satisfied the first step, they nonetheless would have failed at step (2) based on their failure to allege facts establishing that they are “consumers” under the VPPA.

The VPPA defines “consumer” as “any renter, purchaser, or subscriber of goods or services from a video tape service provider.”  18 U.S.C. § 2710(a)(1).  Read in the statute’s full context, courts have held that “a reasonable reader would understand the definition of ‘consumer’ to apply to a renter, purchaser or subscriber of audio-visual goods or services, and not goods or services writ large.”  See Order at 7 (citation omitted).  That is, the definition of “consumer” “mirrors the language used to define a ‘video tape service provider’ as one who is in the business of ‘rental, sale, or delivery’ of audiovisual material.”  Id.; see also id. at 7-8 (“‘[C]onsumer’ is obviously meant to be cabined in the same way [as ‘video tape service provider’] – as a renter, purchaser, or subscriber of prerecorded video cassette tapes or similar audio visual materials.”).

The Court determined that Plaintiffs’ prior purchase of General Mills’ food – an “unrelated product” – does not make them “consumers of audiovisual material.”  Id. at 8.  The Court further noted that Plaintiffs’ failure at step (2) highlights “the fundamental issue” with their VPPA claim – namely, Plaintiffs struggle to plead that they are consumers of General Mills’ audiovisual material because General Mills is not in the business of offering audiovisual material to consumers.  See id. at 8-9 (“If General Mills were in such a business, Plaintiffs would not be referring to purchases of General Mills’ food products to establish themselves as consumers.”).

Implications For Corporate Counsel

The decision in Carroll v. General Mills reflects the recent trend among class action plaintiffs’ lawyers of using traditional state and federal laws – including the long-dormant VPPA – to seek relief for alleged privacy violations.  In applying modern technologies to older laws like the VPPA (passed in 1988), courts have grappled with, among other issues, determining who qualifies as a “video tape service provider” or a “consumer” under the statute.

The Carroll decision may suggest that the definitions of “video tape service provider” and “consumer” are relatively straightforward, but other cases can present close calls (e.g., whether a social media platform that delivers various services to users, including video content, is a “video tape service provider”).  Indeed, courts have recently faced challenges in interpreting the VPPA’s definitions in cases involving, inter alia, whether individuals who download a free app through which they view videos qualify as “subscribers” (and therefore “consumers”) under the statute.

Given this uncertainty, companies that provide audio visual materials in connection with their business operations should take advantage of the “safe harbor” amendment, adopted in 2013, under which “video tape service providers” may lawfully disclose PII with the informed written consent of consumers.  To do so, companies should update their online consent provisions as needed to specifically address the VPPA.

Illinois Supreme Court Refuses To Reconsider “Per-Scan” BIPA Accrual Ruling In Cothron v. White Castle

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  As we previously blogged, on February 17, 2023 the Illinois Supreme Court held in Cothron v. White Castle, 2023 IL 128004 (2023), that a separate claim for damages accrues under the Biometric Information Privacy Act (“BIPA”) each time a private entity scans or transmits an individual’s biometric data in violation of Sections 15(b) or 15(d) of the statute.  On July 18, 2023, the Illinois Supreme Court denied White Castle’s petition for rehearing, resulting in the February 17 ruling becoming the final “law of the land” in Illinois.  The Court’s decision to deny White Castle’s rehearing petition was not unanimous, however, as reflected by the blistering dissent penned by Justice Overstreet and joined by Chief Justice Theis and Justice Holder White.  For companies involved in BIPA class action litigation, the dissent is required reading, as it foreshadows an array of defense-oriented arguments over damages issues in privacy litigation.

Illinois Supreme Court’s Majority Decision In Cothron

In a 4-3 split ruling, the Illinois Supreme Court held on February 17, 2023 that a separate claim accrues under the BIPA each time a private entity scans or transmits an individual’s biometric data in violation of Sections 15(b) or 15(d), respectively.

Relying on the statute’s plain language and the fact that the actions of “collecting” and “disclosing” biometric data can occur more than once, the Supreme Court agreed with Plaintiff’s interpretation – namely, that Section 15(b) “applies to every instance when a private entity collects biometric information without prior consent” and that Section 15(d) “applies to every transmission to a third party.”  Cothron, 2023 IL 128004, ¶¶ 19, 23, 28.  The Supreme Court acknowledged that this interpretation – coupled with the statute allowing prevailing plaintiffs to recover up to $1,000 or $5,000 for each “violation” – could lead to astronomical damages awards that may be “harsh, unjust, absurd or unwise,” id. ¶ 40 (citation omitted), but noted that it must apply the statute as written and that policy-based concerns should be addressed by the Illinois legislature.

Dissent To Majority’s Decision To Deny White Castle’s Rehearing Petition

On July 18, 2023 the Illinois Supreme Court denied White Castle’s petition for rehearing in Cothron v. White Castle, effectively leaving White Castle with no further avenues for challenging the ruling.

Three Justices (the same three who dissented from the February 17 majority decision) disagreed with the decision to deny White Castle’s petition for rehearing.  In opining that the Supreme Court should have granted rehearing, the Dissent focused on three issues: (1) the majority’s “per scan” theory of liability subverting the intent of the Illinois legislature; (2) the majority’s “per scan” theory of liability threatening the survival of Illinois businesses and raising “significant constitutional due process concerns,” id. ¶ 70; and (3) the majority’s failure to provide trial courts with criteria to use in exercising their discretion whether to award statutory damages for BIPA violations.

First, the Dissent stated that the Illinois legislature meant for the BIPA to be a straightforward remedial statute that allows individuals to choose to provide (or not to provide) their biometric data after being informed that the data is being collected, stored, and potentially disclosed.  The Dissent rejected the majority’s “flawed construction” of the statute, which mistakenly presumes that the legislature meant for the BIPA to “establish a statutory landmine” and “destroy commerce in its wake when negligently triggered.”  Id. ¶ 73; see also id. (“The majority’s construction of the [BIPA] does not give effect to the legislature’s true intent but instead eviscerates the legislature’s remedial purpose of the [BIPA] and impermissibly recasts [it] as one that is penal in nature rather than remedial.”).

Second, the Dissent opined that by construing the statute to allow for awards of statutory damages that bear no relation to any actual monetary injury suffered, the majority’s decision raises due process concerns that “raise doubt as to [the BIPA’s] validity.”  Id. ¶ 74; see also id. ¶ 75 (“The legislature’s authority to set a statutory penalty is limited by the requirements of due process.  When a statute authorizes an award that is so severe and oppressive as to be wholly disproportioned to the offense and obviously unreasonable, it does not further a legitimate government purpose, runs afoul of the due process clause, and is unconstitutional.”).

Finally, the Dissent took issue with the majority’s refusal to clarify its February 17 holding with respect to the discretionary (rather than mandatory) nature of liquidated damages under the statute.  Specifically, the Dissent noted that the majority opinion did not provide trial courts with standards or criteria to apply in determining whether to award statutory damages in a particular BIPA case and, if so, in what amount.  The Dissent asserted that the Supreme Court should have agreed to clarify “that statutory damages awards must be no larger than necessary to serve the [BIPA’s] remedial purposes” and to “explain how lower courts should make that determination.”  Id. ¶ 85.  Per the Dissent, “[w]ithout any guidance regarding the standard for setting damages, defendants, in class actions especially, remain unable to assess their realistic potential exposure.”  Id.

Implications For Corporations

Assuming White Castle cannot convince the U.S. Supreme Court to grant review of the Cothron decision based on constitutional issues, Cothron is now the final law of the land in Illinois.  White Castle and other BIPA defendants may, however, attempt to raise constitutional challenges to the statute in other BIPA cases moving forward based on the same concerns expressed by the three dissenting Justices in Cothron.

The denial of White Castle’s rehearing petition indicates that the well of potential BIPA defenses is beginning to run dry for businesses.  While employers and other BIPA defendants can still explore novel defenses, such as the exception for information captured from a patient in a health care setting or challenges to personal jurisdiction, many companies caught in the crosshairs of BIPA class actions will face pressure to settle due to the risk of facing monumental potential damages.  Moreover, attempts to reform the BIPA statute failed in 2023, and the Illinois legislature likely will not consider any further reform proposals until 2024.  Given the bleak outlook of the law as it stands, it is imperative that businesses immediately ensure they are compliant with the BIPA.

California District Court Gives Green Light To BIPA Claims Brought Against YouTube

By Gerald L. Maatman, Jr. and Tyler Z. Zmick

Duane Morris Takeaways:  In Colombo v. YouTube, LLC, et al., No. 22-CV-6987, 2023 WL 4240226 (N.D. Cal. June 28, 2023), the U.S. District Court for the Northern District of California issued a decision embracing a broad interpretation of the data types that are within the scope of the Illinois Biometric Information Privacy Act (“BIPA”).  The decision puts businesses on notice that the statute may apply to the collection or possession of any “scan of face geometry,” regardless of whether the scan can be used to identify a specific individual – in other words, a “biometric identifier” under the BIPA need not be capable of “identifying” a person.  Colombo v. YouTube, LLC is required reading for corporate counsel facing privacy class action litigation.

Background

Plaintiff’s BIPA claims were premised on two YouTube video editing tools that allegedly resulted in the collection of his “biometric identifiers” and “biometric information” (collectively, “biometric data”) – YouTube’s (1) “Face Blur” tool and (2) “Thumbnail Generator” tool.  Id. at 2-3.  According to Plaintiff, the “Face Blur” tool enables a user to select faces appearing in videos uploaded by the user that he or she may wish to “blur,” resulting in those faces appearing blurry and unrecognizable to any viewer of the videos.  Plaintiff claimed that when someone uses the tool, YouTube scans the uploaded video “to detect all unique faces” and, in doing so, “captures and stores scans of face geometry from all detected faces, creating a unique ‘faceId’ for each.”  Id. at 2 (citation omitted).

Regarding YouTube’s “Thumbnail Generator” feature, Plaintiff described the tool as auto-generating photographic thumbnails (i.e., screenshots from an uploaded video) by scanning videos for faces at the time they are uploaded and using the “face data to auto-generate thumbnails that contain faces.”  Id. (citation omitted).

Based on his alleged use of these two YouTube tools, Plaintiff alleged that YouTube violated Sections 15(a) and 15(b) of the BIPA by (i) failing to develop and comply with a written policy made available to the public establishing a retention policy and guidelines for destroying biometric data, and (ii) collecting his biometric data without providing him with the requisite notice and obtaining his written consent.

YouTube moved to dismiss on three grounds, arguing that: (1) Plaintiff failed to allege that data collected by YouTube qualifies as “biometric data” under the BIPA because YouTube did not (and could not) use the data to identify Plaintiff or others appearing in uploaded videos; (2) Plaintiff’s claims violated Illinois’s extraterritoriality doctrine and the dormant Commerce Clause; and (3) Plaintiff failed to allege that he was “aggrieved” for purposes of his Section 15(a) claim.

The Court’s Decision

The Court denied YouTube’s motion to dismiss on all three grounds.

“Biometric Identifiers” And “Biometric Information”

YouTube first argued that Plaintiff failed to allege that data collected through the Face Blur and Thumbnail Generator tools qualify as “biometric data” under the BIPA because Plaintiff did not plausibly allege that YouTube could use the data to affirmatively identify Plaintiff or other individuals.  See id. at 4 (“In YouTube’s view, biometric identifiers must identify a person and biometric information must actually be used to identify a person.”).

The Court rejected YouTube’s argument, stating that “[t]he point is not well taken.”  Id.  The Court noted the statute’s definition of “biometric identifier” as “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry,” see 740 ILCS 14/10 – a definition that does not explicitly require that the listed data points be capable of identifying a particular person.  While the Court acknowledged that the term “identifier” may suggest that the data must be used to identify a person, the Court also opined that “‘[w]hen a statute includes an explicit definition, we must follow that definition,’ even if it varies from a term’s ordinary meaning.”  Id. at 4 (citation omitted); see also id. at 5 (“[T]he Illinois legislature was perfectly free to define ‘biometric identifier’ in a specific manner that is not tethered to the plain meaning of the word ‘identifier’ alone.”).

Extraterritoriality & Dormant Commerce Clause

The Court also rejected YouTube’s arguments that Plaintiff failed to allege that YouTube’s relevant conduct occurred “primarily and substantially” in Illinois, and that Plaintiff’s interpretation of the BIPA would run afoul of the dormant Commerce Clause.

The Court held that Plaintiff sufficiently alleged that YouTube’s conduct occurred “primarily and substantially” in Illinois, thereby satisfying the extraterritoriality doctrine.  Id. at 5.  Responding to YouTube’s argument that the company’s headquarters and data servers are located outside of Illinois, the Court stated that those facts are “not dispositive” and that “[m]aking the geographic coordinates of a server the most important circumstance in fixing the location of an Internet company’s conduct would . . . effectively gut the ability of states without server sites to apply their consumer protection laws to residents for online activity that occurred substantially within their borders.”  Id. at 6 (citation omitted).

Using the same reasoning, the Court concluded that “YouTube’s dormant Commerce Clause theory fares no better” because YouTube’s allegedly BIPA-violating conduct “cannot be understood to have occurred wholly outside Illinois,” id. at 7 (citation omitted) – i.e., Plaintiff’s claims were based on the application of an Illinois law to Illinois-based YouTube users.

Whether Plaintiff Is “Aggrieved” Under Section 15(a)

Finally, the Court rejected YouTube’s argument that Plaintiff failed to allege that he was “aggrieved” under Section 15(a), which sets forth two requirements for entities in possession of biometric data: (i) to develop a publicly available BIPA-compliant retention policy; and (ii) to comply with that policy.  YouTube argued that Plaintiff failed to allege that he was aggrieved under Section 15(a) because he did not claim that YouTube failed to comply with an existing retention policy as to his biometric data (e.g., that three years had passed since his last interaction with YouTube, yet YouTube had failed to destroy his biometric data).

The Court observed, however, that Plaintiff alleged that YouTube failed to develop and “therefore failed to comply with any BIPA-compliant policy,” which “is enough to move forward . . . [a]t the pleadings stage.”  Id. at 8 (emphasis added) (citation omitted).

Implications For Corporate Counsel

Colombo can be added to the list of recent plaintiff-friendly BIPA decisions, as it endorses an expansive view of the types of data that constitute “biometric data” under the statute.  Indeed, the Colombo ruling suggests that any data that can be characterized as a “scan of face geometry” – regardless of whether the scan can be linked to a specific person to identify him or her – qualifies as a “biometric identifier” within the BIPA’s scope.  Put another way, technology capable of only detecting a category of objects or characteristics in a photo or video (e.g., software that identifies the location of a human face in a photo – as opposed to an arm or leg – without being able to link that face to a specific person) may involve data subject to regulation under the BIPA.

Tennessee Becomes Eighth State To Enact Comprehensive Privacy Legislation

By Gerald L. Maatman, Jr., Jennifer A. Riley, and Tyler Zmick

Duane Morris Takeaways: As efforts to enact comprehensive privacy protection continue to stall on the federal level, states have stepped up to create a patchwork quilt of protections for those doing business with consumers within their borders.  Tennessee recently became the eighth state – following Indiana, California, Colorado, Connecticut, Iowa, Utah, and Virginia – to enact comprehensive privacy legislation.  At least 15 other states have introduced similar bills during the current legislative session, and Montana’s comprehensive consumer privacy statute awaits the signature of its Governor.  Companies doing business in Tennessee or with Tennessee consumers should take heed of the new law and review their policies and processes for compliance.

Tennessee Legislation

After receiving overwhelming support from both houses of the General Assembly, on May 11, 2023, Governor Bill Lee signed the Tennessee Information Protection Act into law.  With this law, Tennessee became the eighth state to institute comprehensive consumer privacy legislation.  The law is set to take effect on July 1, 2024.

The act applies to businesses that conduct business in Tennessee or produce products or services that are targeted to Tennessee residents and that: (1) control or process the personal information of at least 175,000 consumers; or (2) control or process personal information of at least 25,000 consumers and derive more than 50% of their gross revenue from the sale of personal information.  The law contains exemptions for certain types of entities, such as governmental entities, certain financial institutions, non-profit organizations, and higher education institutions.  The law also exempts certain types of data, such as personal information regulated by the Family Educational Rights and Privacy Act, and protected health information under HIPAA.

Similar to other comprehensive state privacy laws, the Tennessee law grants Tennessee residents certain rights in their personal information.  It allows consumers to confirm whether a company is processing their personal information, to access their personal information, to correct inaccuracies in their personal information, to delete their personal information, to obtain copies of their personal information, and to opt out of future sales or targeted advertising.

The law allows a consumer to invoke his or her rights (and the rights of his or her children) at any time by submitting a request to a controller of the personal information specifying the rights that the consumer wishes to invoke, and it requires the respondent to comply with an authenticated request without undue delay but, in all cases, within 45 days.

The law imposes various requirements on persons and entities who “determine[] the purpose and means” of processing personal information.  For example, it requires such persons and entities to limit the collection of personal information to what is adequate, relevant, and reasonably necessary in relation to the purposes for which the data is processed; to establish, implement, and maintain reasonable data security practices; and, if the controller processes or sells personal information for targeted advertising, to clearly and conspicuously disclose the processing, as well as the manner in which a consumer may exercise the right to opt out of the processing.

The Tennessee law does not provide for a private right of action and vests exclusive enforcement authority in the Tennessee attorney general.  It allows a court to impose civil penalties of up to $7,500 per violation, and allows treble damages for willful or knowing violations.  The law requires that, prior to initiating an action, the attorney general provide a 60-day notice period during which the recipient may cure the noticed violation to avoid an enforcement action.  The law also creates an affirmative defense under certain circumstances for a company that creates, maintains, and complies with a written privacy policy that reasonably conforms to documented policies, standards, and procedures designed to safeguard consumer privacy.

Implications for Businesses

Covered persons and entities who do business in Tennessee or who target Tennessee consumers should start reviewing their policies and developing processes to comply with the Tennessee law.  Although the law is not set to take effect until July 1, 2024, the law adds another challenge to the already complex compliance landscape for companies seeking to operate on a nationwide basis.
