Illinois Federal Court Allows Amazon “Alexa” Privacy Class Action To Proceed

By Gerald L. Maatman, Jr. and Tyler Zmick

Duane Morris Takeaways:  In Wilcosky, et al. v. Amazon.com, Inc., et al., No. 19-CV-5061 (N.D. Ill. Nov. 1, 2023), the U.S. District Court for the Northern District of Illinois issued a decision embracing a strict interpretation of the notice a private entity must provide before collecting a person’s biometric data in compliance with the Illinois Biometric Information Privacy Act (“BIPA”).  The decision underscores the importance not only of obtaining written consent before collecting a person’s biometric data, but also of drafting privacy notices that are as specific as possible in informing end users that the company is collecting biometric data and in describing the “specific purpose and length of term for which” the biometric data is being collected. 

In light of the potentially monumental exposure faced by companies defending putative BIPA class actions, companies that operate in Illinois and collect data that could potentially be characterized as “biometric” should review and, if necessary, update their public-facing privacy notices to ensure compliance with the BIPA. 

Case Background


Plaintiffs’ BIPA claims in Wilcosky were premised on their respective interactions with Amazon’s “Alexa” device – a digital assistant that provides voice-based access to Amazon’s shopping application and other services.  According to Plaintiffs, Alexa devices identify individuals who speak within the vicinity of an active device by collecting and analyzing the speaker’s “biometric identifiers” (specifically, “voiceprints”).

In their complaint, Plaintiffs claimed that Amazon identifies people from the sound of their voices after they enroll in Amazon’s “Voice ID” feature on the Alexa Application.  To enroll in Voice ID, a user is taken to a screen notifying him or her that the Voice ID feature “enables Alexa to learn your voice, recognize you when you speak to any of your Alexa devices, and provide enhanced personalization.”  Order at 3.  A hyperlink to the Alexa Terms of Use is located at the bottom of the enrollment screen, which Terms state that Voice ID “uses recordings of your voice to create an acoustic model of your voice characteristics.”  Id. at 8.  Before completing the Voice ID enrollment process, a user must agree to the Alexa Terms of Use and authorize “the creation, use, improvement, and storage” of his or her Voice ID by tapping an “Agree and Continue” button.  Id. at 3.

Among the four named Plaintiffs, three had enrolled in Voice ID using their respective Alexa devices (the “Voice ID Plaintiffs”).  One Plaintiff, Julia Bloom Stebbins, did not enroll in Voice ID; rather, she alleged that she spoke in the vicinity of Plaintiff Jason Stebbins’s Alexa device, resulting in Alexa collecting her “voiceprint” to determine whether her voice “matched” the Voice ID of Plaintiff Jason Stebbins.

Based on their alleged interactions with Alexa, Plaintiffs claimed that Amazon violated Sections 15(b), 15(c), and 15(d) of the BIPA by (i) collecting their biometric data without providing them with the requisite notice and obtaining their written consent, (ii) impermissibly “profiting from” their biometric data, and (iii) disclosing their biometric data without consent.

Amazon moved to dismiss Plaintiffs’ complaint on the basis that: (1) the Voice ID Plaintiffs received the required notice and provided their written consent by completing the Voice ID enrollment process; and (2) Plaintiff Bloom Stebbins never enrolled in Voice ID – meaning she was a “total stranger” to Amazon such that Amazon could not possibly identify her based on the sound of her voice.

The Court’s Decision

The Court denied Amazon’s motion to dismiss in a 15-page order, focusing primarily on Amazon’s arguments relating to Plaintiffs’ Section 15(b) claim.

Sufficiency Of Notice Provided To Voice ID Plaintiffs

Regarding the requirements of Section 15(b), the Court noted that a company collecting biometric data must first: (1) inform the individual that biometric data is being collected or stored; (2) inform the individual of the specific purpose and length of term for which the biometric data is being collected, stored, and used; and (3) receive a written release signed by the individual.

In moving to dismiss the Voice ID Plaintiffs’ Section 15(b) claim, Amazon argued that those three Plaintiffs received all legally required notices during the Voice ID enrollment process.  During that process, Amazon explained how Voice ID works and informed users that the technology creates an acoustic model of a user’s voice characteristics.  Amazon maintained that notice language need not track the exact language set forth in Section 15(b) because the BIPA does not require that any particular statutory language be provided to obtain a person’s informed consent.  Id. at 6 (noting Amazon’s argument that “Voice ID Plaintiffs’ voiceprints were collected in circumstances under which any reasonable consumer should have known that his or her biometric information was being collected”).

The Court adopted Plaintiffs’ stricter reading of Section 15(b), holding that the complaint plausibly alleged that Amazon’s disclosures did not fully satisfy Section 15(b)’s notice requirements.  While Amazon may have informed users that Voice ID enables Alexa to learn their voices and recognize them when they speak, Amazon did not specifically inform users that it was “collecting and capturing the enrollee’s voiceprint, a biometric identifier.”  Id. at 8.  As a result, and while acknowledging that it was “a close call,” the Court denied Amazon’s motion to dismiss the Section 15(b) claim asserted by the Voice ID Plaintiffs.

Application Of The BIPA To “Non-User” Plaintiff Julia Bloom Stebbins

The Court next turned to Plaintiff Bloom Stebbins, who did not create an Alexa Voice ID but alleged that Amazon collected her “voiceprint” when she spoke in the vicinity of Plaintiff Jason Stebbins’s Alexa device.  Amazon argued that her Section 15(b) claim failed because the BIPA was not meant to apply to someone in her shoes – that is, a stranger to Amazon “who Amazon has no means of identifying.”  Id. at 11.

The Court rejected Amazon’s argument.  In doing so, the Court refused to read Section 15(b)’s requirements as applying only where a company has some relationship with an individual.  According to the Court, that interpretation would amount to “read[ing] a requirement into the statute that does not appear in the statute itself.”  Id. at 12; see also id. (“[C]ourts in this Circuit have rejected the notion that to state a claim for a Section 15(b) violation, there must be a relationship between the collector of the biometric information and the individual.”).

Implications For Companies


Wilcosky is required reading for corporate counsel at companies that are facing privacy-related class actions and/or want to ensure their consumer-facing or employee-facing privacy disclosures contain all notices required under applicable law.

The Wilcosky decision endorses a strict view regarding the notice a company must provide to individuals to fully comply with Section 15(b) of the BIPA.  To ensure compliance, companies should provide end users with language that is as specific as possible regarding the type(s) of data being collected (including the fact that the data may be “biometric”), the purpose for which the data is being collected, and the time period during which the data will be stored.  The notice should closely track the BIPA’s statutory text, and companies should also require individuals to affirmatively express that they have received the notice and agree to the collection of their biometric data.  (Despite a footnote stating that the Court’s order in Wilcosky should not “be interpreted to mean that . . . a disclosure must parrot the exact language of BIPA in order to satisfy Section 15(b),” id. at 8 n.3, the Court does not explain how a disclosure could satisfy Section 15(b) without tracking the statute’s language verbatim.)

Moreover, Wilcosky raises the question whether a company should characterize data it collects as “biometric” data in its privacy notice – even if the company maintains (perhaps for good reason) that the data does not constitute biometric data subject to regulation under the BIPA.  Further complicating this question is the fact that the precise contours of the types of data that qualify as “biometric” under the BIPA are unclear and are currently being litigated in many cases.  Companies may wish to err on the “safe side” and refer to the data being collected as “biometric” data in their privacy notices.
