Spygate 2.0? New England Patriots Sued In VPPA Privacy Class Action

By Alex W. Karasik and Gerald L. Maatman, Jr.

Duane Morris Takeaways:  On February 1, 2024, a football fan filed a class action lawsuit against the New England Patriots in a Massachusetts federal court, alleging that the football team’s mobile app (the “App”) knowingly disclosed users’ location data and personal information to third parties in alleged violation of the Video Privacy Protection Act (“VPPA”). This lawsuit is the latest in a series of high-profile VPPA class action filings, which have spiked significantly in the last two years.

Although the recent tide of VPPA class action court rulings has generally tipped in favor of defendants, the plaintiffs’ class action bar is still exploring novel theories to bring these high-stakes cases. Companies must therefore pay close attention to privacy-related issues involving mobile applications, including what data is collected and to whom it is transmitted.


Congress passed the VPPA in 1988.  The statute imposes liability on “[a] video tape service provider who knowingly discloses, to any person, personally identifiable information concerning any consumer of such provider.”  18 U.S.C. § 2710(b)(1).  A “video tape service provider” is defined as “any person, engaged in the business, in or affecting interstate or foreign commerce, of rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials.”  Id. § 2710(a)(4).  “Personally identifiable information” (“PII”) is defined as “information which identifies a person as having requested or obtained specific video materials or services from a video service provider.”  Id. § 2710(a)(3).  In essence, the statute purports to account for advancements in video-delivery technology by defining a “video tape service provider” broadly to include any business engaged in the “rental, sale, or delivery of prerecorded video cassette tapes or similar audio visual materials.”  Id. § 2710(a)(4).

The New VPPA Class Action Lawsuit

Plaintiff alleges that he downloaded and installed the App to his mobile phone and regularly used it to access video content.  Id. at 2.  When downloading the App, users are presented with an option to sign into an existing account, create a new account, or continue without signing in by selecting “MAYBE LATER.”  Id. at 4-5.  Plaintiff alleges that consumers who select “MAYBE LATER” are not presented with the App’s Terms of Use or Privacy Policy.  And even if users select “JOIN NOW”, they are redirected to a login screen where they have the option to log in, but are not required to view or assent to any terms of use or privacy policy unless they take additional steps to create an account.  Id. at 5.

In terms of data collection, the lawsuit alleges that when a user opens a video on the App, the App sends the content type, video title, and a persistent identifier to the user’s device. The App then transmits to third parties the user’s information, including location (in geographical coordinates and altitude), advertising ID, and video content consumption. Id. at 6. According to the complaint, the New England Patriots allegedly leverage users’ geolocation data to maximize advertising revenue, which requires uniquely identifying their users. For Android users, the complaint alleges that the Patriots share with third parties a unique advertising ID, called an Android Advertising ID (“AAID”), for each of their users, which enables a third party to track the user’s movements, habits, and activity on mobile applications.  Id. at 10.
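To make the alleged data flow concrete, the following sketch illustrates the kind of event payload a mobile app might transmit to a third-party analytics or advertising service. All field names, values, and structure here are hypothetical illustrations; none come from the complaint itself.

```python
import json

# Hypothetical illustration only: the kind of event record a mobile app
# might transmit to a third-party analytics endpoint. Field names and
# values are invented for illustration and do not come from the complaint.
event = {
    "advertising_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # AAID-style persistent identifier
    "video_title": "Top 10 Plays of Week 12",                  # specific video content viewed
    "content_type": "highlight",
    "location": {"lat": 42.09, "lon": -71.26, "altitude_m": 89.0},  # geolocation with altitude
}

# Under the VPPA theory described above, it is the pairing of a persistent
# identifier with a specific video title that allegedly constitutes
# "personally identifiable information."
payload = json.dumps(event)
print(payload)
```

The illustrative point is that each field in isolation may seem innocuous; the complaint’s theory turns on their combination in a single transmission to a third party.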

Accordingly, the lawsuit alleges that through the New England Patriots’ dissemination of consumers’ PII, third parties such as Google can collect and store billions of metrics and events and make it easier for clients to make data-driven decisions, and these reports are continuously updated and metrics are reported as they occur.  Id. at 16.  Plaintiff seeks to represent a class defined as “All persons in the United States who used the Patriots App to watch videos and had their personally identifiable information — including but not limited to the videos they watched, their geolocation, and their unique advertising IDs — transmitted to one or more third parties.”  Id.  On behalf of the class, Plaintiff seeks an award of damages, including, but not limited to, actual, consequential, punitive, statutory, and nominal damages.

Implications For Businesses

This lawsuit represents another example of class action plaintiffs’ lawyers using traditional state and federal laws – including the long dormant VPPA – to seek relief for alleged privacy violations.  In applying modern technologies to older laws like the VPPA (passed in 1988), courts have grappled with issues such as the determination of who qualifies as a “video tape service provider” or a “consumer” under the statute. It will be interesting to follow this lawsuit to see whether the Court follows the recent trend of courts dismissing VPPA class actions.

That said, this high-profile filing also suggests that companies should regularly update their online consent provisions as needed to specifically address the VPPA. Businesses that proactively implement compliance mechanisms will thank themselves later in terms of preventing class action litigation.

California Court Dismisses Artificial Intelligence Employment Discrimination Lawsuit

By Alex W. Karasik, Gerald L. Maatman, Jr. and George J. Schaller

Duane Morris Takeaways:  In Mobley v. Workday, Inc., Case No. 23-CV-770 (N.D. Cal. Jan 19, 2024) (ECF No. 45), Judge Rita F. Lin of the U.S. District Court for the Northern District of California dismissed a lawsuit against Workday involving allegations that algorithm-based applicant screening tools discriminated against applicants on the basis of race, age, and disability. With businesses more frequently relying on artificial intelligence to perform recruiting and hiring functions, this ruling is helpful for companies facing algorithm-based discrimination lawsuits in terms of potential strategies to attack such claims at the pleading stage.

Case Background

Plaintiff, an African-American male over the age of forty with anxiety and depression, alleged that he applied to 80 to 100 jobs with companies that use Workday’s screening tools. Despite holding a bachelor’s degree in finance and an associate’s degree in network systems administration, Plaintiff claimed he did not receive a single job offer. Id. at 1-2.

On July 19, 2021, Plaintiff filed an amended charge of discrimination with the Equal Employment Opportunity Commission (“EEOC”). On November 22, 2022, the EEOC issued a dismissal and notice of right to sue. On February 21, 2023, Plaintiff filed a lawsuit against Workday, alleging that Workday’s tools discriminated against job applicants who are African-American, over the age of 40, and/or disabled in violation of Title VII, the ADEA, and the ADA, respectively.

Workday moved to dismiss the complaint, arguing that Plaintiff failed to exhaust administrative remedies with the EEOC as to his intentional discrimination claims; and that Plaintiff did not allege facts to state a plausible claim that Workday was liable as an “employment agency” under the anti-discrimination statutes at issue.

The Court’s Decision

The Court granted Workday’s motion to dismiss. First, the Court noted the parties did not dispute that Plaintiff’s EEOC charge sufficiently exhausted the disparate impact claims. However, Workday moved to dismiss Plaintiff’s claims for intentional discrimination under Title VII and the ADEA on the basis of his failure to exhaust administrative remedies. Workday argued that the EEOC charge alleged only claims for disparate impact, not intentional discrimination.

Rejecting Workday’s argument, the Court held that it must construe the language of the EEOC charge with “utmost liberality since they are made by those unschooled in the technicalities of formal pleading.” Id. at 5 (internal quotation marks and citations omitted). The Court acknowledged that the thrust of Plaintiff’s factual allegations in the EEOC charge concerned how Workday’s screening tools discriminated against Plaintiff based on his race and age. However, the Court held that those claims were reasonably related to his intentional discrimination claims, and that the EEOC investigation into whether the tools had a disparate impact or were intentionally biased would be intertwined. Accordingly, the Court denied Workday’s motion to dismiss on the basis of failure to exhaust administrative remedies.

Next, the Court addressed Workday’s argument that Mobley did not allege facts to state a plausible claim that it was liable as an “employment agency” under the anti-discrimination statutes at issue. The Court opined that Plaintiff did not allege facts sufficient to state a claim that Workday was “procuring” employees for these companies, as required for Workday to qualify as an “employment agency.” Id. at 1. For example, Plaintiff did not allege details about his application process other than that he applied to jobs with companies using Workday, and did not land any job offers. The complaint also did not allege that Workday helped recruit and select applicants.

In an attempt to salvage his claims at the motion hearing and in his opposition brief, Plaintiff identified two other potential legal bases for Workday’s liability — as an “indirect employer” and as an “agent.” Id. To give Plaintiff an opportunity to correct these deficiencies, the Court granted Workday’s motion to dismiss on this basis, but with leave for Plaintiff to amend. Accordingly, the Court granted in part and denied in part Workday’s motion to dismiss.

Implications For Businesses

Artificial intelligence and algorithm-based applicant screening tools are game-changers for companies in terms of streamlining their recruiting and hiring processes. As this lawsuit highlights, these technologies also invite risk in the employment discrimination context.

For technology vendors, this ruling illustrates that novel arguments about the formation of the “employment” relationship could potentially be fruitful at the pleading stage. However, the Court’s decision to let Plaintiff amend the complaint and have one more bite at the apple means Workday is not off the hook just yet. Employers and vendors of recruiting software would be wise to pay attention to this case – and the anticipated wave of employment discrimination lawsuits that are apt to be filed – as algorithm-based applicant screening tools become more commonplace.

Illinois Federal Court Partially Dismisses Class Action Privacy Claims Involving “Eufy” Security Cameras

By Gerald L. Maatman, Jr., Alex W. Karasik, and Tyler Zmick

Duane Morris Takeaways:  In Sloan, et al. v. Anker Innovations Ltd., No. 22-CV-7174 (N.D. Ill. Jan. 9, 2024), Judge Sarah Ellis of the U.S. District Court for the Northern District of Illinois granted in part a motion to dismiss privacy claims brought against the companies that manufacture and sell “eufy” security products.  The Court dismissed the claims asserted under the federal Wiretap Act because Defendants were “parties” to the communication during which the eufy products sent security recordings to Plaintiffs’ mobile devices (notwithstanding that the products also sent the data to a server owned by Defendants).  In addition, the Court partially dismissed Plaintiffs’ claims under the Illinois Biometric Information Privacy Act and under four state consumer protection statutes, thereby allowing Plaintiffs to proceed with their case only with respect to some of their claims.

For businesses who are embroiled in facial recognition software and related privacy class actions, this ruling provides a helpful roadmap for fracturing such claims at the outset of the lawsuit.

Case Background

Plaintiffs were individuals from various states who purchased and used Defendants’ “eufy” branded home security cameras and video doorbells.  The eufy products can, among other things, detect motion outside a person’s home and apply a facial recognition program to differentiate “between known individuals and strangers by recognizing biometric identifiers and comparing the face template against those stored in a database.”  Id. at 3.  Eufy products sync to a user’s phone through eufy’s Security app, which notifies a user of motion around the camera by sending the user a recorded thumbnail image or text message.

Defendants advertised that the video recordings and facial recognition data obtained through eufy cameras are stored locally on user-owned equipment and that the data would be encrypted so that only the user could access it.  Media reports later revealed, however, that the eufy products uploaded the thumbnail images used to notify users of movement to Defendants’ cloud storage without encryption, and that users could stream content from their videos through unencrypted websites.

Claiming they relied to their detriment on Defendants’ (allegedly false) privacy-related representations when purchasing the eufy products, the eight named Plaintiffs filed a putative class action against corporate Defendants involved in the manufacture and sale of “eufy” products.  In their complaint, Plaintiffs asserted that Defendants violated: (1) the Federal Wiretap Act; (2) the Biometric Information Privacy Act (the “BIPA”); and (3) the consumer protection statutes of Illinois, New York, Massachusetts, and Florida.  Defendants moved to dismiss Plaintiffs’ claims under Federal Rule of Civil Procedure 12(b)(6).

The Court’s Decision

The Court granted in part and denied in part Defendants’ motion, holding that: (1) the Wiretap Act claim should be dismissed because Defendants were a party to the relevant communication (i.e., the transmission of data from eufy products to Plaintiffs via the eufy Security app); (2) the BIPA claims should be dismissed as to non-Illinois resident Plaintiffs; and (3) the claims brought under the relevant consumer protection statutes should be dismissed only to the extent they were premised on certain of Defendants’ public-facing privacy statements.

Wiretap Act Claims

The Court first addressed Plaintiffs’ Wiretap Act claims, explaining that the statute “empowers a private citizen to bring a civil claim against someone who ‘intentionally intercepts [or] endeavors to intercept . . . any wire, oral, or electronic communication.’”  Id. at 8 (quoting 18 U.S.C. § 2511(1)(a)).

Defendants argued that Plaintiffs failed to state a claim under the Wiretap Act because the statute does not apply to a party to the relevant communication.  Specifically, the Wiretap Act exempts a person who intercepts an electronic communication “where such person is a party to the communication or where one of the parties to the communication has given prior consent to such interception.”  18 U.S.C. § 2511(2)(d).

The Court agreed with Defendants and thus dismissed Plaintiffs’ Wiretap Act claim.  The Court described the relevant “communication” as the transmission of data from eufy products to Plaintiffs’ devices and explained that the transmission “is not between the eufy product and Plaintiffs, but rather between the eufy product and the eufy Security app, which Defendants own and operate.  As such, the communication necessarily requires Defendants’ participation, even if Plaintiffs did not intend to share their information with Defendants.”  Id. at 8-9 (emphasis added).  The Court thus held that Defendants were parties to the communication, and the fact that Defendants also uploaded the data to their own server (without Plaintiffs’ knowledge) did not change that conclusion.

BIPA Claims

Regarding Plaintiffs’ BIPA claims, Defendants argued that Plaintiffs failed to allege that the relevant data (which Defendants described as “thumbnail images”) qualifies for protection under the BIPA because photographs are not biometric data under the statute.  The Court rejected this argument since Plaintiffs alleged that Defendants uploaded thumbnail information and facial recognition data (namely, “scans of face geometry”) to their server.

The Court agreed with Defendants’ second argument, however, which asserted that Plaintiffs’ BIPA claim failed to the extent it was brought by or on behalf of Plaintiffs who are not Illinois residents.  The BIPA applies only where the underlying conduct occurs “primarily and substantially” in Illinois.  The Court determined that the relevant communications between Plaintiffs and Defendants “occurred primarily and substantially in the state of residency for each Plaintiff.”  Id. at 12-13.  And the End User License Agreement for eufy Camera Products and the Security App stating that the agreement is governed by Illinois law did not change the result that the BIPA claim brought by non-Illinois residents must be dismissed.

Statutory Consumer Protection Claims

Finally, the Court turned to Defendants’ contentions relative to the alleged violations of the four state consumer protection statutes.  In beginning its analysis, the Court explained that “[t]o state a claim for deceptive practices under any of the alleged state consumer fraud statutes, Plaintiffs must allege a deceptive statement or act that caused their harm.”  Id. at 14.  Moreover, “a statement is deceptive if it creates a likelihood of deception or has the capacity to deceive.”  Id. at 15 (citation omitted); see also id. (noting that “the allegedly deceptive act must be looked upon in light of the totality of the information made available to the plaintiff”) (citation omitted).  Defendants argued in their motion to dismiss that Plaintiffs did not allege cognizable deceptive statements because the statements at issue constitute either puffery or are not false.

The Court dismissed Plaintiffs’ statutory fraud claims in part.  Specifically, the Court held that Defendants’ advertising in the form of certain “statements relating to privacy” (e.g., “your privacy is something that we value as much as you do”) constituted nonactionable “puffery.”  Id. at 16.  The Court therefore dismissed Plaintiffs’ statutory fraud claims insofar as they were premised on the similarly vague “statements relating to privacy.”

However, the Court denied Defendants’ attempt to dismiss the claims premised on their more specific statements about (1) end-user data being stored only on a user’s local device, (2) the use of alleged facial recognition, and (3) end-user data being encrypted.  Defendants argued that these were “accurate statements” and thus could not serve as the basis for consumer fraud claims.  The Court disagreed, ruling that Plaintiffs sufficiently alleged that the storage, encryption, and facial recognition statements may have misled a reasonable consumer.  Accordingly, the Court granted in part and denied in part Defendants’ motion to dismiss.

Implications For Corporate Counsel

The most significant aspect of Sloan v. Anker Innovations Limited is the Court’s analysis of Plaintiffs’ Wiretap Act claims, given the rapidly emerging trend among the plaintiff class action bar of using traditional state and federal laws – including the Wiretap Act – to seek relief for alleged privacy violations.  In applying modern technologies to older laws like the Wiretap Act (passed in 1986), courts have grappled with issues such as the determination of who is a “party to the communication” such that an entity is exempt from the statute’s scope.  As data exchanges and data storage become more complex, the “party to the communication” determination reciprocally becomes more nebulous.

In Sloan, the “communication” was the eufy products transmitting data to Plaintiffs’ devices and “contemporaneously intercept[ing] and sen[ding] [the data] to [Defendant’s] server.”  Id. at 8 (citation omitted).  Because Plaintiffs had to use the eufy Security app to access the data, and because Defendants owned and operated the app, the Court determined that Defendants necessarily participated in the communication.  But the result may have been different if, for instance, Plaintiffs could use a different app (one not owned by Defendants) to access the data, or if, unbeknownst to Plaintiffs, the eufy Security app was actually owned and operated by a third-party entity.  The upshot is that corporate counsel should keep these principles in mind with respect to any data-flow processes regarding end-user or employee data.
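As a practical starting point for that kind of data-flow review, counsel might ask engineering teams to inventory outbound transmissions and flag those that pair a persistent identifier with content-viewing data sent to a third party — the combination at the heart of the VPPA and Wiretap Act theories discussed above. The sketch below is a hypothetical illustration of such a check; the field names and flow records are invented, not drawn from any of the cases discussed.

```python
# Hypothetical compliance sketch: flag outbound data flows that send both a
# persistent identifier and video-viewing data to a third party. Field
# names are invented for illustration.
SENSITIVE_IDENTIFIERS = {"advertising_id", "device_id", "email"}
VIDEO_FIELDS = {"video_title", "watch_history", "content_type"}

def flag_risky_flows(data_flows):
    """Return recipients of flows that combine a persistent identifier
    with video-viewing data and go to a third party."""
    risky = []
    for flow in data_flows:
        fields = set(flow["fields"])
        if (flow["recipient_is_third_party"]
                and fields & SENSITIVE_IDENTIFIERS
                and fields & VIDEO_FIELDS):
            risky.append(flow["recipient"])
    return risky

flows = [
    {"recipient": "first-party-server", "recipient_is_third_party": False,
     "fields": ["advertising_id", "video_title"]},
    {"recipient": "analytics-vendor", "recipient_is_third_party": True,
     "fields": ["advertising_id", "video_title"]},
]
print(flag_risky_flows(flows))  # prints ['analytics-vendor']
```

An inventory of this sort does not answer the legal question of who is a “party to the communication,” but it surfaces the flows for which that question needs to be asked.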

The Brave New World: President Biden Signs Executive Order On Use Of Artificial Intelligence 

By Gerald L. Maatman, Jr., Alex W. Karasik, and George J. Schaller

Duane Morris Takeaways: On October 30, 2023, President Biden signed an Executive Order (the “EO”) providing guidance for employers on the emerging utilization of Artificial Intelligence in the workplace.  The EO establishes industry standards for AI security, innovation, and safety across significant employment sectors. Spanning over 100 pages, the robust EO endeavors to set parameters for responsible AI use, seeking to harness AI for good while mitigating risks associated with AI usage.

For businesses who utilize AI software in their employment decision-making processes, the EO delineates beneficial from harmful AI use and promotes a principled plan for advancing the former.

Security, Innovation, And Safety With AI

AI’s significant developments in such a short period have required policymakers to keep up with the ever-changing AI landscape.  President Biden’s EO manifests the White House’s commitment to AI use in a safe and secure manner.  The EO also signals a commitment to promoting responsible innovation, competition, and collaboration to propel the United States to lead in AI and unlock the technology’s potential.  At the same time, the EO focuses on AI implications for workplaces and problematic AI usage.

AI And Employment Issues

In the White House’s continued dedication to advance equity and civil rights, the EO purports to commit to supporting American workers.  As AI creates new jobs and industries, the EO maintains that all workers should be included in benefiting from AI opportunities. As to the workplace, the EO asserts that responsible AI use will improve workers’ lives, positively impact human work, and help all to gain from technological innovation. Nonetheless, the EO opines that irresponsible AI use could undermine workers’ rights.

The EO further contemplates protections for Americans who increasingly interact with AI, and signals that organizations will not be excused from their legal obligations.  Chief among these protections is the continued enforcement of existing safeguards against fraud, unintended bias, discrimination, infringements on privacy, and other harms from AI.  The White House seeks parity across Federal Government enforcement efforts and the creation of new, appropriate safeguards against harmful AI use.

Significantly, within 180 days of issuing the EO, the Secretary of Labor is tasked with consulting with agencies and outside entities (including labor unions and workers) to develop and publish principles and best practices for employers to maximize AI’s potential benefits.  In so doing, the key principles and best practices are to address job displacement, labor standards and job quality, and employers’ AI-related collection and use of worker data.  These principles and best practices further aim to prevent any harms to employees’ well-being.

Implications For Employers

This lengthy order should alert employers that AI is here to stay and the perils of AI use will change as the technology further augments the modern workforce.

As AI becomes more ingrained in employment, employers should be mindful of the guidance developed in the EO and should stay up to date on any legislation that stems from AI usage. If businesses have not been paying attention to AI developments, now is the time to start.

EEOC Issues New Guidance On Harassment In The Workplace

By Gerald J. Maatman, Jr., Alex W. Karasik, and Derek Franklin

Duane Morris Takeaways:  On September 29, 2023, the EEOC issued a new Proposed Enforcement Guidance on Harassment in the Workplace (the “Guidance”).  The Guidance provides insights into how employers can handle evolving workplace realities and developing trends with harassment claims. Notably, the Guidance addresses how digital technology and social media postings can contribute to a hostile work environment.  It also addresses the U.S. Supreme Court’s 2020 landmark decision in Bostock v. Clayton County, where the Supreme Court held that discrimination based on sexual orientation or gender identity constitutes sex-based discrimination under Title VII of the Civil Rights Act of 1964 (“Title VII”).  The Guidance is open to public comment through November 1, 2023; if issued in final form, it will mark the first update to the EEOC’s official harassment guidance in nearly 25 years.

For employers, the Guidance is a “must read” in terms of preventing future workplace harassment claims.

Workplace Harassment In The Digital Landscape

The Guidance spotlights how social media postings and other online content can contribute to hostile work environments, even if it occurs outside of the workplace and is not work-related.  For instance, the Guidance cites the following examples of conduct occurring in an employee’s “virtual work environment” that employers can be liable for: “[a] sexist comments made during a video meeting, [b] racist imagery that is visible in an employee’s workspace while the employee participates in a video meeting, or [c] sexual comments made during a video meeting about a bed being near an employee in the video image.”

In addition to discussing conduct occurring in a “virtual work environment,” the Guidance also clarifies that conduct occurring in non-work-related contexts can contribute to a hostile work environment if it impacts the workplace.  This includes electronic communications through phones, computers, and social media.  For example, the Guidance cautions that, if an employee’s private social media posting subjects a co-worker to racial epithets, and other co-workers discuss the posting at work, then that posting “can contribute to a racially hostile work environment.”

Harassment Based On Sexual Orientation And Gender Identity

Another notable aspect of the Guidance is that it incorporates the U.S. Supreme Court’s 2020 landmark decision in Bostock v. Clayton County, 140 S. Ct. 1731, 1747 (2020), which held that Title VII’s prohibition of sex-based discrimination encompasses discrimination based on sexual orientation and gender identity.

While Bostock concerned an allegedly discriminatory employment discharge and did not involve harassment, the EEOC states in the Guidance that the Supreme Court’s reasoning “logically extends to claims of harassment.”  The Guidance therefore dictates that “sex-based harassment includes harassment on the basis of sexual orientation and gender identity, including how that identity is expressed.”

The Guidance lists several examples of conduct that can constitute this type of harassment, including: “[a] epithets regarding sexual orientation or gender identity; [b] physical assault; [c] harassment because an individual does not present in a manner that would stereotypically be associated with that person’s gender; [d] intentional and repeated use of a name or pronoun inconsistent with the individual’s gender identity (misgendering); or [e] the denial of access to a bathroom or other sex-segregated facility consistent with the individual’s gender identity.”

The EEOC also includes a hypothetical fact pattern in the Guidance depicting harassment based on gender identity.  In that hypothetical, supervisors and co-workers of a fast food employee who identifies as female commonly referred to the employee using her prior male name and pronouns, asked questions about her sexual orientation and anatomy, and asserted that she was not female.  In addition, customers “intentionally misgendered” the employee and “made threatening statements to her,” which the employer only responded to by reassigning the employee to a workstation where customers could not see her.  These facts, according to the EEOC, established harassment based on gender identity and, therefore, sex-based discrimination under Title VII.

Takeaways For Employers

The Guidance is a “must read” resource for employers to navigate potential harassment concerns.  It provides employers with an opportunity to revise their policies and protocols to better reflect the current legal landscape and the evolution of digital technology.  The Guidance also highlights the EEOC’s emphasis on enforcing Title VII’s prohibition of harassment based on sexual orientation and gender identity.

Employers should review their policies and practices to ensure they adequately protect against, and provide avenues to report, potential harassment that takes place virtually.  Likewise, employers may wish to consider incorporating examples of harassment given by the EEOC when implementing harassment prevention measures.

EEOC’s September Spree Of Filings Caps Off Landmark Year In FY 2023

By Gerald L. Maatman, Jr., Alex W. Karasik, George J. Schaller, and Jennifer A. Riley

Duane Morris Takeaways:  In FY 2023, the EEOC’s litigation enforcement activity showed that any previous slowdown due to the COVID-19 pandemic is well in the rearview mirror, as the total number of lawsuits filed by the EEOC increased from 97 in FY 2020 to a whopping 144 in FY 2023. Per tradition, September 2023 was a busy month for EEOC-initiated litigation, as September marks the end of the EEOC’s fiscal year. This year, 67 lawsuits were filed in September, up from the 39 filed in September of FY 2022.

Overall, the FY 2023 lawsuit filing data confirms that EEOC litigation is back in full throttle, with no signs of slowing down. Employers should take heed. Amplifying that activism, the Commission issued a press release at the end of the fiscal year touting its increased enforcement litigation activity – an unprecedented media statement of a kind the EEOC had not issued in previous years.

Lawsuit Filings Based On EEOC District Offices

In addition to tracking the total number of filings, we closely monitor which of the EEOC’s 15 district offices are most actively filing new cases over the year and throughout September. Some districts tend to be more aggressive than others, and some focus on different case filing priorities. The following chart shows the number of lawsuit filings by EEOC district offices.

In FY 2023, the Philadelphia District Office had by far the most lawsuit filings with 19, followed by Indianapolis and Chicago with 13 filings each, and New York and Los Angeles with 10 filings each. Charlotte, Atlanta, Dallas, Phoenix, and Memphis had 9 each; Houston had 8; Miami, Birmingham, and St. Louis had 7 each; and San Francisco had 5 filings.

The most noticeable trend of FY 2023 is the filing deluge in Philadelphia (19 lawsuits), compared to FY 2022, when the Philadelphia District Office filed 7 lawsuits. Similarly, Indianapolis ramped up its filings from 7 in FY 2022.  Like FY 2022, Chicago remained steady near the top of the list with 13 filings.  Los Angeles had a slight increase over its 8 filings in FY 2022.  Going in the other direction, Miami’s filings fell slightly from its 8 filings in FY 2022.  Finally, both New York and Charlotte increased their filings from FY 2022, with New York substantially increasing from 7 filings, and Charlotte moderately increasing from 7 filings.

The balance across various District Offices throughout the country confirms that the EEOC’s aggressiveness is in peak form, both at the national and regional level.

Lawsuit Filings Based On Type Of Discrimination

We also analyzed the types of lawsuits the EEOC filed, in terms of the statutes and theories of discrimination alleged, in order to determine how the EEOC is shifting its strategic priorities.

When considered on a percentage basis, the distribution of cases filed by statute remained roughly consistent between FY 2023 and FY 2022. Title VII cases once again made up the majority of cases filed, accounting for 68% of all filings (down slightly from 69% in FY 2022, and significantly above the 61% in FY 2021). ADA cases also made up a significant percentage of the EEOC’s filings, totaling 34%, up from 29.7% in FY 2022, although down from the 37% in FY 2021. There were also 12 ADEA cases filed in FY 2023, up from 7 age discrimination cases in FY 2022.

The graphs below show the number of lawsuits filed according to the statute under which they were filed (Title VII, Americans With Disabilities Act, Pregnancy Discrimination Act, Equal Pay Act, and Age Discrimination in Employment Act) and, for Title VII cases, the theory of discrimination alleged.

Lawsuit Filings Based On Industry

The graphs below show the number of lawsuits filed by industry.  Three industries were the primary targets of lawsuit filings in FY 2023:  Restaurants with 28 filings, Retail with 24 filings, and Healthcare with 24 filings.  Not far behind are Manufacturing with 15 filings; Construction with 7 filings; Automotive, Security, and Transportation with 6 filings each; and Technology with 5 filings.

Hospitality and Healthcare employers should be keenly aware of the EEOC’s enforcement of alleged discriminatory practices in these sectors.  But in reality, employers in nearly any industry are vulnerable to EEOC-initiated litigation, as detailed in the graph below.

Looking Ahead To Fiscal Year 2024

Moving into FY 2024, the EEOC’s budget includes a $26.069 million increase over 2023 and focuses on six key areas, including: advancing racial justice and combating systemic discrimination on all protected bases; protecting pay equity; supporting diversity, equity, inclusion, and accessibility (DEIA); addressing the use of artificial intelligence in employment decisions; and preventing unlawful retaliation.

The EEOC also announced goals for its own Diversity, Equity, Inclusion, and Accessibility (DEIA) program, through which it seeks to achieve four goals: workplace diversity, employee equity, inclusive practices, and accessibility. Additionally, the EEOC continues to refine its FY 2021 initiatives addressing artificial intelligence, machine learning, and other emerging technologies in continued efforts to provide guidance.  Finally, the joint anti-retaliation initiative among the EEOC, the U.S. Department of Labor, and the National Labor Relations Board will continue to address retaliation in American workplaces.

Key Employer Takeaways

In sum, FY 2023 was a year of new leadership and structural changes at the EEOC.  With a significantly increased proposed budget, it is more crucial than ever for employers to pay close attention to the EEOC’s strategic priorities and enforcement agenda.  We anticipate these figures will grow by next year’s report, making compliance with discrimination laws all the more essential.

Key Takeaways From The EEOC’s Strategic Plan For Fiscal Years 2022-2026

By Gerald L. Maatman, Jr., Alex W. Karasik, and George J. Schaller

Duane Morris Takeaways: On August 22, 2023, the EEOC announced the approval of its Strategic Plan (“SP”) for Fiscal Years 2022-2026.  The Strategic Plan can be accessed here.  The SP furthers the EEOC’s mission of preventing and remedying unlawful employment discrimination and advancing equal employment opportunity for all.  The SP focuses on: (1) Enforcement; (2) Education and Outreach; and (3) Organizational Excellence. The SP also provides performance measures for each strategic goal.  For corporate counsel involved in employment-related compliance and EEOC litigation, the new SP is required reading.

The EEOC’s Strategic Priorities

  1. Enforcement

The EEOC continues to promote equitable employment initiatives through its enforcement authority.  The SP highlights the EEOC’s primary mission of preventing unlawful employment discrimination through its administrative and litigation enforcement mechanisms, and adjudicatory and oversight processes.  The main strategic focus for employing these mechanisms is through fair and efficient enforcement based on the circumstances of each charge or complaint while maintaining a balance of meaningful relief for victims of discrimination.

As to enforcement, the SP provides a broad overview of the EEOC’s efforts to allocate its resources to ensure its efforts in stopping unlawful employment discrimination.  To that end, the EEOC indicates that it will continue its targeting of systemic discrimination through training staff on systemic cases and devoting additional resources to systemic litigation enforcement.  The SP included several performance measures for achieving enforcement goals, including measures on conciliation and litigation resolution, favorably resolving lawsuits, and increasing capacity for systemic investigations.

  1. Education and Outreach

The SP prioritizes education and outreach for deterring employment discrimination before it occurs.  The SP focuses on providing education and outreach programs, projects, and events as cost-effective tools for enforcement.  Primarily these programs are aimed at individuals who historically have been subjected to employment discrimination.  Part of the EEOC’s education and outreach involves expanding use of technology through social media, ensuring the EEOC website is more user-friendly and accessible, and leveraging technology to reach the agency’s audience.

These efforts to improve on education and outreach are aimed at promoting public awareness of employment discrimination laws while maintaining information and guidance for employers, federal agencies, unions, and staffing agencies.  The SP provides an in-depth list of measuring education and outreach by utilizing technology to expand the EEOC’s audience and ensuring accessible delivery of information through events, programs, and up-to-date website accessibility and functionality.

  1. Organizational Excellence

The SP makes clear that organizational excellence is the cornerstone of achieving the EEOC’s strategic goals.  The SP confirms that the EEOC aims to improve on its culture of accountability, inclusivity, and accessibility.  In addition, the EEOC seeks to continue protecting the public and advancing civil rights in the workplace by ensuring its resources are allocated properly to strengthen intake, outreach, education, enforcement, and service.

The EEOC’s organizational excellence strategic goal has two prongs: improving the training of EEOC employees and enhancing the EEOC’s infrastructure.  For employees, the EEOC seeks to foster enhanced diversity, equity, inclusion, and accessibility in the workplace, maintain employee retention, and implement leadership and succession plans.  Relative to the agency’s infrastructure, the SP embraces the increased use of technology through analytics and the management of fiscal resources to promote the agency’s mission of serving the public.

Implications For Employers

The EEOC’s SP is an important publication for employers since it previews immediate action areas.  The SP’s focus on systemic discrimination, conciliation, and litigation, and on increasing the Commission’s capacity for litigating alleged systemic violations, shows the EEOC is ramping up to improve its handling of all aspects of charges.  The EEOC’s increased focus on technology and employment discrimination awareness similarly shows accessibility will continue to be a pillar of the agency.  Accordingly, prudent employers should be mindful of these strategic priorities, and prepare themselves for continued EEOC enforcement.

Maryland Federal Court Issues Arrest Warrant In EEOC Sex Bias Suit

By Gerald L. Maatman, Jr., Alex W. Karasik, and George J. Schaller

Duane Morris Takeaways: In EEOC v. Above All Odds, LLC, No. 1:21-CV-02492 (D. Md. Aug. 15, 2023) (ECF No. 50), a federal district court in Maryland issued an arrest warrant for an ex-executive of a company involved in an EEOC lawsuit. The EEOC alleged that the ex-executive sexually harassed employees of a mental health clinic. The Court issued the arrest warrant due to the ex-executive’s refusal to cooperate in the case and to comply with discovery orders.

For employers facing EEOC-initiated lawsuits, the issuance of an arrest warrant is a novel development, and an informative one in terms of the perils of continuously ignoring court orders.

Case Background

The EEOC initiated this lawsuit on behalf of three former employees of Above All Odds, LLC (the “Company”), Bricciana Strickland, Shana Hanson, and Saidah Feyijinmi, alleging that the Company’s co-founder, Raymond Dorsey, engaged in a pattern of sexual harassment of female employees.  (Compl. at 1).

Strickland alleged Dorsey sent text messages asking for a date, and when she refused, Dorsey responded by stating he could fire her from her position.  Id. at 5-6. Hanson alleged Dorsey made repeated unwanted sexual advances including Dorsey asking if he could rub her back, sending an email with pornographic content, and throwing condoms on her desk.  Id. at 7. Feyijinmi alleged she saw Dorsey throw condoms on Hanson’s desk.  Id. at 8. Together, Hanson and Feyijinmi reported Dorsey’s sexual harassment to the Company’s senior management. Id. at 7.

Strickland continued to reject Dorsey’s advances and was demoted, and ultimately Dorsey ordered members of management staff to terminate her.  Id. at 6. Hanson was terminated in response to reporting Dorsey’s conduct.  Id. at 7. Feyijinmi was presented with a new employment contract that lowered her salary and required her to work two positions, and after she requested time to review the contract before signing, the Company terminated her before she had the opportunity to sign.  Id. at 8-9.

The Arrest Warrant

Throughout the course of the lawsuit, Dorsey failed to respond to the EEOC’s complaint and ignored several show cause orders directing him to appear in court.  Subsequently, the court found Dorsey in contempt of court in June 2023.

Dorsey also ignored a subpoena to appear in the case brought by the EEOC.  Thereafter, the court authorized the arrest of Raymond Dorsey and issued an arrest warrant on August 15, 2023.

Implications For Employers

Employers that are confronted with EEOC-initiated litigation involving allegations of a pattern of sexual harassment should note that ignoring court filings, court proceedings, and orders issued by the court, may result in the court taking action.  In this instance, the court relied on the ex-executive’s lack of response to pleadings, court orders, and subpoenas leading to the court issuing an arrest warrant.  While the issuance of arrest warrants is rare in litigation, this development illustrates that court orders should not be taken lightly.


EEOC Settles Its First Discrimination Lawsuit Involving Artificial Intelligence Hiring Software

By Alex W. Karasik, Gerald L. Maatman, Jr. and George J. Schaller

Duane Morris Takeaways: In Equal Employment Opportunity Commission v. ITutorGroup, Inc., et al., No. 1:22-CV-2565 (E.D.N.Y. Aug. 9, 2023), the EEOC and a tutoring company filed a Joint Settlement Agreement and Consent Decree in the U.S. District Court for the Eastern District of New York, memorializing a $365,000 settlement for claims involving hiring software that automatically rejected applicants based on their age. This is the first EEOC settlement involving artificial intelligence (“AI”) software bias. As we previously blogged about here, eradicating discrimination stemming from AI software is an EEOC priority that is here to stay. For employers who utilize AI software in their hiring processes, this settlement highlights the potential risk of legal and monetary exposure when AI software generates hiring decisions that disparately impact applicants from protected classes.

Case Background

Defendants iTutorGroup, Inc., Shanghai Ping’An Intelligent Education Technology Co., LTD, and Tutor Group Limited (collectively “Defendants”) hired tutors to provide English-language tutoring to adults and children in China.  Id. at *3.  Defendants received tutor applications through their website.  The sole qualification to be hired as a tutor for Defendants was a bachelor’s degree.  Additionally, as part of the application process, applicants provided their date of birth.

On May 5, 2022, the EEOC filed a lawsuit on behalf of Wendy Pincus, the Charging Party, who was over the age of 55 at the time she submitted her application.  The EEOC alleged that the Charging Party provided her date of birth on her application and was immediately rejected.  Accordingly, the EEOC alleged that Defendants violated the Age Discrimination in Employment Act of 1967 (“ADEA”) by programming their hiring software to reject female applicants over 55 years old and male applicants over 60 years old.  Id. at *1. Specifically, the EEOC alleged that in early 2020, Defendants failed to hire the Charging Party, Wendy Pincus, and more than 200 other qualified applicants age 55 and older from the United States because of their age.  Id.

The Consent Decree

On August 9, 2023, the parties filed a “Joint Notice Of Settlement Agreement And Requested Approval And Execution Of Consent Decree” (the “Consent Decree”).  Id.  The Consent Decree confirmed that the parties agreed to settle for $365,000, to be distributed to tutor applicants who were allegedly rejected by Defendants because of their age during the time period of March 2020 through April 2020.  Id. at 15.  The settlement payments will be split evenly between compensatory damages and backpay.  Id. at 16.

In terms of non-monetary relief, the Consent Decree also requires Defendants to provide anti-discrimination policies and complaint procedures applicable to screening, hiring, and supervision of tutors and tutor applicants.  Id. at 9.  Further, the Consent Decree requires Defendants to provide training programs on an annual basis for all supervisors and managers involved in the hiring process.  Id. at 12-13.  The Consent Decree, which will remain in effect for five years, also contains reporting requirements and record-keeping requirements.  Most notably, the Consent Decree contains a monitoring requirement, which allows the EEOC to inspect the premises and records of the Defendants, and to conduct interviews with the Defendants’ officers, agents, employees, and independent contractors to ensure compliance.

Implications For Employers

To best deter EEOC-initiated litigation involving AI in the hiring context, employers should review their AI software upon implementation to ensure applicants are not excluded based on any protected class.  Employers should also regularly audit the use of these programs to make sure the AI software is not resulting in adverse impact on applicants in protected-category groups.

This significant settlement should serve as a cautionary tale for businesses who use AI in hiring and are not actively monitoring its impact.  The EEOC’s commitment to its Artificial Intelligence and Algorithmic Fairness Initiative is in full force.  If businesses have not been paying attention, now is the time to start.

EEOC Issues New ADA Guidance On Visual Disabilities And Discusses AI Impact

By Alex W. Karasik, Gerald L. Maatman, Jr., and George J. Schaller

Duane Morris Takeaways:  On July 26, 2023, the EEOC issued a new Guidance entitled “Visual Disabilities in the Workplace and the Americans with Disabilities Act” (the “Guidance”).  This document is an excellent resource for employers, and provides insight into how to handle situations that may arise with job applicants and employees that have visual disabilities. Notably, for employers that use algorithms or artificial intelligence (“AI”) as a decision-making tool, the Guidance makes clear that employers have an obligation to make reasonable accommodations for applicants or employees with visual disabilities who request them in connection with these technologies.

The EEOC’s Guidance

The EEOC’s Guidance endeavors to address four subjects, including: (1) when an employer may ask an applicant or employee questions about a vision impairment and how an employer should treat voluntary disclosure; (2) what types of reasonable accommodations applicants or employees with visual disabilities may need; (3) how an employer should handle safety concerns about applicants and employees with visual disabilities; and (4) how an employer can ensure that no employee is harassed because of a visual disability.

The EEOC notes that if an applicant has an obvious impairment or voluntarily discloses the existence of a vision impairment, and based on this information, the employer reasonably believes that the applicant will require an accommodation to perform the job, the employer may ask whether the applicant will need an accommodation (and, if so, what type). Some potential accommodations may include: (i) assistive technology materials, such as screen readers and website accessibility modifications; (ii) personnel policy modifications, such as allowing the use of sunglasses, service animals, and customized work schedules; (iii) making adjustments to the work area, including brighter lighting; and (iv) allowing worksite visits by orientation, mobility, or assistive technology professionals.

For safety concerns, the Guidance clarifies that if the employer has concerns that the applicant’s vision impairment may create a safety risk in the workplace, the employer may conduct an individualized assessment to evaluate whether the individual’s impairment poses a “direct threat,” which is defined as, “a significant risk of substantial harm to the health or safety of the applicant or others that cannot be eliminated or reduced through reasonable accommodation.”  For harassment concerns, the EEOC notes that employers should make clear that they will not tolerate harassment based on disability or on any other protected basis, including visual impairment.  This can be done in a number of ways, such as through a written policy, employee handbooks, staff meetings, and periodic training.

Artificial Intelligence Implications

As we previously blogged about here, the EEOC has made it a priority to examine whether the use of artificial intelligence in making employment decisions can disparately impact various classes of individuals.  In the Q&A section, the Guidance tackles this issue by posing the following hypothetical question: “Does an employer have an obligation to make reasonable accommodations to applicants or employees with visual disabilities who request them in connection with the employer’s use of software that uses algorithms or artificial intelligence (AI) as decision-making tools?”  According to the EEOC, the answer is “yes.”

The Guidance opines that AI tools may intentionally or unintentionally “screen out” individuals with disabilities in the application process and when employees are on the job, even though such individuals are able to do jobs with or without reasonable accommodations. As an example, an applicant or employee may have a visual disability that reduces the accuracy of an AI assessment used to evaluate the applicant or employee. In those situations, the EEOC notes that the employer has an obligation to provide a reasonable accommodation, such as an alternative testing format, that would provide a more accurate assessment of the applicant’s or employee’s ability to perform the relevant job duties, absent undue hardship.

Takeaways For Employers

The EEOC’s Guidance serves as a reminder that the Commission will vigorously seek to protect the workplace rights of individuals with disabilities, including those with visual impairments. When employers are confronted with situations where an applicant or employee requests reasonable accommodations, the Guidance provides a valuable roadmap for how to handle such requests, and offers a myriad of potential solutions.

From an artificial intelligence perspective, the Guidance’s reference to the use of AI tools suggests that employers must be flexible in terms of providing alternative solutions to visually impaired employees and applicants. In these situations, employers should be prepared to utilize alternative means of evaluation.

© 2009- Duane Morris LLP. Duane Morris is a registered service mark of Duane Morris LLP.

The opinions expressed on this blog are those of the author and are not to be construed as legal advice.
