EEOC Settles Its First Discrimination Lawsuit Involving Artificial Intelligence Hiring Software

By Alex W. Karasik, Gerald L. Maatman, Jr. and George J. Schaller

Duane Morris Takeaways: In Equal Employment Opportunity Commission v. iTutorGroup, Inc., et al., No. 1:22-CV-2565 (E.D.N.Y. Aug. 9, 2023), the EEOC and a tutoring company filed a Joint Settlement Agreement and Consent Decree in the U.S. District Court for the Eastern District of New York, memorializing a $365,000 settlement of claims involving hiring software that automatically rejected applicants based on their age. This is the first EEOC settlement involving artificial intelligence (“AI”) software bias. As we previously blogged about here, eradicating discrimination stemming from AI software is an EEOC priority that is here to stay. For employers who utilize AI software in their hiring processes, this settlement highlights the risk of legal and monetary exposure when AI software generates hiring decisions that disparately impact applicants in protected classes.

Case Background

Defendants iTutorGroup, Inc., Shanghai Ping’An Intelligent Education Technology Co., LTD, and Tutor Group Limited (collectively “Defendants”) hired tutors to provide English-language tutoring to adults and children in China.  Id. at *3.  Defendants received tutor applications through their website.  The sole qualification to be hired as a tutor was a bachelor’s degree.  Additionally, as part of the application process, applicants were required to provide their date of birth.

On May 5, 2022, the EEOC filed a lawsuit on behalf of Wendy Pincus, the Charging Party, who was over the age of 55 at the time she submitted her application.  The EEOC alleged that the Charging Party provided her date of birth on her application and was immediately rejected.  Accordingly, the EEOC alleged that Defendants violated the Age Discrimination in Employment Act of 1967 (“ADEA”) by programming their hiring software to reject female applicants over 55 years old and male applicants over 60 years old.  Id. at *1.  Specifically, the EEOC alleged that in early 2020, Defendants failed to hire the Charging Party and more than 200 other qualified applicants age 55 and older from the United States because of their age.  Id.

The Consent Decree

On August 9, 2023, the parties filed a “Joint Notice Of Settlement Agreement And Requested Approval And Execution Of Consent Decree” (the “Consent Decree”).  Id.  The Consent Decree confirmed that the parties agreed to settle for $365,000, to be distributed to tutor applicants who were allegedly rejected by Defendants because of their age between March 2020 and April 2020.  Id. at 15.  The settlement payments will be split evenly between compensatory damages and backpay.  Id. at 16.

In terms of non-monetary relief, the Consent Decree also requires Defendants to adopt anti-discrimination policies and complaint procedures applicable to the screening, hiring, and supervision of tutors and tutor applicants.  Id. at 9.  Further, the Consent Decree requires Defendants to provide annual training programs for all supervisors and managers involved in the hiring process.  Id. at 12-13.  The Consent Decree, which will remain in effect for five years, also contains reporting and record-keeping requirements.  Most notably, the Consent Decree contains a monitoring requirement, which allows the EEOC to inspect Defendants’ premises and records, and to conduct interviews with Defendants’ officers, agents, employees, and independent contractors to ensure compliance.

Implications For Employers

To best deter EEOC-initiated litigation involving AI in the hiring context, employers should review their AI software upon implementation to ensure that applicants are not excluded based on any protected class.  Employers should also regularly audit these programs to confirm that the AI software does not have an adverse impact on applicants in protected groups.
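By way of illustration only, a minimal sketch of one such audit appears below, applying the EEOC’s “four-fifths” (80%) rule of thumb: a selection rate for any group that is less than four-fifths of the rate for the most-selected group is generally regarded as evidence of adverse impact.  The Python code, sample data, field names, and the audit_selection_rates helper are hypothetical and are not drawn from the consent decree in this case.

from collections import defaultdict

def audit_selection_rates(applicants, group_key, selected_key="selected"):
    """Return per-group selection rates and flag groups whose rate falls
    below 80% of the highest group's rate (the four-fifths rule of thumb)."""
    totals, hires = defaultdict(int), defaultdict(int)
    for applicant in applicants:
        group = applicant[group_key]
        totals[group] += 1
        if applicant[selected_key]:
            hires[group] += 1

    rates = {group: hires[group] / totals[group] for group in totals}
    highest = max(rates.values(), default=0)
    flags = {group: highest > 0 and (rate / highest) < 0.8
             for group, rate in rates.items()}
    return rates, flags

# Hypothetical applicant-flow data grouped by age band.
sample = (
    [{"age_band": "under_55", "selected": True}] * 40
    + [{"age_band": "under_55", "selected": False}] * 60
    + [{"age_band": "55_plus", "selected": True}] * 10
    + [{"age_band": "55_plus", "selected": False}] * 90
)

rates, flags = audit_selection_rates(sample, "age_band")
print(rates)  # {'under_55': 0.4, '55_plus': 0.1}
print(flags)  # {'under_55': False, '55_plus': True} -- 0.1 / 0.4 = 0.25 < 0.8

In practice, an audit of this kind would run against actual applicant-flow data, be repeated at regular intervals, and be reviewed with counsel, but the core arithmetic of comparing group selection rates remains the same.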

This significant settlement should serve as a cautionary tale for businesses that use AI in hiring without actively monitoring its impact.  The EEOC’s commitment to its Artificial Intelligence and Algorithmic Fairness Initiative is in full force.  If businesses have not been paying attention, now is the time to start.

EEOC Issues New ADA Guidance On Visual Disabilities And Discusses AI Impact

By Alex W. Karasik, Gerald L. Maatman, Jr., and George J. Schaller

Duane Morris Takeaways:  On July 26, 2023, the EEOC issued a new Guidance entitled “Visual Disabilities in the Workplace and the Americans with Disabilities Act” (the “Guidance”).  This document is an excellent resource for employers and provides insight into how to handle situations that may arise with job applicants and employees who have visual disabilities. Notably, for employers that use algorithms or artificial intelligence (“AI”) as a decision-making tool, the Guidance makes clear that employers have an obligation to make reasonable accommodations for applicants or employees with visual disabilities who request them in connection with these technologies.

The EEOC’s Guidance

The EEOC’s Guidance addresses four subjects: (1) when an employer may ask an applicant or employee questions about a vision impairment and how an employer should treat voluntary disclosure; (2) what types of reasonable accommodations applicants or employees with visual disabilities may need; (3) how an employer should handle safety concerns about applicants and employees with visual disabilities; and (4) how an employer can ensure that no employee is harassed because of a visual disability.

The EEOC notes that if an applicant has an obvious impairment or voluntarily discloses the existence of a vision impairment, and based on this information, the employer reasonably believes that the applicant will require an accommodation to perform the job, the employer may ask whether the applicant will need an accommodation (and, if so, what type). Some potential accommodations may include: (i) assistive technology materials, such as screen readers and website accessibility modifications; (ii) personnel policy modifications, such as allowing the use of sunglasses, service animals, and customized work schedules; (iii) making adjustments to the work area, including brighter lighting; and (iv) allowing worksite visits by orientation, mobility, or assistive technology professionals.

For safety concerns, the Guidance clarifies that if the employer has concerns that the applicant’s vision impairment may create a safety risk in the workplace, the employer may conduct an individualized assessment to evaluate whether the individual’s impairment poses a “direct threat,” which is defined as “a significant risk of substantial harm to the health or safety of the applicant or others that cannot be eliminated or reduced through reasonable accommodation.”  For harassment concerns, the EEOC notes that employers should make clear that they will not tolerate harassment based on disability or on any other protected basis, including visual impairment.  This can be done in a number of ways, such as through a written policy, employee handbooks, staff meetings, and periodic training.

Artificial Intelligence Implications

As we previously blogged about here, the EEOC has made it a priority to examine whether the use of artificial intelligence in making employment decisions can disparately impact various classes of individuals.  In the Q&A section, the Guidance tackles this issue by posing the following hypothetical question: “Does an employer have an obligation to make reasonable accommodations to applicants or employees with visual disabilities who request them in connection with the employer’s use of software that uses algorithms or artificial intelligence (AI) as decision-making tools?”  According to the EEOC, the answer is “yes.”

The Guidance opines that AI tools may intentionally or unintentionally “screen out” individuals with disabilities in the application process and when employees are on the job, even though such individuals are able to perform the job with or without reasonable accommodations. As an example, an applicant or employee may have a visual disability that reduces the accuracy of an AI assessment used to evaluate the applicant or employee. In those situations, the EEOC notes that the employer has an obligation to provide a reasonable accommodation, such as an alternative testing format, that would provide a more accurate assessment of the applicant’s or employee’s ability to perform the relevant job duties, absent undue hardship.

Takeaways For Employers

The EEOC’s Guidance serves as a reminder that the Commission will vigorously seek to protect the workplace rights of individuals with disabilities, including those with visual impairments. When employers are confronted with situations where an applicant or employee requests a reasonable accommodation, the Guidance provides a valuable roadmap for how to handle such requests and offers a myriad of potential solutions.

From an artificial intelligence perspective, the Guidance’s reference to AI tools suggests that employers must be flexible in providing alternative solutions to visually impaired employees and applicants. In these situations, employers should be prepared to utilize alternative means of evaluation.
