ARTICLE
1 August 2024

Mobley v. Workday: A Potential Shift In Employment Discrimination Liability


As recruiting with Artificial Intelligence (AI) continues to grow in popularity, so does the litigation surrounding it. Specifically, concerns continue to mount regarding the conscious and unconscious biases that are alleged to creep into the recruiting process. In a case of first impression, Mobley v. Workday, Inc., the class-action Plaintiff alleges that Workday's AI screening software is biased and that Workday is directly liable for unlawful employment discrimination allegedly caused by employers' use of Workday's AI-powered hiring tools. On July 12, 2024, Judge Rita Lin of the Northern District of California issued a mixed ruling in this closely watched class action, partially denying Workday's motion to dismiss and allowing the Plaintiff's claims to be heard. While the court dismissed the claims that Workday acted as an “employment agency,” it allowed the claims that Workday acted as an “agent” of employers to proceed to discovery. This ruling has significant implications for both AI vendors and employers using AI-powered hiring tools, potentially expanding the scope of liability under federal anti-discrimination laws.

In this class action lawsuit, the named Plaintiff alleges that since 2017, he has applied to over 100 positions at companies utilizing Workday screening tools for recruiting and was rejected from every one of those positions, despite his claim that he possessed the relevant qualifications.

Mobley's Complaint alleged race discrimination in violation of Title VII of the Civil Rights Act (Title VII), age discrimination in violation of the Age Discrimination in Employment Act (ADEA), and disability discrimination in violation of the Americans with Disabilities Act (ADA). In his Complaint, Mobley asserts that Workday's software repeatedly rejected him based on identifying criteria such as his graduation date and his alma mater, a historically Black college, as well as personality tests and assessments that he claims screened him out due to his depression and anxiety. Mobley's assertion that Workday's rejections were automated rested on the fact that he often received them rapidly, sometimes in the middle of the night.

Workday filed a motion to dismiss, arguing that Mobley failed to state a claim upon which relief can be granted and that, as a software vendor, Workday is not liable for employment discrimination. The court denied the motion in relevant part, holding that Workday plausibly acts as an agent of employers under the relevant laws, and that the laws which would otherwise apply to the employer therefore also apply to Workday.

The federal district court judge held that “Workday's software is not simply implementing in a rote way the criteria that employers set forth, but is instead participating in the decision-making process by recommending some candidates to move forward and rejecting others.” “Given Workday's allegedly crucial role in deciding which applicants can get their ‘foot in the door' for an interview, Workday's tools are engaged in conduct that is at the heart of equal access to employment opportunities.”

Judge Lin accepted the Plaintiff's claim that an AI vendor could be directly subject to liability for employment discrimination under Title VII, the ADA, and the ADEA, specifically under the theory that the AI vendor was acting as an “agent” of the employer. However, the court rejected the theory that Workday, the AI vendor, was an “employment agency” under federal law, finding that Workday's alleged activities did not meet the statutory definition of “procuring” employees for employers.

By allowing the Plaintiff's agency theory to proceed, as supported by the EEOC in its amicus brief submitted to the court, the ruling opens the door for a significant expansion of liability for AI vendors in the hiring process, with potentially far-reaching implications for both AI service providers and for employers using those tools.

In light of this decision and the EEOC's overt support of the Plaintiff's novel theory of liability, employers using AI-powered recruiting and hiring tools should review their processes to ensure they can clearly articulate the role these tools play in their hiring decisions and ensure that these tools are not granted definitive discretion in the hiring process that may result in disparate impacts on protected groups. Employers also should review their contracts with these vendors in order to fully understand the scope of liability and whether there is any obligation on the part of the vendor to indemnify the employer in the event of similar litigation. 

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
