You spend hours perfecting your resume, fill out a job application, and hit submit. Twenty minutes later, a rejection email arrives. How is this possible? Very likely, no human reviewed the application at all. Today, 88% of companies use AI for initial candidate screening.[1] Although AI likely decided not to move your application forward, it is unclear why it made that decision. The model may have filtered your application out because you did not meet the minimum qualifications.[2] But who is responsible if the AI decides not to hire you because of your race or age? Who answers for that discrimination? 

Derek Mobley believes that these AI models are making decisions based on protected classifications such as race, age, and disability.[3] Mobley, a forty-year-old African American man, sued Workday after receiving many rejections from employers utilizing Workday’s software.[4] Thus far, his case has cleared two significant hurdles. First, it survived a motion to dismiss, indicating that his disparate impact claim is at least plausible.[5] Second, Mobley secured preliminary class certification for his Age Discrimination in Employment Act (ADEA) claim.[6] His case is one of the first to name not only the employer but also the software provider as a defendant in an employment discrimination suit.[7] The case has yet to reach a resolution on the merits or undergo appellate review, leaving open the question it squarely presents: is an AI vendor, like Workday, an “employer” under Title VII of the Civil Rights Act? 

Title VII of the Civil Rights Act makes it unlawful to discriminate based on an applicant’s race, color, religion, sex, or national origin.[8] The Americans with Disabilities Act (ADA) extends similar protections against disability-based discrimination.[9] Generally, liability attaches to employers, employment agencies, and labor organizations.[10] Mobley contends that Workday functioned as an employment agency because employers delegated traditional hiring functions, including applicant screening, to its AI.[11] On its face, this assertion makes sense. Workday can be configured to automatically reject candidates based on their answers to “knockout questions.”[12] In effect, Workday procures candidates for hire and introduces them to its customers to enable further hiring decisions.[13] Even if Workday is not an employment agency under Title VII, Mobley contends that it acted as an agent of its clients.[14] If Workday ranks candidates based on protected characteristics, that would violate Title VII. But Workday counters that it makes no hiring or firing decisions and exercises no control over employment decisions; it simply provides a tool, and employers retain control over all factors that determine a candidate’s ranking.[15] This is a powerful argument: if, as a vendor, Workday falls outside the categories enumerated in Title VII, there is no claim against it. 

This is a frightening prospect for employers: if Workday is not on the hook, they might be.[16] Employers could face numerous Title VII discrimination claims arising from any candidate rejected by Workday or another vendor. There is substantial evidence that AI does discriminate based on race.[17] Cases like Mobley may clarify Title VII’s scope and hold Workday and other vendors liable for biases embedded in their models. Until a nationwide ruling resolves these Title VII issues, however, relief will be limited to circuits that have adopted such a rule. As noted above, almost any job application now involves AI in the hiring decision.[18] Rather than waiting on the judiciary, Congress should amend Title VII to explicitly cover vendors like Workday. Otherwise, vendors will have little incentive to ensure their tools do not discriminate against applicants. 


[1] Utkarsh Amitabh & Ali Ansari, Hiring with AI Doesn’t Have to Be So Inhumane, World Economic Forum, https://www.weforum.org/stories/2025/03/ai-hiring-human-touch-recruitment/ [https://perma.cc/W4NY-LRSX]. 

[2] Id.

[3] Mobley v. Workday, Inc., No. 23-CV-00770-RFL, 2025 WL 1424347, at *1 (N.D. Cal. May 16, 2025).

[4] Id. at *2. 

[5] Id. at *3. 

[6] Id. at *1. 

[7] Catie A. Wheatley, AI on Trial: Mobley v. Workday and the Future of Employment Law, Faegre Drinker, https://www.faegredrinker.com/en/insights/publications/2025/10/ai-on-trial-mobley-v-workday-and-the-future-of-employment-law [https://perma.cc/XWP6-QCZB].

[8] See 42 U.S.C. § 2000e et seq.

[9] See 42 U.S.C. § 12101 et seq.

[10] 42 U.S.C. § 2000e-2.

[11] Mobley v. Workday, Inc., No. 23-CV-00770-RFL, 2025 WL 1424347, at *3 (N.D. Cal. May 16, 2025).

[12] James Hu, Knockout Job Application Questions: How One Answer Can Kill Your Chances, Jobscan, https://www.jobscan.co/blog/knockout-questions-answer-application/ [https://perma.cc/KZF4-SD4Y].

[13] Workday, Use an Applicant Tracking System to Modernize Hiring, https://www.workday.com/en-us/topics/hr/applicant-tracking-system.html [https://perma.cc/3EZD-42J8] (explaining how Workday can rank candidates). 

[14] Mobley v. Workday, Inc., 740 F. Supp. 3d 796, 804 (N.D. Cal. 2024).

[15] Workday, supra note 13. It is worth noting, though, that employers generally do not make inputs into the underlying algorithm; in short, the algorithm may still be biased. 

[16] Fisher Phillips, Another Employer Faces AI Hiring Bias Lawsuit: 10 Actions You Can Take to Prevent AI Litigation, https://www.fisherphillips.com/en/insights/insights/another-employer-faces-ai-hiring-bias-lawsuit [https://perma.cc/4B99-PFWS]. 

[17] Katharine Miller, Covert Racism in AI: How Language Models Are Reinforcing Outdated Stereotypes, Stanford University Human-Centered Artificial Intelligence, https://hai.stanford.edu/news/covert-racism-ai-how-language-models-are-reinforcing-outdated-stereotypes [https://perma.cc/5KK8-SNTK].

[18] Amitabh & Ansari, supra note 1. 

Published:
Tuesday, March 10, 2026