
The Workday lawsuit highlights AI recruiting risks


Those of you who have worked in the hiring process know the frustrations and challenges of screening job applications to find qualified applicants.


In my career, it wasn’t uncommon to have 200 online applicants for one customer service representative (CSR) position. How in the world is HR going to sift through all of those and choose who meets threshold requirements? And who are the most qualified applicants, at least on paper?


Mistakes are made – qualified applicants are overlooked or culled by exhausted HR partners or hiring supervisors, and unqualified applicants slip through the cracks and waste everyone’s time, including their own.


And mistakes in screening applicants can be illegal and costly. 


If a screened-out candidate can allege race, gender, religion, disability, age or other protected-class bias, the company may face agency action or even civil litigation.  And just as important, the best applicants may be left behind while lesser candidates with relatives or friends in the company get preferred treatment.


So in these modern times, let’s use AI-based hiring processes working at lightning speed, assessing all candidates fairly for proper qualifications.  After all, a computer program is not going to have human biases…or is it?


In Mobley v. Workday, Inc., Case No. 3:23-CV-00770 in the U.S. District Court for the Northern District of California, plaintiff Derek Mobley has alleged that HR vendor Workday’s algorithm-based applicant screening system discriminated against him and other applicants on the bases of race, disability and age.  Mobley is African-American and disabled. 


The court dismissed plaintiff’s claims of intentional discrimination, but allowed the case to go forward under Title VII’s long-standing “disparate impact” theory, under which a hiring process that, despite good intentions, tends to disadvantage or exclude protected-class applicants can still be unlawful.


More plaintiffs have joined the case, and similar cases exist across the country.


According to a January 8, 2026 article by Naveen Kumar of Demandsage, 87% of companies are now using AI-driven tools to save time and money in the hiring process. 


And a quick Google search on the subject this morning provided some gloomy news for company management teams – many AI hiring tools have been found to be rife with bias.  For example, Amazon abandoned an AI-based tool in 2018 that was found to have discriminated against women in the hiring process.


So how does management protect itself from illegal discrimination, yet use AI’s time- and cost-saving capabilities to streamline and improve the hiring process?


Above all, a company must ensure that human review and evaluation of the hiring process is an ongoing and prioritized matter.


AI vendor programs should be thoroughly vetted and audited for potential bias, both before selection and then periodically throughout the program’s use.  The management team needs to understand in detail how the AI program is designed, how algorithms are developed and deployed, and what independent, third-party audits show about the data and potential biases. 


Companies should spot-check the AI employment process periodically, looking at applicant flow and hiring data to see whether there are any suspicious patterns of exclusion or selection tied to protected applicant characteristics.
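To give a sense of what such a spot check can look like in practice, one common starting point is the EEOC’s informal “four-fifths rule”: if one group’s selection rate falls below 80% of the highest group’s rate, that is treated as a red flag for possible disparate impact. Here is a minimal sketch in Python, using invented numbers purely for illustration (this is not legal advice, and real analyses should be done with counsel and a statistician):

```python
# Hypothetical spot check of applicant-flow data using the EEOC
# "four-fifths rule": a group's selection rate below 80% of the
# highest group's rate is a red flag for disparate impact.
# All group names and numbers below are invented for illustration.

def selection_rates(counts):
    """counts: {group: (applicants, hires)} -> {group: selection rate}"""
    return {g: hires / applicants for g, (applicants, hires) in counts.items()}

def four_fifths_flags(counts, threshold=0.8):
    """Flag any group whose selection rate is under threshold * top rate."""
    rates = selection_rates(counts)
    top = max(rates.values())
    return {g: rate / top < threshold for g, rate in rates.items()}

if __name__ == "__main__":
    applicant_flow = {
        "group_a": (120, 30),   # 25% selected
        "group_b": (100, 15),   # 15% selected
    }
    # group_b's 15% rate is only 60% of group_a's 25% rate, so it is flagged
    print(four_fifths_flags(applicant_flow))
```

A flagged group does not by itself prove discrimination, but it is exactly the kind of “suspicious pattern” that should trigger a deeper look at the tool and a conversation with legal counsel.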


Legal counsel should be involved in the AI-program selection and implementation process.


Keep records of all audits, checks and balances, and evaluations of the program.  As with all labor and employment processes, contemporaneous documentation is key to company protection. 


I retired before our company joined the AI revolution, and I confess that I know very, very little about it. But I hope this article at least piques your interest in making sure you don’t just outsource your hiring to computers and algorithms, thinking everything will be okay.


The bottom line – never take the “human” out of Human Resources or your hiring practices.


Call Gary Kleckner, MARC Vice President: 216-973-7323 and see how MARC can help identify and satisfy your interests in a strong and engaged workforce.

