Can You Trust AI to Review Your Job Application Fairly? Study Finds Screening Tools May Be Biased
When you apply for a job online, there’s a good chance your resume will be reviewed by software before it ever reaches a real person. Many companies now use artificial intelligence to scan resumes, rank applicants and match candidates with job descriptions. These tools are meant to save time, but recent research suggests they may also be quietly introducing bias into the hiring process.
In a new study presented at the AAAI/ACM Conference on AI, Ethics and Society, a joint venue of the Association for the Advancement of Artificial Intelligence and the Association for Computing Machinery, researchers examined how AI models evaluate resumes. The results were troubling: across millions of comparisons between job descriptions and resumes, the AI tools favored applicants based on the race and gender associated with their names.
The researchers submitted 550 actual resumes to AI systems developed by three companies: Mistral AI, Salesforce and Contextual AI. The only thing that changed on each resume was the name. Some resumes were given names typically linked to White applicants, while others were assigned names commonly associated with Black applicants. Names were also varied by gender to see whether that affected the results.
It did. Resumes with White-associated names were selected as the top candidate more than 85 percent of the time, while those with Black-associated names were chosen less than 10 percent of the time. In fact, the AI systems never once ranked a resume with a Black male-associated name above one with a White male-associated name. They were also far more likely to favor male names than female ones. Resumes with Black female names performed better than those with Black male names, but they were still chosen far less often than their White counterparts.
The study focused on just three AI providers, but the findings point to a much broader issue. Many companies use AI tools that are proprietary and not open to outside scrutiny. Without public audits or transparency about how these tools are developed, it’s hard to know how often gender or racial discrimination happens or how to fix it.
Kyra Wilson, the study’s lead author and a researcher at the University of Washington, emphasized the importance of oversight. She explained that most of these systems are rolled out without any formal testing for bias. Unless a city or state has specific rules requiring it, companies are not legally obligated to evaluate their AI tools for fairness.
In California, workers have slightly more protection, Wilson said. State labor laws recognize that discrimination is often complex. For example, someone might be treated unfairly not just because they are Black, but because they are a Black woman. That idea, known as intersectionality, helps capture the way multiple identities can overlap to create unique forms of discrimination. The AI study highlights exactly this kind of interaction between race and gender.
At the federal level, some progress is being made. The U.S. Department of Labor last year released an inclusive hiring framework that addresses the growing role of AI in employment decisions.
Still, it’s clear that many employers are not doing enough. Some use AI only for simple tasks like sending emails or scheduling interviews, but others rely on it to make critical decisions about who gets hired. If those decisions are based on flawed or biased algorithms, qualified candidates may never get a fair chance.
Job seekers might not even realize a machine filtered them out. If your resume was ranked by AI and you were not selected for an interview, you may never learn why, or even that the tool was used in the first place.
Under both the California Fair Employment and Housing Act and federal law, it is illegal to discriminate against job applicants based on their race, gender or other legally protected characteristics. This includes discrimination that happens because of automated decision-making. Employers can’t use AI as a shield to avoid responsibility for unfair hiring practices.
If you suspect you have been passed over for a job because of something other than your qualifications, consulting an experienced employment lawyer can help you determine whether you were treated unlawfully. Sometimes the bias is unintentional, but that doesn’t make it legal. When discrimination happens at the application stage, it can be especially difficult to detect — let alone prove — without help.
McCormack Law Firm is dedicated to helping workers resolve employment disputes and take legal action when their rights are violated. If you believe you faced discrimination in the workplace or during the hiring process, our San Francisco employment lawyers are here to listen. Contact us today to schedule a free initial consultation.