Can You Trust AI to Review Your Job Application Fairly? Study Finds Screening Tools May Be Biased

When you apply for a job online, there’s a good chance your resume will be reviewed by software before it ever reaches a real person. Many companies now use artificial intelligence to scan resumes, rank applicants, and match candidates with job descriptions. These tools are meant to save time, but recent research suggests they may also be quietly introducing bias into the hiring process.

In a new study presented at the Association for the Advancement of Artificial Intelligence / Association for Computing Machinery (AAAI/ACM) Conference on AI, Ethics, and Society, researchers looked at how AI models evaluate resumes. The results were troubling. Across millions of comparisons between job descriptions and resumes, AI tools favored applicants based on the race and gender associated with their names.

The researchers submitted 550 real resumes to AI systems developed by three companies: Mistral AI, Salesforce, and Contextual AI. The only thing that changed on each resume was the name. Some were given names typically linked to White applicants, while others were assigned names commonly associated with Black applicants. The names also varied by gender, allowing the researchers to test whether that affected the results.

It did. Resumes with White-associated names were selected as the top candidate more than 85 percent of the time, while those with Black-associated names were chosen less than 10 percent of the time. In fact, the AI systems never once ranked a resume with a Black male-associated name above one with a White male-associated name. They were also far more likely to favor male names than female ones. Resumes with Black female names performed better than those with Black male names, but they were still chosen far less often than their White counterparts.

The study focused on just three AI providers, but the findings point to a much broader issue. Many companies use AI tools that are proprietary and not open to outside scrutiny. Without public audits or transparency about how these tools are developed, it’s hard to know how often gender or racial discrimination happens or how to fix it.

Kyra Wilson, the study’s lead author and a researcher at the University of Washington, emphasized the importance of oversight. She explained that most of these systems are rolled out without any formal testing for bias. Unless a city or state has specific rules requiring it, companies are not legally obligated to evaluate their AI tools for fairness.

In California, workers have slightly more protection, Wilson said. State labor laws recognize that discrimination is often complex. For example, someone might be treated unfairly not just because they are Black, but because they are a Black woman. That idea, known as intersectionality, helps capture the way multiple identities can overlap to create unique forms of discrimination. The AI study highlights how this plays out when both race and gender interact.

At the federal level, some progress is being made. The U.S. Department of Labor last year released an inclusive hiring framework that addresses the growing role of AI in employment decisions. 

Still, it’s clear that many employers are not doing enough. Some use AI only for simple tasks like sending emails or scheduling interviews, but others rely on it to make critical decisions about who gets hired. If those decisions are based on flawed or biased algorithms, qualified candidates may never get a fair chance.

Job seekers might not even realize they have been filtered out by a machine. If you are not selected for an interview, and your resume was ranked by AI, you may never know why. You may not even know the tool was used in the first place.

Under both the California Fair Employment and Housing Act and federal law, it is illegal to discriminate against job applicants based on their race, gender or other legally protected characteristics. This includes discrimination that happens because of automated decision-making. Employers can’t use AI as a shield to avoid responsibility for unfair hiring practices.

If you suspect you have been passed over for a job because of something other than your qualifications, consulting an experienced employment lawyer can help you determine whether you were treated unlawfully. Sometimes the bias is unintentional, but that doesn’t make it legal. When discrimination happens at the application stage, it can be especially difficult to detect — let alone prove — without help.

McCormack Law Firm is dedicated to helping workers resolve employment disputes and take legal action when their rights are violated. If you believe you faced discrimination in the workplace or during the hiring process, our San Francisco employment lawyers are here to listen. Contact us today to schedule a free initial consultation.
