The management of diversity, equity, and inclusion (D,E,&I) efforts has evolved considerably in recent years. What began as a compliance exercise is now recognized as a driver of organizational performance: D,E,&I efforts today go beyond counteracting biases and have become a genuine competitive advantage. Research from McKinsey shows that companies with diverse workforces consistently outperform those with homogeneous workforces.
Artificial intelligence (AI) isn't a perfect solution for reducing bias in hiring, but it is a powerful tool. Applied effectively, AI can also simplify the entire hiring process. That's why we're sharing exactly how AI can bolster D,E,&I efforts by helping reduce bias in hiring.
Real-life cases of unconscious bias
During hiring, unconscious bias happens when someone forms an opinion about a candidate based on reasons outside of their own conscious awareness. This can look like preferring one candidate over another simply because the first one grew up in the same town as you.
Even in the early stages of the hiring process, elements such as a candidate's profile photo, alma mater, or personal interests can influence your opinion more than you think. Whether positively or negatively, unconscious bias can shape how a candidate is perceived, and even whether they're hired, based on criteria irrelevant to the job.
Although incompatible with one’s conscious values, unconscious bias is far more prevalent than conscious prejudice and is a serious problem in many workplaces. In 2017, Palantir paid $1.7 million to settle a lawsuit from the U.S. Department of Labor, which accused Palantir of disproportionately turning down qualified Asian candidates who applied for certain engineering positions.
Although Palantir disagreed with the allegations and denied knowingly discriminating, the numbers showed otherwise: For one software engineer job, Palantir hired 14 non-Asian engineers and 11 Asian engineers, even though 85% of the 1,160 applicants were Asian.
The Palantir case is a prime example of unconscious bias in hiring, and unfortunately it’s not an isolated incident. Research from the University of Toronto shows that candidates with Asian names are 28% less likely to get an interview than equally qualified candidates with Anglo-Canadian names.
In addition, a 2016 study found that people with resumes containing minority racial cues — such as a distinctively African-American or Asian name — received 30 to 50% fewer callbacks from employers than those who had equivalent resumes without racial cues. When these candidates “whitened” their resumes — concealing or downplaying racial cues — they were significantly more likely to receive a callback, even though their qualifications were unchanged.
And it’s not just race that makes a difference. Other studies reveal biases against female and older candidates.
Unconscious bias in the hiring process
To be clear, unconscious bias in hiring doesn’t stem solely from people. Hiring processes can also prevent qualified candidates from getting hired. For example, candidates who were referred by current employees are more likely to be hired than non-referred candidates. But referrals often result in candidates who are very similar to those who referred them, effectively boxing out candidates who don’t already have an in at the company.
The college-to-job pipeline has inherent bias as well: Overburdened hiring managers who don’t have time to sort through the pile of job applications often sort based on the college a candidate attended. This results in preference for candidates who graduated from traditionally “elite” or “prestigious” colleges, which, of course, can have their own biases in admissions. Even worse, some companies recruit directly from elite universities, actively homogenizing their workforces.
So how exactly can hiring teams reduce unconscious bias throughout the entire hiring process?
How AI reduces bias and matches your organization with top talent
AI can help reduce hiring bias, pushing the hiring and recruitment process toward a fairer, more diverse future. Integrated into the hiring process, AI platforms can sort resumes by desired qualifications while ignoring demographic data, as sketched below. Conversational AI can be used to collect additional information from candidates, and AI tools can streamline the day-to-day work of a hiring department, freeing up more time for fair consideration.
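To make that concrete, here is a minimal sketch of how blind resume screening might work. The field names, skill lists, and scoring rule are assumptions chosen for illustration; they are not how any particular platform, including GoodJob, is actually implemented.

```python
# Illustrative sketch only: field names and the scoring rule are assumptions,
# not any vendor's actual implementation.

REDACTED_FIELDS = {"name", "photo_url", "age", "gender", "hometown", "alma_mater"}

def redact(resume: dict) -> dict:
    """Strip demographic fields so they can't influence the score."""
    return {k: v for k, v in resume.items() if k not in REDACTED_FIELDS}

def score(resume: dict, required_skills: set) -> float:
    """Score a redacted resume purely on its overlap with required skills."""
    candidate_skills = {s.lower() for s in resume.get("skills", [])}
    required = {s.lower() for s in required_skills}
    return len(candidate_skills & required) / len(required) if required else 0.0

applicants = [
    {"name": "A. Candidate", "alma_mater": "State U", "skills": ["Python", "SQL", "ETL"]},
    {"name": "B. Candidate", "alma_mater": "Ivy U", "skills": ["Excel"]},
]

required = {"Python", "SQL"}
# Rank candidates using only the redacted, job-relevant information.
ranked = sorted(applicants, key=lambda r: score(redact(r), required), reverse=True)
for idx, resume in enumerate(ranked, start=1):
    print(f"Candidate {idx}: match score {score(redact(resume), required):.0%}")
```

The point of the sketch is the ordering of operations: demographic fields are removed before scoring, so the ranking can only reflect job-relevant qualifications.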
But an AI tool is only as effective as the data that goes into it. If the creators of AI solutions aren't careful, bias can sneak its way into AI-supported decisions. Fortunately, efforts such as the OpenAI Charter aim to limit implicit bias in AI, and IBM Research has published principles aimed specifically at mitigating bias in AI solutions. In other words, AI is not yet a silver bullet for completely eliminating bias in hiring, but we're getting there.
Perhaps most promisingly, AI can examine large data sets and identify common traits of successful candidates, giving hiring managers a more reliable heuristic than a candidate's name, education, or even work history. This is a critical opportunity for HR teams who, in trying to counteract unconscious biases, may have started selecting resumes with a focus on hiring for diversity above all else. That's why hiring platforms such as GoodJob combine AI with psychological principles, data science, and years of data and research: the result is an all-in-one hiring solution that reduces unconscious bias and finds top talent for your organization.
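For illustration only, here is a hypothetical sketch of the underlying idea: fit a simple model on non-demographic traits of past hires and inspect which traits it associates with later success. The feature names, toy data, and model choice are assumptions for demonstration, not GoodJob's actual methodology.

```python
# Hypothetical sketch, not GoodJob's method: learn which non-demographic
# traits correlate with success among past hires, using made-up toy data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [years_experience, relevant_certifications, assessment_score]
X = np.array([
    [2, 0, 61],
    [7, 2, 88],
    [4, 1, 75],
    [10, 3, 92],
    [1, 0, 55],
    [6, 1, 83],
])
# 1 = the hire was rated successful after a year, 0 = not (toy labels)
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression(max_iter=1000).fit(X, y)

# Coefficients hint at which traits the model associates with success.
for feature, weight in zip(
    ["years_experience", "certifications", "assessment_score"], model.coef_[0]
):
    print(f"{feature}: {weight:+.3f}")
```

In practice, any such model would need far more data and regular auditing to make sure it doesn't reintroduce bias through proxy variables.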
Your PATH to Better D,E,&I
Are you ready to simplify your hiring process, reduce unconscious bias, and find qualified job-seekers who can fuel your organization's performance?
GoodJob makes it easy for you. No more posting to flooded job boards or paying hefty recruiting fees—and no more worrying about whether unconscious biases are holding your organization back. Simply post a job with a short description and key details, and we’ll both pair you with the top applicants on our platform and help you use the PATH Assessment to vet candidates in your pipeline. Click here to learn more.