AI in Hiring: A Diversity Killer or Inclusive Tool?

AI-driven hiring tools, while intended to be efficient and objective, may unintentionally reinforce biases against neurodivergent candidates if they are trained on data drawn mostly from neurotypical individuals. It is crucial to examine the data behind these AI systems and to adapt recruitment practices so that neurodiverse candidates are not inadvertently excluded.

As someone who passionately believes in the power of neurodiversity in the workplace, I am increasingly concerned about the unconscious biases embedded within some AI-driven hiring tools. While these technologies promise efficiency and objectivity, they may inadvertently perpetuate discrimination and undermine efforts to create diverse and inclusive workplaces, not least because training on data drawn largely from neurotypical individuals may well be widespread among these tools.


Clearly, then, the data that trains these AI systems is of crucial significance. If historical hiring data reflects neurotypical biases, the AI tool may learn to replicate these patterns, effectively institutionalising past prejudices. An insight published by Deloitte (2022) noted that “AI hiring systems coded using mostly neurotypical candidates’ data could be biased against applicants with autism due to atypical facial or speech expressions; this could result in a higher probability of neurodivergent individuals being eliminated if the algorithm is given disproportionate weightage in the hiring process.”


Similarly, the Australian Government’s JobAccess program advises employers to check recruitment processes that rely on AI, ensuring that “the data coded into the system isn’t already biased (based solely on neurotypical candidates). This could result in a higher probability of neurodivergent individuals being unfairly eliminated (in the recruitment process).” The words in parentheses are my own.


Moreover, an interesting article by Clyde & Co “examines how neurodiverse candidates can be affected by unconscious bias in the recruiting process and provide[s] our recommendations to assist employers with making recruiting a fairer and more accessible process – which ultimately allows employers to harness untapped talent in the recruitment pool.”

Of course, scrutinising AI systems is not the only way to mitigate unconscious bias in the recruitment process; others include adapting assessments to fit neurodiverse needs and training recruiters on what neurodiversity actually is. But it is important to recognise that AI tools may be portrayed as objective when, inherently, they simply aren’t. Ask your AI product supplier the right questions if you really want diversity in your team! One such question is sketched below.
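To make that question concrete, here is a minimal, purely illustrative sketch (in Python, using pandas) of the kind of check an employer could run, or ask a supplier to run, on anonymised screening outcomes: comparing selection rates across candidate groups, in the spirit of the widely used “four-fifths” rule of thumb. The data, column names and threshold below are hypothetical assumptions for illustration, not a description of any particular vendor’s tool.

```python
# A minimal sketch of a selection-rate audit, assuming you can obtain (or ask
# your vendor for) anonymised screening outcomes with a voluntary,
# self-disclosed neurodivergence flag. All figures here are made up.

import pandas as pd

# Hypothetical screening outcomes: 1 = progressed past the AI screen, 0 = rejected.
outcomes = pd.DataFrame({
    "group":    ["neurotypical"] * 200 + ["neurodivergent"] * 50,
    "selected": [1] * 120 + [0] * 80 + [1] * 18 + [0] * 32,
})

# Selection rate per group.
rates = outcomes.groupby("group")["selected"].mean()
print(rates)

# "Four-fifths" style check: the lower selection rate should be at least ~80%
# of the higher one. A smaller ratio is a common red flag worth raising with
# the tool's supplier (it is an indicator of possible adverse impact, not proof).
ratio = rates.min() / rates.max()
print(f"Selection-rate ratio: {ratio:.2f}")
if ratio < 0.8:
    print("Potential adverse impact - ask the supplier how the model was trained and validated.")
```

Bear in mind that disclosure of neurodivergence is voluntary and often incomplete, so a check like this is a conversation starter with your supplier rather than a verdict on the tool.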

LAST UPDATED: March 28, 2025

CATEGORY: Strategy
