
Can AI eliminate unconscious bias in recruitment?

Most of us appreciate how artificial intelligence can simplify tasks. It’s a satisfying feeling when your phone suggests a great restaurant while you’re travelling in a foreign city. Meals are of course one thing, but when it comes to crucial choices, such as unbiased recruiting, can AI be just as reliable?

A key issue for employers when they recruit is assessing whether a person can carry out the requirements of the job. They must clearly outline the essential duties of all positions when hiring, being careful not to confuse ‘abilities’ with ‘characteristics’.

Being unbiased means employers select the best person for the job, making no assumptions about what people can and can’t do, or whether they will ‘fit in’, based on their background.

During the early stages of recruitment, it can be easy for biases to creep into the process. For instance:

  • Advertising jobs – the Australian Human Rights Commission calls out commonly used phrases such as ‘join a dynamic team’ or ‘seeking a mature, experienced professional’ as potentially discriminatory against certain age groups (a simple check for such phrases is sketched after this list).
  • Shortlisting candidates – forming an opinion about the candidate’s suitability based on their name or geographic location and what that might tell you about their cultural or racial background.
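
To make the first point concrete, here is a minimal sketch (in Python) of how a draft job ad could be screened against a list of phrases like those the Australian Human Rights Commission calls out. The phrase list and function name are illustrative assumptions only; dedicated tools, discussed later in this article, do this far more thoroughly.

    # A minimal sketch: scan a draft job ad for phrases that may read as
    # age-coded, such as those the Australian Human Rights Commission calls out.
    # The phrase list and function name are illustrative assumptions only.

    FLAGGED_PHRASES = [
        "join a dynamic team",
        "mature, experienced professional",
    ]

    def flag_phrases(ad_text: str) -> list[str]:
        """Return any flagged phrases found in the ad text (case-insensitive)."""
        lowered = ad_text.lower()
        return [phrase for phrase in FLAGGED_PHRASES if phrase in lowered]

    if __name__ == "__main__":
        draft = "Join a dynamic team delivering major infrastructure projects."
        for phrase in flag_phrases(draft):
            print(f"Consider rewording: '{phrase}' may discourage some age groups.")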

Could this be where an unbiased selector, such as AI, steps in to eliminate any inherent prejudices?


Where AI has an advantage

Speed and accuracy. Machines outperform humans at processing and analysing vast amounts of data. While AI-enabled systems mimic human behaviour when consuming information from multiple sources, they can gather, process and record data far more efficiently and in much larger volumes. This means a larger candidate pool can be considered for each role, allowing for better diversity in the shortlist.

Perspective shift. Unlike humans, who can at times respond on gut feeling, AI responds based on the data it is given. In addition, AI’s ability to find patterns within data means it can disrupt conventional thinking. Much as Deep Blue defeated Kasparov by making moves he didn’t anticipate, AI-enabled systems can explore alternative avenues when selecting candidates.


Where AI falls short

In theory that may all sound convincing, but putting AI to work is a lot more complex. For instance, when Amazon created an AI-based recruiting system, it failed to shield the system from bias. The system was not explicitly taught to rate male candidates as the ideal fit; rather, it inferred this from historical hiring criteria that put female candidates at a disadvantage.

Ultimately, humans build the parameters that define AI’s decision-making. If bias creeps in, the system’s integrity is inevitably compromised.


Use AI to support a bias-proof recruitment process

For now, it appears the best approach is not to view AI as superseding all human participation in recruiting. Rather, it is best used as a support mechanism for unbiased recruitment.

Use the technology to home in on specific processes that require objectivity. Software that checks your language, such as Textio, can help edit out words or phrases that might limit the type of people who respond to job ads. There are also systems that assist by removing details, such as names and geographical information, that may trigger bias, as sketched below.
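
As an illustration of that second kind of tool, the Python sketch below shows what ‘blind screening’ might look like: identifying details such as the candidate’s name and suburb are stripped from the record before a reviewer sees it. The Candidate structure and field names are hypothetical, assumed purely for demonstration rather than drawn from any particular product.

    # A minimal sketch of 'blind screening': fields that can trigger bias
    # (name, suburb) are removed from a candidate record before a reviewer
    # sees it. The Candidate structure and field names are hypothetical.
    from dataclasses import dataclass, asdict

    @dataclass
    class Candidate:
        name: str
        suburb: str
        years_experience: int
        skills: list[str]

    # Details a reviewer should not see during shortlisting.
    REDACTED_FIELDS = {"name", "suburb"}

    def blind_view(candidate: Candidate) -> dict:
        """Return a copy of the candidate record without bias-triggering fields."""
        return {k: v for k, v in asdict(candidate).items() if k not in REDACTED_FIELDS}

    if __name__ == "__main__":
        applicant = Candidate(
            name="Jane Citizen",
            suburb="Example Heights",
            years_experience=7,
            skills=["project management", "stakeholder engagement"],
        )
        # The shortlisting panel only ever sees the blinded record.
        print(blind_view(applicant))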

Until completely bias-proofed technology exists, the onus still rests upon managers to use all possible options to ensure unbiased recruitment.
