In recruitment, Artificial Intelligence (AI) has gained plenty of traction. Many innovators and technology leaders invest in its potential to transform the way we hire. Artificial intelligence promises to mimic human intelligence in order to complete recruiting tasks and make important hiring decisions, ultimately aiming to complete workloads faster and reduce labour costs.
Incorporating AI into a hiring strategy has clear performance benefits and offers a solution for organisations looking to offload work onto technology. But it is not without its risks. Employers have to trust artificial intelligence to undertake tasks effectively without human input, which means giving up a certain level of control.
Has AI lived up to its expectations?
Whilst many talent acquisition teams implement AI to solve recruiting problems, candidate perception isn’t always positive. According to a recent study, AI feels impersonal to many candidates, with 90% stating they favour human interaction over a robot. This highlights a clear need for human recruiters to build strong candidate relationships through effective communication and a personalised hiring experience.
Recruiter pushback
The messaging and connotations that surround artificial intelligence in recruitment can be a hindrance for technology providers. Because many recruiters associate AI with a ‘takeover’ or ‘replacement’ of their roles, they can be hesitant to support its implementation.
Artificial intelligence undertakes hiring tasks and makes decisions so humans don’t have to. But dehumanising the recruitment process creates uncertainty among workers about the future of their roles.
Lack of trust
Recruitment and hiring are very people-centric roles. Many of a recruiter’s daily duties require effective communication between themselves and candidates, and relying heavily on AI-powered solutions can damage both the hiring strategy and the candidate experience. Research has found that 76% of job seekers trust a person more than AI to help guide their job search.
Many organisations implement AI to eliminate bias and unconscious human motives in hiring. But can AI learn bias? Bias has been proven to creep into hiring algorithms in different ways. Take, for example, Amazon, which scrapped its AI tool in 2018 because it allegedly showed a bias against women. The algorithm picked up on female-associated terms within job applications and taught itself to rank those applicants lower than others whose applications contained masculine-associated terms.
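To make the mechanism concrete, here is a minimal, hypothetical sketch (not Amazon’s actual system) of how a model trained on biased historical hiring data can learn to penalise a gender-associated term, even though gender itself is never an input. The data, tokens, and simple perceptron are all illustrative assumptions.

```python
# Toy illustration: each CV is a set of tokens; label 1 = hired, 0 = rejected.
# The historical labels reflect a biased process: CVs containing the token
# "womens" (e.g. "women's chess club") were mostly rejected.
history = [
    ({"python", "leadership"}, 1),
    ({"python", "sql"}, 1),
    ({"leadership", "sql"}, 1),
    ({"python", "womens"}, 0),
    ({"sql", "womens"}, 0),
    ({"leadership", "womens"}, 0),
]

vocab = sorted({tok for cv, _ in history for tok in cv})
weights = {tok: 0.0 for tok in vocab}
bias = 0.0

# Train a simple perceptron on the biased history.
for _ in range(20):
    for cv, label in history:
        score = bias + sum(weights[t] for t in cv)
        err = label - (1 if score > 0 else 0)
        bias += 0.1 * err
        for t in cv:
            weights[t] += 0.1 * err

# The "womens" token ends up with a negative weight: the model has
# absorbed the historical bias as if it were a genuine quality signal.
print(weights)
```

The model never sees a candidate’s gender, yet it reproduces the discrimination baked into its training data, which is exactly why auditing training data matters as much as auditing the algorithm.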
Transfer of power
Handing over tasks that were traditionally human requires a transfer of power to AI. Hiring decisions are placed in the hands of technology, and it can be difficult to undo or ‘fix’ steps that humans disagree with. Undertaking 100% of recruiting tasks with AI is a big risk: replacing humans with technology may provide quick results but can be damaging in the long term.
What’s the alternative? Hybrid hiring
Keep the human at the heart of recruitment
It’s inefficient to let technology handle every recruiting task, and equally inefficient to leave them all to humans. Instead, the key is to combine the strengths of both.
That’s why many organisations use augmented intelligence to improve, not replace, human intelligence. This approach enables organisations to incorporate new technologies and innovations while ensuring humans are kept at the heart of all decisions. Augmented tools help tackle repetitive, back-office recruitment tasks, freeing recruiters to invest more time in candidates and key decision-making.
We want to hear your thoughts! Let us know what you think in the comments.