How AI can help businesses hire better
Could artificial intelligence unpick the knottiest problems – or create more issues than it solves?
Artificial intelligence is changing how many companies recruit. From helping managers optimise their job ads and sifting through CVs and cover letters to automating the interview process, there are multiple ways the technology can be used to improve hiring for both employers and potential employees.
Yet there are challenges, too. AI can be biased, just like humans, and it lacks transparency. In 2019, US video recruiting company HireVue came under fire for screening out candidates based on facial cues and body language. The feature has since been removed, but it is a reminder of the ethical questions that surround the use of such technology.
So, should your business be using AI to help recruit? And if so, what are the issues you need to watch out for?
AI’s biggest strength is its ability to cut down on repetitive tasks and streamline the mundane side of recruitment. That’s why “parsing” is one of the key ways it is used. This pulls data from a CV into a database, where it is matched against keywords for open positions. The goal is to automate the reading and sorting of CVs, making it faster and easier to find qualified candidates.
AI tools can also match candidates against job descriptions, then use screening questions to ask them about their experience, skills and job interests. The technology can broaden the talent pool, too, by searching databases, social media sites and professional networks for people who match specific roles but may not be on the lookout for a new opportunity.
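To make the idea concrete, here is a deliberately simplified sketch of that kind of keyword matching, written in Python. The roles and keywords are invented for illustration; commercial parsing tools rely on far richer language models and structured CV data, so this is only a toy version of the approach described above.

```python
# Illustrative only: a heavily simplified sketch of CV keyword matching.
# The job titles and keyword lists below are invented for the example;
# real parsing tools use NLP models and structured CV fields.
import re

JOB_KEYWORDS = {
    "data analyst": {"sql", "python", "dashboards", "statistics"},
    "recruiter": {"sourcing", "interviewing", "ats", "onboarding"},
}

def extract_keywords(cv_text: str) -> set[str]:
    """Lower-case the CV text and split it into word tokens."""
    return set(re.findall(r"[a-z]+", cv_text.lower()))

def match_scores(cv_text: str) -> dict[str, float]:
    """Score each open role by the share of its keywords found in the CV."""
    words = extract_keywords(cv_text)
    return {
        role: len(words & keywords) / len(keywords)
        for role, keywords in JOB_KEYWORDS.items()
    }

cv = "Built Python dashboards and wrote SQL reports on hiring statistics."
print(match_scores(cv))  # e.g. {'data analyst': 1.0, 'recruiter': 0.0}
```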
However, concerns have been raised. Critics point to bias amplification, privacy risks, a lack of transparency and a loss of the human touch as weaknesses in the tech.
That is something Chris O’Brien, products and services director at IT services company Advania, is all too aware of. He sees AI as a tool to optimise hiring, rather than something that should be relied on to make hiring decisions.
“We’re very careful when we’re using AI in actual decision-making,” he says. Advania has rolled out Microsoft Copilot to every employee in the company as part of a wider AI strategy. As an example, O’Brien points to discrepancies in how job descriptions were written in different departments before the tool was brought in. Now, AI is used not just to write them, but to optimise them.
“What we tell people to do is ask Copilot to critique them,” he says. “We give them a couple of prompts to essentially ask the AI to critique the role description they have written and look for improvements.
“The AI will often say things like, ‘balanced job description, but you’re asking for evidence of X,’ and, ‘you’re not very specific in this piece of the criteria here, it would benefit from some examples to make it clearer’. Nine times out of 10 it’s spot on and it picks up something that we had missed.”
Another area where AI can help to improve hiring is in the candidate experience. Much of the communication with candidates can be time-consuming, especially in roles with hundreds of applicants. AI can automate much of that, with chatbots, for example, able to respond to specific questions asked by those applying.
Andrew Grill is the author of Digitally Curious: your guide to navigating the future of AI and all things tech. He says: “If you’ve got five open roles, you may have 50 or 100 people applying for those roles. If every single person comes back and says, ‘Where am I in the process?’, you’re never going to get anything done if you respond to all of them. Or maybe they want to ask a little question about holidays and don’t want to annoy anyone. A chatbot would be great here.”
But before you head off and integrate AI into your hiring process, it’s important to press the pause button. Relying on AI to be a silver bullet is wishful thinking.
“The first thing I would recommend is to take a step back,” says Darren Warrener, Microsoft’s global data and AI lead for retail & consumer goods. “Where are the areas that are blindingly obvious and that you spend a tonne of time? You should also ask: ‘What is it we’re trying to achieve? What will have a big impact?’”
That leads to another issue around AI. Because it’s the current business buzzword, too many companies are issuing edicts telling staff to use the technology without thinking strategically about how it fits and where it works best.
“We have a fundamental problem that the people saying we have to do AI don’t understand what it is and what it isn’t,” says Grill. “I’m sure the frazzled HR manager probably has a list of the five systems they would like to get but they’re told, ‘No, we can’t afford that, just do AI’.”
“The key thing for any organisation, big, medium or small,” says O’Brien, “is if you just throw the AI technology out there, you are most likely going to get poor results or fail. We thought that because we were technologists, our people would just gravitate to it. When we’re talking to clients about AI, we’re often saying a quick rule of thumb is to hold back 20 per cent of the overall budget for business change work. If you spend 100 per cent of it on AI licensing and technology, you will fail.”
Implementing AI is also not always simple. Current systems need to work with the technology and complement it. Simply adding a layer of AI may complicate hiring if the basics aren’t fixed first.
“Hiring managers can’t write job descriptions – so there’s flawed data,” says Katrina Collier, a recruitment specialist who wrote the book Reboot Hiring: the key to managers and leaders saving time, money and hassle when recruiting. “Then you’ve got humans who can’t write CVs – that’s bad data and we’re trying to match them. One example: you have people doing tick-box exercises, saying, ‘I don’t want to necessarily say I’ve got dyslexia, because if I say that are they going to rule me in or rule me out?’”
Which tools are best? “Fools rush in,” says Warrener. “Everyone’s going mad digging for gold but ultimately very few are successful in making a real impact or driving real business value. Plus, there’s a new tool every week. How do you go and decipher which ones to go for? My advice would be, stick with the brands you trust as a general principle.”
Collier recommends that AI tools should be used as aids for the people doing the recruiting, in part because otherwise companies risk losing what makes their business stand out. She says: “Use it to do the repetitive tasks and allow the humans to be the creatives they are. You’re seeing that now with tools that write job descriptions. It’ll be vanilla and exactly the same as everybody else’s because it’s AI-fed.”
Of course, no conversation around AI is complete without mentioning bias. Legislative moves, which are expected to go further than the EU’s AI Act, may require companies to be able to explain how their algorithms work. For recruitment, this means being able to explain why a candidate was excluded from consideration or didn’t get the job.
“This needs to be a GDPR [General Data Protection Regulation] moment,” says Warrener. “You have to publish your privacy policies and data protection policy, and you need to do the same thing with AI and make sure that, from a legal perspective, you’re covered. You also need to understand what your partners are doing because, at the end of the day, they’re a representation of your brand.”
Collier points to a high-profile lawsuit brought against Workday by Derek Mobley last year. Mobley took legal action against the software company, alleging he was overlooked for more than 100 job opportunities with companies partnered with Workday due to his race, age (over 40) and experiences with anxiety and depression. The case is continuing.
“As a company, you have a responsibility to make sure that the technology is not biased,” adds Collier. The right AI tool should offer the ability to find out why a candidate has been screened in or out.
As Grill says, it should be able to list the top 20 candidates and also be able to justify each one. “At least then a human can look across that data and see if, for example, it’s because they like red M&M’s. That’s a silly reason and the model’s a bit skewed there,” he says.
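As a rough illustration of what that kind of auditability might look like, the short Python sketch below ranks candidates while recording how much each criterion contributed to the score, so a human reviewer can see the reasons behind a ranking rather than a bare list. The criteria, weights and candidates are entirely invented; real screening tools score on far more, and more contested, signals.

```python
# Illustrative only: a toy ranking that keeps per-criterion contributions
# so a reviewer can audit why each candidate scored as they did.
# Criteria, weights and candidate data are invented for the example.
WEIGHTS = {"years_experience": 0.5, "skills_matched": 0.4, "typo_free_cv": 0.1}

candidates = [
    {"name": "A", "years_experience": 0.8, "skills_matched": 0.9, "typo_free_cv": 1.0},
    {"name": "B", "years_experience": 0.6, "skills_matched": 0.4, "typo_free_cv": 0.0},
]

def score_with_reasons(candidate: dict) -> tuple[float, dict[str, float]]:
    """Return the overall score plus each criterion's weighted contribution."""
    reasons = {c: WEIGHTS[c] * candidate[c] for c in WEIGHTS}
    return sum(reasons.values()), reasons

for cand in sorted(candidates, key=lambda c: score_with_reasons(c)[0], reverse=True):
    total, reasons = score_with_reasons(cand)
    print(cand["name"], round(total, 2), reasons)
```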
Why is this so important? Grill finishes with an analogy. When a competition was run to design the Sydney Opera House, the winning entry was found in the bin. It had been deemed too different, but after all the entries had been sifted through, that difference was what gave the design its edge. Don’t let the gems slip through AI’s cracks.