MarketWatch

It's getting harder to find a job - and AI tools aren't making it any easier

By Hannah Erin Lang

Companies are increasingly leaning on AI tools in hiring, even as candidates use the same technology to beat the algorithm and get eyes on their résumés

Praying you've picked the right keywords for your résumé. Applying to hundreds, sometimes thousands, of postings. Automated rejections that appear in your inbox just five minutes after clicking "submit" - if you're lucky enough to hear back from the employer at all.

These days, looking for a job can feel a lot like screaming into the void. As the labor market has cooled, finding work as a white-collar professional has started to feel like a herculean task for many job seekers: Openings are dwindling and employers are asking more of applicants - and nowadays, you've got to beat the algorithm on the other side of the "apply" button.

If you've applied for a job in the past few years, it's likely that an algorithm helped decide whether or not you made it to the next round. Ninety percent of executives surveyed by Harvard Business School said their companies are using software to initially filter and rank candidates. Almost every such system has AI built in, experts tell MarketWatch, and overwhelmed hiring teams are increasingly reliant on the technology.

But employers are no longer holding all the cards when it comes to AI. In the past couple of years, job seekers, too, have gained access to similar tools. Now, most candidates with a computer can use the technology to polish their cover letter, stuff their résumés with the right phrases and even prepare responses to interview questions. In such a competitive market, some workers feel they have no choice.

The result? A job market that, increasingly, boils down to "AI versus AI," as one researcher of the phenomenon put it.

"A lot of [job seekers] know that companies use AI. So there's a sort of joke that it's ... 'may the better AI win,'" said Hilke Schellmann, a journalist, professor at New York University and author of "The Algorithm: How AI Decides Who Gets Hired, Monitored, Promoted, and Fired, And Why We Need To Fight Back."

For applicants, AI is complicating a job market that has already grown much more challenging for workers. For companies, it's not always the best way to find the right candidate. And on both sides of the job market, it might be doing more harm than good: The tools can have a discriminatory impact, sidelining workers already on unequal footing in the workplace.

The proliferation of AI in the hiring process is "a little bit of an arms race," Schellmann said. "A lot of job seekers are really, really frustrated."

More applications, fewer recruiters

Using AI in the hiring process was already common among major companies. But recently, the tools have become even more essential.

Recruiting teams were hit hard by a wave of corporate job cuts in 2023 and 2024, said Liora Alvarez, a former agency and in-house recruiter who now works as a career coach and consultant. That's left many hiring teams stretched thin.

As the job market has slowed, many companies have brought fewer recruiters on board. Postings for human-resources roles were down more than 60% in August from the same month two years earlier, according to ZipRecruiter data.

This reduction in the HR workforce happened just as the volume of applications for open roles reached new heights, said Alvarez, who uses the pronouns they and them.

"A job could easily get 5,600 applications in a couple days," they said, attributing the influx to the state of the job market and the convenience of sites like Indeed and LinkedIn. "That's a lot for one person to go through."

For well-known companies, the numbers can be much higher. Companies like Google and IBM receive millions of résumés a year, Schellmann noted.

Many companies are now using AI tools to scan résumés and sort them for recruiters, sources told MarketWatch. Sometimes, they use AI for much more: testing a candidate's personality by having them play a videogame, as Schellmann writes in her book, or evaluating footage from a video interview.

In the introduction to her book, Schellmann describes one such product she saw presented at a 2018 conference. The technology analyzed a candidate's facial expressions, from the breadth of their smile to the furrow of their brow, to gauge emotions like joy or disgust and predict job success.

As more job seekers become aware of these practices and start to use tools themselves to improve their chances, "we end up with AI talking to AI," Alvarez said. That's a troubling trend for a process that can already feel cold and impersonal to some workers, they said.

Alvarez isn't alone in their concerns. A 2023 survey from the Pew Research Center found that 41% of Americans oppose the use of AI to review job applications. Nearly three-quarters of respondents said they object to the technology making a final hiring decision.

"It doesn't seem very effective," Alvarez said. In fact, the technology can have discriminatory impacts, research has shown, and employers themselves say it screens out qualified applicants.

"We add AI into what often feels like a broken system," they added, "and it kind of feels like slapping a Band-Aid on a sinking ship."

Using AI to find a job

As recruiting teams increasingly rely on AI to rank résumés and sort through candidates, job seekers are using the same technology to try to increase their chances of landing a job, or even just an interview.

Some 58% of job seekers have used AI tools in their search, according to a July survey of candidates in 12 countries by the software-search site Capterra.

There are AI services that will generate résumés, help you prepare for interviews and even send out multiple applications with the click of a few buttons. Many of the services are free, and using them can be as easy as typing a question into ChatGPT.

Some managers think the tools are giving candidates an unfair advantage. In a survey of Australian business leaders commissioned by the staffing company Robert Half, nearly one in three said they thought using generative AI to craft job-application materials was somewhat or totally unacceptable.

But the majority of candidates are simply using the tools to polish their own work on something like a résumé or cover letter, said Dani Herrera, a hiring consultant who worked in recruiting for more than a decade - and there's nothing unethical about that.

"They're using AI to improve some of their chances," Herrera said. "It's exactly what people have been doing manually for decades."

Though some job seekers feel pressured to use AI tools, the technology doesn't always give them much of an edge.

One 29-year-old job seeker told MarketWatch she thought it would be much easier to find a job when she moved from the East Coast to Las Vegas, Nev., last year. Instead, she applied for hundreds of roles and heard back from only a couple of employers - including one who simply informed her the company was no longer hiring.

Eventually, the job seeker, who asked that her name not be published as she is still searching for full-time work, tried using generative AI to optimize her résumé.

The AI-edited document didn't seem particularly well-suited for the writing and communications roles she was applying for, she said - and ultimately, it didn't win her any interviews.

"If I sent a résumé with zeros on it, with no content, I would get the same result as I'm getting," she said. "I'm looking at it like, 'OK, well, maybe the algorithm that is hiring may put this résumé through, but it's not a better résumé. I don't feel better submitting this.'"

Potential legal risks of AI in hiring

AI isn't just adding another hurdle, or another variable, to a complicated and challenging job market. It's also raising legal and ethical concerns for companies.

Civil-rights advocates have criticized the use of AI in hiring, arguing that the technology will inherently reproduce labor-market inequities that hurt women, people of color and workers with disabilities.

"Workers often have no idea that an AI tool is being used, let alone how it works or that it might be discriminating against them."Olga Akselrod, a senior staff attorney at the American Civil Liberties Union

AI tools predict future outcomes based on past data that's been used to train them. But in the workplace, that historical data is often itself reflective of systemic disparities, explained Olga Akselrod, a senior staff attorney at the American Civil Liberties Union.

"Algorithmic tools are trained to make decisions based on historical data - based on what has happened before," Akselrod said. "That data is going to bake in any biased human decisions and biased systems."

That flaw is so entrenched that even algorithms trained to avoid bias can produce discriminatory outcomes.

She pointed to one infamous example at Amazon several years ago, when an algorithmic tool the company built "learned" that male candidates were preferable to female ones, due to the higher volume of male résumés received by the company. The tool penalized résumés that referenced "women's" activities, like running a women's club, Reuters reported in 2018.

Amazon told Reuters the tool was never used to evaluate candidates, and the company scrapped the technology.

There are more recent examples of AI discrimination in hiring, as well. One 2023 paper from researchers at Penn State University found that many AI models used by companies to analyze text demonstrated a bias against people with disabilities - with the algorithms tending to characterize sentences with disability-related terms as negative.

A lack of transparency between companies and workers compounds the discriminatory implications of AI, Akselrod said.


Copyright (c) 2024 Dow Jones & Company, Inc.
