AI Isn’t Replacing Recruiters. It’s Replacing Bad Recruiting.
Companies are laying off recruiters because of AI. Not because AI is doing recruiting work. Because executives think it will, eventually, and they want the cost savings now. The data on how that’s going is already in, and it’s ugly.
A Harvard Business Review study published in January 2026 surveyed over 1,000 global executives. Sixty percent of organizations have already reduced headcount in anticipation of AI’s future impact. Only 2% made those cuts based on actual AI implementation results. HBR’s term for it is “AI redundancy washing,” which means attributing layoffs to AI because it plays well with investors, not because the technology is doing the job.
We run IT and technical staffing at KORE1 out of Irvine. We see this from both sides. Clients asking whether they still need recruiters. Candidates getting ghosted by automated systems that never forwarded their resume to a human. And the uncomfortable reality that the companies investing the most in AI hiring tools are, by every measurable standard, hiring worse than they were two years ago.

The Metrics Went Backward
Not sideways. Backward.
According to SHRM’s 2025 benchmarking data, cost-per-hire has increased during the AI adoption wave. Executive cost-per-hire is up 113% since 2017 and 21% just from 2022. AI adoption in HR tasks climbed to 43% in 2025, a 65% jump from 26% the year before. The tools got faster. The outcomes got more expensive.
Time-to-hire went up too. The whole pitch was speed. Faster screening, faster shortlisting, faster everything. Companies are screening faster. They’re also screening out qualified people and then spending months trying to find replacements who match criteria that a human would have recognized as flexible from the first conversation.
SHRM found that 19% of organizations using AI in hiring confirmed their tools screened out qualified candidates. Not “might have.” Verified, documented, admitted to by the companies themselves. People who had the skills, the experience, and the availability, and got bounced because a parser couldn’t read their resume format or dinged them for using “CI/CD” instead of “continuous integration.”
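The terminology trap is easy to reproduce. Here's a minimal sketch, using hypothetical keywords and resume text rather than any real screening tool's logic, of how verbatim keyword matching rejects a qualified candidate that a synonym-aware pass would accept:

```python
# Hypothetical required keywords for a DevOps role (illustration only).
REQUIRED_KEYWORDS = {"continuous integration", "kubernetes", "terraform"}

def naive_screen(resume_text: str) -> bool:
    """Pass only if every required keyword appears verbatim."""
    text = resume_text.lower()
    return all(kw in text for kw in REQUIRED_KEYWORDS)

# A synonym-aware pass normalizes equivalent terms before matching.
SYNONYMS = {"ci/cd": "continuous integration", "k8s": "kubernetes"}

def synonym_screen(resume_text: str) -> bool:
    text = resume_text.lower()
    for alias, canonical in SYNONYMS.items():
        text = text.replace(alias, canonical)
    return all(kw in text for kw in REQUIRED_KEYWORDS)

resume = "Built CI/CD pipelines, managed K8s clusters, wrote Terraform modules."

print(naive_screen(resume))    # False: "CI/CD" never matches "continuous integration"
print(synonym_screen(resume))  # True: same candidate, same skills
```

Same resume, same skills, opposite outcomes. Real ATS filters are more elaborate than this, but the failure mode is the same: the match is on strings, not on what the candidate can actually do.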
| What AI Promised | What Actually Happened | Source |
|---|---|---|
| Lower cost-per-hire | Cost-per-hire increased, exec CPH up 113% since 2017 | SHRM 2025 |
| Faster hiring | Time-to-hire also increased | SHRM 2025 |
| Better candidate matching | 19% of orgs confirmed AI screened out qualified candidates | SHRM 2025 |
| Improved candidate experience | 65% of HR pros say AI drives candidate ghosting | LiveCareer 2025 |
| Bias reduction | AI screening favored older male candidates over equally qualified women and younger applicants | Stanford, Oct 2025 |
Gartner’s Jamie Kohn, senior director of research, didn’t sugarcoat it: “We are trying to make the process better, but at the same time, the technology is making the process worse.” She said that in a published SHRM interview in 2025, not as an offhand remark.
Candidates Are Noticing. And Leaving.
A LiveCareer survey of nearly 1,000 HR professionals found that 65% believe AI and automation tools are contributing to candidates dropping out of the hiring process. Eighty-eight percent say they’ve lost contact with candidates midway through. And 71% say the problem has gotten worse in the past 12 months, which tracks with the timeline of when most companies rolled out generative AI tools for hiring.
Worse, 66% of job seekers say they’d skip the application entirely if they knew a company was using AI to make hiring decisions. Two-thirds of your talent pool, gone before you even see their resume. The system you built to find people faster is the reason those people aren’t showing up.
We had a client in Q4 last year, mid-size SaaS company in Orange County, running an AI-first hiring stack. ATS with AI screening, automated scheduling, chatbot for candidate questions, AI-scored video interviews. The whole thing. They came to us after four months of trying to fill a senior DevOps role. Their AI had processed 340 applicants. Surfaced 12 to the hiring manager. The manager rejected all 12. When we reviewed the full applicant pool manually, we found six candidates who were strong fits. Three of them had been filtered out because their resume formatting confused the parser. Two because they used different terminology for the same skills. One because she’d taken a two-year career break that the AI scored as a negative signal.
We placed one of those six in 18 days.

What Meta and ProPublica Tell Us About Where This Is Headed
Meta laid off 700 employees on March 25, 2026. Recruiting was one of the departments hit. Less than 24 hours before announcing those cuts, the company disclosed a stock compensation program that could pay six top executives up to $921 million each over the next five years, contingent on Meta reaching a $9 trillion valuation by 2031.
Fire the people who find the people. Pay the executives who decided to fire them almost a billion dollars each. That’s the math.
The same week, ProPublica’s newsroom authorized the first U.S. newsroom strike with AI protections as a central demand. The vote wasn’t close: 92% authorized, with 99% of the union participating. They want a ban on AI-related layoffs, the right to decline AI tools without being disciplined, and mandatory labeling of AI-generated content. Management’s counter-offer was “regular discussion and training.” The union called that insufficient. Hard to argue.
Recruiting isn’t journalism, obviously. But the underlying anxiety is identical. And for recruiting specifically, it signals something the data already showed: people don’t trust AI to evaluate them, and the professionals who build hiring systems don’t trust it to replace their judgment. The backlash is real and it’s accelerating.
Where AI Actually Works in Recruiting
None of this means AI is useless. It means it’s being used wrong.
At KORE1, we use AI in three places where it genuinely makes us faster without degrading the process:
- Sourcing. Scanning LinkedIn, job boards, and internal databases to surface candidates who match technical and experiential criteria. A search that took a recruiter four hours now takes twenty minutes. The AI surfaces the pool. A human reviews it.
- Scheduling. Coordinating interviews across three calendars, two time zones, and a hiring manager who’s in back-to-back meetings until Thursday. AI eliminates the email ping-pong entirely. Nobody misses this part of recruiting.
- Market intelligence. Pulling compensation data, analyzing talent supply in a specific metro area, benchmarking a job description against similar active postings. AI aggregates faster than any analyst could. The recruiter interprets what it means for this specific client, this specific role, this specific candidate.
Three things. The rest stays human.

Because here’s what AI cannot do in a recruiting context, and I don’t mean “can’t do yet” in the way tech executives say it when they mean “give us another funding round.” I mean structurally cannot do.
AI can’t tell when a candidate is genuinely excited about a role versus performing enthusiasm because they need a paycheck. A recruiter who’s been doing this for a decade catches that in the first five minutes of a call. AI can’t navigate a counteroffer conversation where the candidate’s current employer just offered a 15% raise and a title bump, and the only reason the candidate is still considering leaving is that their manager micromanages every code review. That’s a trust conversation. It runs on reading between the lines, on knowing when “I’m open to hearing more” means “I’m leaving” versus “I’m being polite before I say no.”
AI can’t evaluate cultural fit. And before someone objects that cultural fit is a bias vector, yes, it can be. But done correctly, with a recruiter who understands the team dynamics and the working style of the hiring manager, cultural fit evaluation is the difference between a placement that lasts three years and one that lasts three months. Ask anyone who’s recruited long enough and they’ll tell you the same story. Resume was perfect. The skills matched. The candidate left in 90 days because nobody told them the team works in-office four days a week and the “flexible” in the job posting meant you could leave at 4:30 on Fridays.
The Companies Getting This Right
They’re not the ones spending the most on AI. They’re the ones being specific about where AI fits and where it doesn’t.
A Fortune report from March 2026 found that CFOs privately admit AI-driven layoffs will be nine times higher this year than in 2025. The cuts are accelerating. But the productivity gains that were supposed to justify them haven’t materialized at scale. Nobody’s saying the productivity gains won’t come eventually. But the layoffs are running on investor timelines, not implementation timelines, and in recruiting that gap shows up fast because open reqs don’t pause while your AI vendor ships the next update.
SHRM’s own conclusion, in their report titled “Recruitment Is Broken,” was blunt. Nichol Bradford, SHRM’s chief innovation officer, said: “The AI arms race does not benefit either side.”
If you’re a company rethinking your workforce planning strategy right now, the worst thing you can do is cut your recruiting function and assume AI will fill the gap. The data says it won’t. Cost goes up. Time goes up. Candidate quality goes down. The best people refuse to apply.
If you’re a startup building your first hiring pipeline, this is exactly why human staffing partners still matter. Not because AI is bad. Because AI without a human making the final judgment call produces worse outcomes than the process it replaced.
And if you’re already seeing the effects, if your pipeline is thinner, your time-to-fill is longer, and the candidates who do make it through aren’t sticking around, it’s worth asking what your succession plan looks like for a recruiting function you’ve been quietly hollowing out.
Things Companies Ask Us About AI and Hiring
Should we use AI in our recruiting process at all?
Short answer: yes, for the right things. Sourcing, scheduling, market data. Keep it away from candidate evaluation and final decisions. SHRM’s finding isn’t a rounding error: nearly one in five organizations using AI in hiring confirmed their tools rejected qualified candidates, bounced by a system that can’t tell the difference between a career break and a red flag.
Our AI screening tool claims it reduces bias. Does the research back that up?
Stanford’s October 2025 study found the opposite. AI resume-screening tools gave systematically higher ratings to older male candidates over equally qualified female and younger candidates. The training data reflects historical hiring patterns, and historical hiring patterns are where the bias lives. You’re not reducing it. You’re automating it at scale.
How much is AI actually saving companies on recruiting costs?
On average? Nothing. SHRM’s 2025 data shows cost-per-hire went up during the AI adoption wave, not down. Executive cost-per-hire increased 21% from 2022 alone. The tools themselves cost money. The integrations cost money. And when the tools screen out good candidates and you have to start over, that costs the most of all. One of our clients burned through $28,000 in four months before calling us. We filled the role for less than half that.
If AI isn’t working for hiring, why are so many companies using it?
Because 60% of organizations cut headcount based on what AI might do, not what it’s doing. HBR calls it “AI redundancy washing.” It tells investors a story about efficiency and automation. The problem is the story hasn’t matched reality yet, and the people who would have told the executives that, the recruiters, were the first ones let go.
What’s the right staffing model if we want AI and human recruiting together?
Use AI for sourcing and scheduling. Full stop. Have humans review every candidate an AI surfaces before they get rejected. Have a recruiter run every interview debrief. Have someone with judgment and context handle offer negotiations and candidate communications. That’s the model we run at KORE1, and the reason it works is that we’re not asking AI to do things it’s structurally bad at. Reach out to our team if you want to see how that looks for your specific roles.
