Funnies

AI Resume Screener Rejects Candidate for Being 'Overqualified' (The Job Required 10 Years Experience)

December 22, 2025
3 min read

AI resume screening is supposed to solve the problem of recruiters manually reviewing hundreds of applications. It identifies qualified candidates quickly, filters out unqualified applicants, and streamlines the hiring process. It's definitely not supposed to reject candidates for having exactly the qualifications the job requires.

But that's exactly what happened when a senior software engineer with 12 years of experience applied for a role requiring "10+ years experience" and got an instant rejection email citing "overqualification."

The candidate posted the rejection email on Twitter with the caption: "Applied for a senior role requiring 10+ years experience. I have 12 years. AI rejected me for being overqualified. Make it make sense."

It went viral. Recruiters, hiring managers, and job seekers all piled on with their own AI screening horror stories. The thread got 50,000 likes and is still growing.

The Rejection Email

The automated rejection email was a masterpiece of algorithmic confusion:

"Thank you for your interest in the Senior Software Engineer position. After careful review, we have determined that your qualifications exceed the requirements for this role. We believe you would be better suited for more senior positions within our organization. We encourage you to monitor our careers page for opportunities that match your experience level."

Companies use AI screening tools to filter candidates quickly, but "careful review" is a generous description of a process that took 14 minutes from application submission to rejection.

The candidate responded publicly: "The job posting literally says '10+ years required.' I have 12 years. How is that overqualified? The plus sign means 'or more,' right? Did the AI fail basic math?"

How This Happens

Most AI resume screening tools evaluate applications with keyword matching, experience thresholds, and scoring algorithms. A rigid rule set can filter out great candidates simply because they don't fit the template.

In this case, the company apparently programmed "overqualification" rules to filter candidates with significantly more experience than required, assuming they'd get bored, demand too much money, or leave quickly for better opportunities.

This kind of over-reliance backfires when automated screening filters out exactly the people you want. Someone likely set a rule: "Flag candidates with more than 15 years of experience as overqualified." But nobody thought about candidates with 12 years applying for roles requiring 10+ years.

The AI sees: Required = 10 years. Candidate = 12 years. Difference = 2 years. Threshold for overqualification flag = 5 years above requirement. 12 - 10 = 2, which is less than 5. So why did it reject?

Because the AI was apparently also trained to detect "flight risk" candidates who might be applying for roles below their experience level. The algorithm decided 12 years of experience applying for a role that accepts 10+ years means the candidate is probably desperate, between jobs, or will leave as soon as something better comes along.
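The failure mode described above can be sketched in a few lines. This is a purely hypothetical reconstruction, not the vendor's actual code: the rule names, thresholds, and structure are all assumptions made to illustrate how two independently reasonable-looking filters can contradict each other.

```python
# Hypothetical sketch of stacked screening rules. All names and thresholds
# are illustrative; no real vendor's implementation is implied.

REQUIRED_YEARS = 10          # from the "10+ years" job posting
OVERQUALIFIED_MARGIN = 5     # only flag candidates 5+ years above requirement

def overqualification_rule(candidate_years: int) -> bool:
    """The rule everyone thinks is running. 12 years passes this check,
    since 12 - 10 = 2, which is below the 5-year margin."""
    return candidate_years - REQUIRED_YEARS >= OVERQUALIFIED_MARGIN

def flight_risk_rule(candidate_years: int) -> bool:
    """A second, stricter heuristic: ANY experience above the stated minimum
    is treated as 'applying below their level'. This is the rule that fires."""
    return candidate_years > REQUIRED_YEARS

def screen(candidate_years: int) -> str:
    # Rules are OR-ed together, so the strictest one wins.
    if overqualification_rule(candidate_years) or flight_risk_rule(candidate_years):
        return "reject: overqualified"
    return "advance to review"

print(screen(12))  # the 12-year candidate is rejected despite passing the margin check
```

The bug isn't in either rule alone; it's that the flight-risk heuristic quietly makes the overqualification margin irrelevant for any open-ended "N+ years" posting.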

That's genuinely sophisticated AI reasoning. It's also completely wrong.

The Candidate's Background

The candidate wasn't desperate or between jobs—they were employed at a major tech company and interested in the new role because it offered remote work, better tech stack, and more interesting problems.

AI tools trained without proper guardrails will make whatever calculation they think is most accurate, regardless of whether that logic makes sense to humans.

The candidate had 12 years of relevant experience in the exact technologies the job required. Their resume included successful projects, leadership roles, and technical achievements. On paper, they were the ideal candidate.

The AI rejected them in 14 minutes without human review.

The Company's Response

The company's recruiting team only discovered the rejection when the Twitter thread went viral and someone tagged them. "Wait, we rejected this person? Let me check... oh no. Oh no no no."

They sent a profuse apology email: "We sincerely apologize for the error in our screening process. Your application was incorrectly flagged by our AI system. We would very much like to reconsider your candidacy and invite you to interview."

The candidate's response: "Thanks, but I'm no longer interested. If your hiring process rejects qualified candidates automatically and only reconsiders them when they go viral on Twitter, that tells me everything I need to know about how you operate."

Ouch. But fair.

Companies using AI recruitment tools need to audit for bias regularly and be prepared to defend their use of AI in hiring. When the first time you review an AI rejection is after it becomes a PR disaster on social media, you've failed at AI governance.

The Broader Pattern

This isn't an isolated incident. Other candidates shared their "overqualified" rejection stories:

  • Applied for entry-level role requiring 0-2 years experience with 1 year experience. Rejected for overqualification.
  • Applied for mid-level role requiring 5+ years with 6 years. Rejected for overqualification.
  • Applied for senior role requiring "10+ years" with 11 years. Rejected—you guessed it—overqualification.

One candidate was rejected from a "junior" position requiring 5 years of experience. That job posting was already nonsense (junior roles shouldn't require 5 years), but the AI made it worse by rejecting candidates with 6 years for being overqualified.

The pattern suggests many companies are using overly aggressive "overqualification" filters without understanding how those filters interact with their actual job requirements.

The "Overqualification" Myth

The fear of hiring overqualified candidates is based on assumptions that don't hold up under scrutiny:

"They'll demand too much money": Maybe, but that's what salary negotiations are for. If compensation doesn't align, the candidate declines the offer. You don't need AI to pre-reject them.

"They'll leave quickly": Some might. Some won't. Rejecting all candidates with more experience than the minimum requirement eliminates potentially great long-term hires.

"They'll get bored": People with 12 years of experience applying for roles requiring 10+ years aren't looking for unchallenging work—they're looking for roles that match their current skills. The job matches their experience level.

Companies that filter out overqualified candidates automatically are eliminating a significant portion of their talent pool based on assumptions, not evidence.

What Happened Next

The company issued an internal memo: "Effective immediately, all AI-flagged rejections for overqualification require human review before sending." That's how it should have been configured from the start.

They also adjusted their AI settings to eliminate overqualification filters for roles where experience ranges include "10+ years" or similar open-ended requirements. If you're willing to hire someone with unlimited years of experience, you can't then reject them for having too much.

The candidate who sparked the viral thread declined the company's interview offer but did receive multiple offers from other companies who saw the Twitter thread and reached out directly. The rejection might have been the best thing that happened to their job search.

The Vendor's Defense

The AI screening software vendor released a statement: "Our platform allows companies to configure filtering rules based on their specific hiring needs. The overqualification threshold was set by the client, not our default settings."

Translation: we built the tool, but the company used it poorly. Which is technically true but also misses the point that AI tools should probably have guardrails preventing obviously counterproductive configurations.

If your AI platform lets companies set rules that reject candidates who exactly match the job requirements, maybe your platform needs better safeguards.
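One such safeguard could be a configuration-time check that refuses to combine an overqualification filter with an open-ended experience requirement. This is a hypothetical sketch of what that validation might look like; the function name and warning text are assumptions, not any real platform's API:

```python
# Hypothetical guardrail a screening platform could run when a client
# saves a filter configuration. Names and messages are illustrative.
import re

def validate_config(experience_requirement: str,
                    overqualification_filter: bool) -> list[str]:
    """Return a list of warnings about contradictory filter settings."""
    warnings = []
    # "10+ years", "5+ yrs", etc. are open-ended: any experience level qualifies.
    open_ended = bool(re.search(r"\d+\s*\+", experience_requirement))
    if open_ended and overqualification_filter:
        warnings.append(
            "Requirement is open-ended ('N+ years'): an overqualification "
            "filter will reject candidates the posting explicitly invites."
        )
    return warnings

print(validate_config("10+ years", True))   # one warning
print(validate_config("5-7 years", True))   # no warnings: bounded range
```

A check like this wouldn't fix bad hiring assumptions, but it would at least surface the contradiction before it rejects anyone.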

The Lesson Nobody Will Learn

Companies should review AI-filtered rejections before sending them. They should test their filtering rules with real candidate data. They should question whether "overqualification" filters make sense for roles with open-ended experience requirements.

Will they do this? Some will. Most won't, because AI screening saves time and manual review takes time, and companies consistently choose efficiency over accuracy until efficiency creates viral PR disasters.

Candidates will keep getting rejected by algorithms that don't understand context. Recruiters will keep scrambling to apologize for "technical errors." And somewhere, an AI is rejecting a perfectly qualified candidate right now for reasons that would make no sense to any human.

The math still isn't mathing. But at least the Twitter thread was entertaining.

The final comment on the viral thread summarizes it perfectly: "AI doesn't have common sense. It has optimization algorithms. And when you optimize for something stupid, you get stupid results efficiently."

Reject qualified candidates speedrun: AI edition.

AI-Generated Content

This article was generated using AI and should be considered entertainment and educational content only. While we strive for accuracy, always verify important information with official sources. Don't take it too seriously—we're here for the vibes and the laughs.