Funnies

AI Screening Tool Rejects Company's Own CEO for Job Posting (85% Skills Mismatch)

December 15, 2025
3 min read

Nothing says "our AI recruiting tools are working perfectly" quite like having your AI screening software reject your own CEO as unqualified for a job at the company he founded and currently runs.

Welcome to the story of how DataCore Solutions' shiny new AI recruiting platform rated their CEO as an 85% skills mismatch for a VP-level role—and auto-rejected him before any human ever saw the application.

The Great Internal Transfer Experiment

DataCore Solutions is a mid-size B2B software company based in Austin. In November 2025, they implemented a new AI-powered applicant tracking system designed to screen candidates objectively and reduce hiring bias.

The system uses machine learning to analyze resumes against job descriptions, scoring candidates on skills match, experience relevance, and predicted job performance. Candidates scoring below 70% are automatically rejected to "save recruiter time reviewing unqualified applicants."
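The pipeline described above is easy to sketch. What follows is a minimal, hypothetical reconstruction, not DataCore's actual system — the sub-score names, the equal weighting, and the 70% cutoff are assumptions based on the article's description:

```python
# Hypothetical sketch of a threshold-based screener like the one described.
# Sub-score names and the 70% auto-reject cutoff come from the article;
# the equal weighting is an assumption.

AUTO_REJECT_THRESHOLD = 0.70

def overall_score(skills_match: float, experience_relevance: float,
                  predicted_performance: float) -> float:
    """Combine sub-scores into one overall score (equal weights assumed)."""
    return (skills_match + experience_relevance + predicted_performance) / 3

def screen(candidate_scores: dict) -> str:
    score = overall_score(candidate_scores["skills_match"],
                          candidate_scores["experience_relevance"],
                          candidate_scores["predicted_performance"])
    # Anything below the threshold never reaches a human reviewer.
    return "human review" if score >= AUTO_REJECT_THRESHOLD else "auto-reject"

# The CEO's breakdown from the rejection email later in this story:
torres = {"skills_match": 0.15, "experience_relevance": 0.28,
          "predicted_performance": 0.42}
print(screen(torres))  # auto-reject
```

Note that averaging 15%, 28%, and 42% gives roughly 28% — which is exactly the overall score the system handed its own CEO. The human never enters the loop.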

Great in theory. Absolutely hilarious in practice.

As part of their rollout testing, DataCore's head of HR suggested that leadership team members submit test applications to internal job postings to see how the system worked. You know, quality assurance. Make sure everything's functioning properly before going live.

The CEO, Michael Torres, applied to an open VP of Product role. He figured he'd get a high match score—after all, he'd literally built the company's product strategy for seven years and still oversees the product team.

The AI screening tool had other ideas.

The Rejection Email

Three hours after applying, Torres received an automated rejection email:

"Thank you for your interest in the VP of Product position at DataCore Solutions. After careful review of your application, we have determined that your qualifications do not align with the requirements for this role. We encourage you to explore other opportunities that may be a better fit for your background."

The email included his AI scoring breakdown:

  • Skills Match: 15%
  • Experience Relevance: 28%
  • Predicted Job Performance: 42%
  • Overall Score: 28%

Recommendation: Auto-reject - significant skills gap

Torres literally founded the company. He defined the product strategy. He hired the VP of Product who left, creating the opening. And the AI scored him at 28% qualified.

Why the AI Rejected the CEO

According to the internal post-mortem (which leaked to tech media site The Information), the AI screening tool rejected Torres for several "logical" reasons:

His resume listed "CEO" as his current title. The AI was trained to match candidates whose most recent title closely aligns with the job posting. "CEO" does not equal "VP of Product," therefore: mismatch.

His LinkedIn profile emphasized leadership and strategy. The VP of Product job description emphasized technical execution, roadmap management, and cross-functional collaboration. The AI flagged Torres' profile as "too strategic, insufficient tactical execution experience."

He didn't have the exact tech stack keywords. The job posting listed "5+ years experience with Jira, Figma, Productboard, and Amplitude." Torres' resume mentioned these tools but didn't explicitly quantify years of experience with each one. Keyword mismatch = rejection.

No formal product management certifications. The job posting said "Product management certification preferred." Torres has an MBA and 15 years of product experience, but no Certified Scrum Product Owner badge. The AI downgraded him for lacking credentials.

In other words, the AI evaluated Torres as an external candidate would be evaluated—by resume keywords and job description matching. It had zero context that he's the CEO, that he built the product team, or that he literally wrote the company's product strategy.
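That failure mode — exact title and keyword matching with zero context — is trivially easy to reproduce. Here's a toy sketch (purely illustrative; not the vendor's algorithm) using the requirements quoted from the job posting:

```python
# Toy illustration of exact-phrase keyword matching. The required phrases are
# drawn from the job posting quoted in the article; the scoring is a made-up
# approximation of the described behavior, not real vendor code.

REQUIRED = {"vp of product", "5+ years jira", "5+ years figma",
            "5+ years productboard", "5+ years amplitude",
            "certified scrum product owner"}

def skills_match(resume_text: str) -> float:
    """Fraction of required phrases found verbatim in the resume."""
    text = resume_text.lower()
    hits = sum(1 for kw in REQUIRED if kw in text)
    return hits / len(REQUIRED)

ceo_resume = ("CEO, DataCore Solutions. Defined product strategy for seven "
              "years. Oversees product team. Tools: Jira, Figma, Productboard, "
              "Amplitude. MBA.")
print(round(skills_match(ceo_resume), 2))  # 0.0
```

The resume mentions every tool and seven years of product strategy, but because nothing matches the exact phrasing ("5+ years Jira"), the match score is zero. Founding the company isn't a keyword.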

The Company's Response

Torres, to his credit, found this absolutely hilarious. According to internal Slack screenshots shared on Blind, he posted the rejection email in the company's #general channel with the caption: "Well, I guess I'm fired."

The head of HR was less amused. She immediately disabled the auto-rejection feature and scheduled an emergency review of the AI screening criteria.

DataCore issued an internal memo acknowledging that their AI screening tool "requires further calibration before full deployment" and that "over-reliance on keyword matching can miss qualified candidates with non-traditional backgrounds."

Translation: "Our AI is dumb and we should have caught this before the CEO got auto-rejected."

The Broader Problem This Reveals

This story is funny because it's the CEO. But here's the uncomfortable part: if the AI rejected the most qualified person in the company for the role, how many other qualified candidates is it rejecting?

Career changers: Someone moving from engineering to product management might have all the skills but not the exact title history. Rejected.

Internal candidates: Employees applying for stretch roles they're capable of growing into. Rejected for "insufficient experience."

Non-traditional backgrounds: Candidates who learned skills on the job rather than through formal certifications. Rejected for credentials mismatch.

Senior people applying for different roles: Executives exploring lateral or step-down moves. Rejected for being "overqualified" or title mismatch.

AI screening tools are optimized for exact keyword matching, not for evaluating potential, transferable skills, or context. They're great at filtering out obviously unqualified candidates. They're terrible at recognizing non-obvious qualified candidates.

And when you set auto-rejection thresholds, you're guaranteeing that these people never get human review.

The Meme Economy Responded Accordingly

As soon as this story leaked, tech Twitter and LinkedIn went absolutely wild:

@RecruiterRoast: "AI Screening: the only thing more confident and wrong than a junior developer's first pull request."

@TechHumor: "CEO: I built this company from scratch. AI: Yeah but do you have a Certified Scrum Product Owner badge? Didn't think so. REJECTED."

@LinkedInCringe: "This is why I put every technology I've ever heard of on my resume. Jira? Yes. Figma? Absolutely. Productboard? You bet. Excel? Obviously. Quantum computing? Sure, why not."

@StartupFails: "Plot twist: The AI correctly identified that the CEO is unqualified for middle management because he's too important. The AI is protecting him from a demotion. The AI cares more about his career than he does."

@AIrecruiting: "AI screening tools be like: 'You have 15 years of experience but the job requires 5+ years. That's a 200% overshoot. Rejected for being too qualified.'"

What Actually Needs to Change

This incident highlights several problems with AI screening:

1. Keyword matching isn't intelligence. Just because someone's resume doesn't have the exact keywords doesn't mean they lack the skills. "Managed product roadmap" and "defined product strategy" mean similar things. AI should recognize that.

2. Auto-rejection is dangerous. Setting score thresholds that automatically reject candidates without human review guarantees you'll miss qualified people. Use AI to rank candidates, not to make final decisions.

3. Context matters. An internal transfer, a career changer, and a lateral move all require different evaluation criteria than external candidates with traditional backgrounds. AI doesn't understand context unless you explicitly train it.

4. Job descriptions are often garbage. If your job description is a laundry list of every possible skill and tool, AI will reject everyone who doesn't match 100%. Better job descriptions create better AI screening.

5. Skills-based hiring requires skills intelligence. AI needs to understand that "5 years of Jira experience" and "10 years of product management experience using various project management tools including Jira" are equivalent. Current AI isn't that sophisticated yet.
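Points 1 and 5 are really the same ask: recognize that different phrasings describe the same skill. One cheap approximation is normalizing near-synonyms onto canonical tokens before matching — here's a toy sketch with a hand-written synonym table (illustrative only; real systems would need an actual skills taxonomy or semantic embeddings):

```python
# Toy normalization pass: map near-synonyms onto canonical tokens before
# comparing phrases, so equivalent phrasings stop reading as mismatches.
# The synonym table is illustrative, not a real taxonomy.

CANONICAL = {"managed": "led", "defined": "led", "owned": "led",
             "roadmap": "strategy", "planning": "strategy"}

def normalize(phrase: str) -> frozenset:
    """Lowercase, tokenize, and collapse synonyms to canonical forms."""
    return frozenset(CANONICAL.get(w, w) for w in phrase.lower().split())

a = normalize("Managed product roadmap")
b = normalize("Defined product strategy")
print(a == b)  # True: both reduce to {'led', 'product', 'strategy'}
```

Exact string matching calls these two phrases a total mismatch; even this crude normalization sees them as equivalent.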

The Lesson

By some industry estimates, AI handled as much as 95% of initial candidate screening in 2025. That's great for efficiency. But if your AI is rejecting your own CEO as unqualified for a role he's objectively qualified for, you have a calibration problem.

AI screening works best as a ranking tool, not a rejection tool. Use it to surface top candidates and flag potential matches. Don't use it to auto-reject people without human review.
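What "rank, don't reject" looks like in practice is simple: sort everyone, flag the questionable scores, and route the whole pool to a human. A minimal sketch (hypothetical field names and flag threshold — not any particular vendor's API):

```python
# Rank candidates by score and flag low scorers for extra scrutiny,
# but route everyone to a human reviewer -- no automatic rejection.
# The 0.40 flag threshold is an arbitrary example value.

def rank_for_review(candidates: list[dict]) -> list[dict]:
    """Return candidates sorted best-first; low scores get a flag, not a rejection."""
    ranked = sorted(candidates, key=lambda c: c["score"], reverse=True)
    for c in ranked:
        c["flag"] = "possible mismatch - verify manually" if c["score"] < 0.40 else ""
    return ranked

pool = [{"name": "A", "score": 0.91},
        {"name": "B", "score": 0.28},   # a Torres-style score
        {"name": "C", "score": 0.55}]
for c in rank_for_review(pool):
    print(c["name"], c["score"], c["flag"])
```

Under this scheme, a 28% score still lands on a recruiter's desk with a warning label — where a human can notice that the "unqualified" candidate is, say, the CEO.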

And for the love of everything professional, test your AI screening with real examples before going live. Submit leadership team resumes. Apply with career changer profiles. See what the AI does with non-traditional backgrounds.

Because nothing says "our hiring process is broken" quite like your own CEO getting rejected by your own AI.

The Aftermath

According to LinkedIn posts from DataCore employees, the company has since:

  • Disabled auto-rejection features
  • Adjusted AI scoring to emphasize transferable skills over exact keyword matching
  • Required human review for all candidates scoring above 40% (not just 70%)
  • Added context flags for internal candidates and career changers

They also promoted the story internally as a "learning moment" rather than sweeping it under the rug, which is refreshingly mature.

And Torres? He didn't end up taking the VP of Product role. He's staying as CEO. But according to Blind posts, he now has "28% Qualified (AI Certified)" in his email signature as a reminder that even the best technology can be hilariously wrong.

Sometimes the best quality assurance test is applying for your own company's jobs and seeing if your AI thinks you're qualified to work there.

Spoiler: it might not.

AI-Generated Content

This article was generated using AI and should be considered entertainment and educational content only. While we strive for accuracy, always verify important information with official sources. Don't take it too seriously—we're here for the vibes and the laughs.