Funnies

AI Video Interview Freezes Mid-Answer and Scores Candidate 'Poor Communication'

December 22, 2025
3 min read

One-way video interviews are already awkward. You're talking to a camera, answering pre-recorded questions, hoping you come across as professional and competent while essentially having a conversation with yourself.

Now imagine doing that while the video freezes mid-answer, and the AI evaluating your responses gives you a failing score for "poor communication skills" because all it analyzed was a frozen frame of you, mid-word, looking confused.

That's what happened to a candidate applying for a customer service role, and the screenshots of the AI feedback are absolutely wild.

How One-Way Video Interviews Work

Companies use one-way asynchronous video interviews for high-volume screening. Candidates record responses to standardized questions on their own time. Recruiters review recordings later, or AI analyzes responses automatically.

AI-powered video interview platforms can analyze speech patterns, evaluate responses, score candidates, and predict job fit. The technology is supposed to make screening faster and more objective.

The technology is definitely not supposed to freeze, analyze frozen video, and reject candidates because the software broke.

But here we are.

What Actually Happened

The candidate started the video interview on their laptop. Question one: "Tell me about a time you handled a difficult customer situation." They began answering confidently, describing a specific scenario, demonstrating problem-solving and empathy.

Thirty seconds into their response, the video froze. The recording interface showed "Processing..." but never resumed. After two minutes of staring at a frozen screen, the candidate refreshed the page.

The platform showed "Interview completed" and wouldn't let them re-record the response. "Completed? I only answered half of one question. How is that completed?"

They contacted support. No response. They emailed the company's recruiting team. Auto-reply: "Thank you for your application. We'll be in touch if you're selected to move forward."

Three days later, they received a rejection email with AI-generated feedback:

"Communication Skills: 2/10 - Candidate demonstrated poor verbal communication with long pauses, incomplete responses, and difficulty articulating thoughts clearly. Eye contact was inconsistent. Overall presentation lacked professionalism."

The candidate was baffled. Long pauses? They'd answered for 30 seconds before the freeze. Incomplete responses? The platform froze mid-answer. Inconsistent eye contact? They were staring directly at the camera the entire time until the video stopped working.

The AI's "Analysis"

AI video interview platforms analyze responses in real time, evaluating speech patterns, word choice, facial expressions, and body language. When the platform froze, the AI apparently analyzed whatever video it had: 30 seconds of answer plus two minutes of frozen frame.

To the AI, the candidate's response looked like this:

  • Spoke for 30 seconds
  • Paused for 2 minutes (actually: video froze)
  • Provided no further response (actually: couldn't because software broke)
  • Maintained unnaturally still posture (actually: frozen video frame)
  • Fixed facial expression suggesting disengagement (actually: still image captured mid-word)

The AI doesn't understand technical failures. It sees data patterns and evaluates them according to its training. A 2-minute pause in a video interview looks like poor communication skills—even if that pause was caused by the platform malfunctioning.
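To see how a freeze turns into a bad score, here is a deliberately naive scoring sketch. Everything in it is illustrative, not any real vendor's logic: it assumes the platform reduces a response to seconds of speech and seconds of silence, then penalizes the silence without ever asking why it happened.

```python
# Hypothetical sketch of a naive video-interview scorer. The function name,
# inputs, and scoring formula are all illustrative assumptions, not a real
# platform's API.

def naive_communication_score(speech_seconds: float, pause_seconds: float) -> int:
    """Score 0-10; long pauses are penalized with no check for why they occurred."""
    if speech_seconds == 0:
        return 0
    ratio = speech_seconds / (speech_seconds + pause_seconds)
    return round(ratio * 10)

# The candidate's broken recording: 30 seconds of speech,
# then 120 seconds of frozen frame counted as a "pause".
score = naive_communication_score(30, 120)
print(score)  # 2
```

Under this toy formula, the broken recording lands on exactly the kind of 2/10 the candidate received, not because they communicated poorly, but because a frozen frame is indistinguishable from silence to a scorer that never checks for malfunctions.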

The Candidate's Reaction

The candidate posted the rejection email on LinkedIn with the caption: "I got rejected from a job because the company's video interview software froze and the AI thought my frozen video was me being a bad communicator. I can't make this up."

The post went viral. Hundreds of comments from candidates sharing similar stories:

  • "Same thing happened to me. Platform crashed, AI gave me 0/10 for 'failing to complete the interview.'"
  • "My video had audio issues. AI scored me poorly for 'mumbling and unclear speech' when the problem was their software."
  • "Platform froze on the last question. AI rejected me for 'incomplete responses.' I literally couldn't finish because the software broke."

Companies trust AI to evaluate candidate quality, but when the AI can't distinguish between candidate performance and technical failures, that trust is misplaced.

The Company's Non-Response

The candidate tried contacting the company again, this time with screenshots showing the video freeze and the AI feedback. "Your platform malfunctioned. Your AI rejected me for communication problems caused by your software breaking. I'd like to redo the interview."

Radio silence for a week. Then a generic response: "Thank you for bringing this to our attention. Unfortunately, we've moved forward with other candidates. We wish you the best in your job search."

No acknowledgment of the technical failure. No apology. No offer to redo the interview. Just a standard rejection confirming that the AI's flawed evaluation stood.

When AI makes hiring decisions and companies don't review them, technical failures become candidate rejections. Nobody caught the error because nobody was reviewing AI-generated feedback before sending rejections.

How This Keeps Happening

Video interview platforms can have bugs, crashes, and technical issues. Candidates applying from phones, using different browsers, or dealing with internet connectivity problems experience failures regularly.

The problem is what happens next. When AI analyzes failed recordings without detecting technical issues, it generates misleading evaluations. Those evaluations get sent to candidates as official feedback, and companies treat them as legitimate assessment results.

Better platforms detect technical failures and flag them for human review. They identify when video freezes, audio drops out, or recordings fail to complete properly. Those candidates get invited to redo interviews instead of getting rejected for problems caused by software.

Cheaper platforms don't have these safeguards. They analyze whatever video exists, technical failures and all, and generate scores that companies use to make hiring decisions.

The Vendor's Response

The video interview platform vendor was tagged in the viral LinkedIn post. Their response: "We take candidate experience seriously. Our platform includes technical checks to ensure recording quality. If candidates experience issues, we recommend they contact support immediately."

The candidate replied: "I did contact support. Nobody responded. Your platform showed 'Interview completed' when I'd answered one question for 30 seconds. How is that a completed interview?"

No further response from the vendor.

Companies using these platforms often don't know how frequently technical failures occur, because vendors don't proactively report malfunction rates, and candidates who hit problems often just withdraw from the process instead of complaining.

What Should Have Happened

The platform should have detected the video freeze and prompted the candidate to re-record. When recordings fail to meet minimum quality thresholds, candidates should automatically get retry options.

The AI should have flagged the incomplete response for human review instead of automatically scoring it. Video interviews that are unusually short, have extended pauses, or show technical anomalies should trigger manual review.
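The freeze detection described above doesn't require anything exotic. Here is a minimal sketch, assuming frames arrive as raw bytes at a known frame rate; real platforms would work on decoded video with a fuzzier similarity threshold, so the representation and the 5-second cutoff here are illustrative assumptions.

```python
# Minimal freeze-detection sketch. Frame format, fps, and threshold are
# illustrative assumptions, not any real platform's implementation.

def longest_frozen_run(frames: list[bytes], fps: int = 30) -> float:
    """Return the longest stretch (in seconds) of byte-identical consecutive frames."""
    longest = current = 0
    for prev, cur in zip(frames, frames[1:]):
        current = current + 1 if cur == prev else 0
        longest = max(longest, current)
    return longest / fps

def needs_human_review(frames: list[bytes], max_frozen_seconds: float = 5.0) -> bool:
    """Flag recordings with long frozen stretches instead of auto-scoring them."""
    return longest_frozen_run(frames) > max_frozen_seconds

# Two seconds of changing frames, then ten seconds stuck on one frame:
live = [bytes([i % 256]) for i in range(60)]
frozen = [b"\x00"] * 300
print(needs_human_review(live + frozen))  # True
```

The point isn't the specific threshold; it's that a recording flagged this way goes to a recruiter (or triggers an automatic re-record offer) instead of being scored as a two-minute awkward silence.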

The company should have reviewed AI-generated feedback before sending rejection emails. Automated evaluations claiming candidates demonstrated "poor communication" should be spot-checked by recruiters.

None of that happened, because to the companies buying these tools, automation is only valuable if it eliminates human involvement, and introducing human review "defeats the purpose" of AI screening.

Except when AI evaluates broken video recordings and rejects qualified candidates, maybe human review is exactly what's needed.

The Broader Implications

74% of recruiters use video interviewing tools, resulting in an average 30% reduction in hiring time. Those efficiency gains are real—when the technology works properly.

When it doesn't, candidates get rejected for technical failures beyond their control, companies lose qualified applicants, and everyone blames the AI.

Complex application processes and buggy video interfaces lose candidates. You might save recruiting time with AI screening but lose your top candidates because the experience sucked.

The candidate who got rejected for communication problems caused by frozen video? They're now employed at a competitor who conducted normal phone screens instead of AI video interviews. The company using buggy AI screening lost a qualified candidate to their competition because they trusted automation too much and reviewed it too little.

The Lesson

If you're using AI video interview platforms:

  1. Test them thoroughly with real technical scenarios (poor internet, mobile devices, different browsers)
  2. Build in technical failure detection so broken recordings get flagged, not scored
  3. Review AI-generated feedback before sending rejections, especially extreme scores
  4. Provide easy ways for candidates to report technical issues and redo failed interviews

Or, radical idea: use AI video interviews for initial screening but don't let them make final rejection decisions without human review.

Until then, candidates will keep getting rejected for "poor communication" that was actually frozen video, and companies will keep wondering why their AI screening isn't finding good candidates.

The AI can't tell the difference between a bad answer and a broken video. Maybe that's something humans should still be checking.


AI-Generated Content

This article was generated using AI and should be considered entertainment and educational content only. While we strive for accuracy, always verify important information with official sources. Don't take it too seriously—we're here for the vibes and the laughs.