
AI Job Description Generator Creates Posting Requiring "Unicorn Who Walks On Water"

November 12, 2025

A San Francisco marketing startup learned an important lesson this week: just because AI can write your job description doesn't mean it should.

The company, which we'll call "DisruptAI" (because of course that's what they're called), decided to use an AI job description generator to craft a posting for a Senior Marketing Manager role.

What came out was a masterpiece of absurdity that immediately went viral on LinkedIn and Twitter.

The Greatest Hits

Let's review some of the actual requirements from this AI-generated job posting:

"Must be a unicorn who walks on water"

Apparently, the AI took common recruiting phrases like "we're looking for a unicorn" and "someone who can walk the walk" and decided to merge them literally. Mythical creature status and aquatic ambulation are now job requirements.

"5-7 years of experience with platforms that don't exist yet"

The AI listed required experience with "HyperReach 3.0" and "QuantumPost Pro"—tools that no human has ever used because they're not real. Did the AI hallucinate marketing platforms? Did it glimpse the future? We may never know.

"Ability to work in a fast-paced, slow-paced, medium-paced environment"

The AI clearly wanted to cover all its bases. The company is simultaneously fast-paced, slow-paced, and medium-paced. Schrödinger's startup culture.

"Must have MBA and also disagree with traditional business education"

This one is pure chef's kiss. You need to have an MBA but also think MBAs are useless. The AI achieved peak Silicon Valley cognitive dissonance.

"10+ years of experience in role (recent graduates encouraged to apply)"

So you need 10 years of experience, but fresh graduates should definitely apply. This is the job posting equivalent of "entry-level position requiring 5 years of experience" but somehow even more contradictory.

"Salary: Competitive (must be willing to work for exposure)"

The AI promised competitive salary in one bullet point, then immediately followed up with "payment may include equity, exposure, and the opportunity to disrupt an industry." Exposure doesn't pay rent, AI. We've covered this.

How Did This Happen?

According to sources familiar with the company, here's what went down:

The founder decided to "leverage AI to optimize recruiting workflows" (translation: save money by not hiring an actual recruiter).

They used an AI tool that promised to "generate compelling job descriptions using advanced language models trained on thousands of successful postings."

They fed the AI some bullet points about what they wanted. The AI took those inputs, combined them with whatever data it scraped from the internet, and produced... this.

The founder, in a hurry, posted it without reading it.

The job went live on LinkedIn, Indeed, and the company's careers page. Within hours, it had been screenshotted and shared across recruiting Twitter with people asking "is this real?"

It was very real.

The Fallout

The post was live for six glorious hours before someone on the team noticed their job description had gone viral for being completely unhinged.

By that time:

  • 2,000+ people had seen it on LinkedIn
  • 47 people actually applied (respect to those chaos agents)
  • The posting was shared 500+ times on Twitter with commentary ranging from "this is peak tech bro nonsense" to "I mean, I can't walk on water but I'm willing to learn"
  • Several recruiting influencers used it as an example of "what not to do"

The company pulled the posting and issued a statement calling it "an experimental approach to job descriptions that didn't land as intended."

Translation: "We let AI write our job posting without reading it and now we're embarrassed."

The Applications They Received

Because the internet is beautiful, several of the 47 applicants decided to have fun with it.

One candidate submitted a resume listing:

  • "Expert at walking on water (ocean, lake, and swimming pool experience)"
  • "Certified unicorn with horn maintenance certification"
  • "5 years experience with QuantumPost Pro (in my dreams)"

Another applicant's cover letter:

  • "While I cannot technically walk on water, I did once walk on a frozen pond, which is basically the same thing."

A third applicant attached a photo of themselves in a unicorn costume with the message: "When do I start?"

These people are heroes.

What We Can Learn From This Disaster

This story is hilarious, but it also highlights real problems with AI-generated job content:

AI Doesn't Understand Context

The AI scraped phrases like "unicorn candidate" and tried to be clever. It doesn't understand that "unicorn" is recruiting slang, not an actual requirement.

Lesson: AI tools trained on internet data will confidently produce nonsense because they don't actually understand what they're writing.

AI Optimizes For The Wrong Things

The AI was probably trained to create "engaging" and "dynamic" job postings. So it threw in buzzwords and contradictions because those things appear frequently in training data.

Lesson: AI optimizes for patterns in data, not for logic or coherence.

Humans Still Need To Read Things

The founder's biggest mistake wasn't using AI. It was trusting AI output without review. AI is a tool, not a replacement for human judgment.

Lesson: Always review AI-generated content before it goes live. Always.

Job Descriptions Actually Matter

This company treated job descriptions as an afterthought—something to automate away. But your job description is often a candidate's first impression of your company. Making that impression "we let robots write our job postings without checking them" is not ideal.

Lesson: Job descriptions are marketing materials. Treat them accordingly.

The Bigger Picture

This story is funny, but it's part of a larger trend: companies rushing to implement AI without thinking through the implications.

AI can be useful for recruiting. But it requires:

  • Proper training data
  • Human oversight
  • Understanding of limitations
  • Testing before deployment

"Upload some bullet points and let AI handle the rest" is not a strategy. It's a recipe for viral embarrassment.

Update: The Company's Response

After the story blew up, DisruptAI's founder posted an apology on LinkedIn:

"We experimented with AI tools to streamline our hiring process and learned a valuable lesson about the importance of human review. We're grateful for the community's feedback and are now taking a more thoughtful approach to recruiting."

Translation: "We got roasted online and now we're going to actually read our job postings before publishing them."

The revised job posting—written by an actual human this time—went up two days later. It was clear, specific, and refreshingly free of unicorn requirements.

They reportedly received 200+ applications for the revised posting, proving that well-written job descriptions actually work.

The Bottom Line

AI can help with job descriptions. Tools like Textio and others provide useful suggestions for improving clarity and inclusivity.

But AI isn't a substitute for human judgment, strategic thinking, or basic reading comprehension.

If you're using AI to write job descriptions:

  1. Review the output thoroughly
  2. Edit for coherence and accuracy
  3. Remove obvious nonsense (like unicorn requirements)
  4. Remember that real humans will read this
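That review step doesn't have to be fancy. As a minimal sketch (the red-flag phrases below are hypothetical, lifted from this story's greatest hits, not a real screening tool), a few lines of Python can catch the most obvious nonsense before a posting goes live:

```python
import re

# Hypothetical red-flag patterns, inspired by the DisruptAI posting.
# Illustrative only -- a real review still needs a human reading the whole thing.
RED_FLAGS = {
    "mythical creature": re.compile(r"\bunicorn\b|\bwalk(s)? on water\b", re.I),
    "contradictory experience": re.compile(
        r"\b10\+?\s*years\b.*\brecent graduates\b", re.I | re.S
    ),
    "payment in exposure": re.compile(r"\bexposure\b", re.I),
}

def red_flags(posting: str) -> list[str]:
    """Return the names of any red-flag patterns found in a job posting."""
    return [name for name, pattern in RED_FLAGS.items() if pattern.search(posting)]

posting = (
    "Must be a unicorn who walks on water. "
    "10+ years of experience in role (recent graduates encouraged to apply). "
    "Salary: competitive (must be willing to work for exposure)."
)
print(red_flags(posting))
# → ['mythical creature', 'contradictory experience', 'payment in exposure']
```

A script like this flags the howlers; it won't catch subtler incoherence, which is exactly why step 1 on the list above is non-negotiable.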

And if you're looking for a unicorn who walks on water? Good luck with that. Those candidates are pretty rare.

AI-Generated Content

This article was generated using AI and should be considered entertainment and educational content only. While we strive for accuracy, always verify important information with official sources. Don't take it too seriously—we're here for the vibes and the laughs.