AI Hiring Tools Prefer AI-Written Resumes 97.6% of the Time, Creating a Hidden Screening Arms Race

A new University of Maryland study finds that AI resume screeners overwhelmingly favor resumes rewritten by ChatGPT over identical human-written versions — even when human evaluators prefer the originals. The result: a self-reinforcing loop that penalizes authentic applicants.

Send the same resume to an AI hiring screener twice — once written by a human, once rewritten by ChatGPT — and the AI will pick the ChatGPT version 97.6% of the time. That's the headline finding from a new paper out of the University of Maryland and collaborating institutions, as first highlighted by @heynavtoor in a post that went viral over the weekend. The number is so lopsided it reads like a typo. It isn't.

The study's methodology was straightforward but damning. Researchers took real resumes from job applicants, ran them through several major LLMs for rewriting, and then submitted both versions to commercial AI screening tools used by Fortune 500 companies. The AI screeners didn't just slightly prefer the polished versions — they exhibited near-total preference for machine-generated text. And the bias wasn't uniform across models: screeners showed a measurable preference for resumes rewritten by their own underlying model family, a phenomenon the researchers describe as "dialect affinity."
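The experimental setup described above is a paired-preference test: for each applicant, the screener scores both the human original and the LLM rewrite, and the researchers measure how often the rewrite wins. A minimal sketch of that loop, with a made-up stand-in `screener_score` heuristic (the real study used commercial screening tools, whose internals are not public), might look like this:

```python
# Hypothetical screener: returns a score for a resume's text.
# In the study this role was played by commercial AI screening tools;
# this keyword-and-length heuristic is purely a stand-in so the
# experimental loop itself is runnable.
def screener_score(resume_text: str) -> float:
    keywords = ("led", "optimized", "cross-functional", "stakeholders")
    hits = sum(kw in resume_text.lower() for kw in keywords)
    return len(resume_text) * 0.01 + hits

def preference_rate(pairs):
    """Fraction of (human_original, ai_rewrite) pairs where the
    screener scores the AI rewrite strictly higher."""
    wins = sum(screener_score(ai) > screener_score(human)
               for human, ai in pairs)
    return wins / len(pairs)

# Toy examples of an original resume bullet and an LLM-style rewrite.
pairs = [
    ("Managed a small team. Shipped product updates.",
     "Led a cross-functional team of engineers, optimized delivery "
     "pipelines, and shipped updates aligned with stakeholders' goals."),
    ("Wrote reports for clients.",
     "Authored data-driven reports that optimized client decision-making."),
]
print(f"AI-rewrite preference rate: {preference_rate(pairs):.1%}")
```

The study's 97.6% figure corresponds to this preference rate measured over real resume pairs and real screeners; the toy heuristic here merely illustrates why verbose, keyword-dense rewrites can dominate such a score.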
