About this case: At the client’s request, company name and specific figures have been anonymized. Given the sensitivity of AI in recruitment and their market position, they do not want to make their approach fully public.
Tens of thousands of CVs per year, and no time to read
The operations director of this recruitment agency, specialized in tech profiles, faced a familiar problem. “We get on average more than a hundred applications per vacancy. Of those, maybe fifteen are relevant. But to find those fifteen, you have to look at all of them.”
Her team of recruiters collectively spent dozens of hours per week on initial screening. “It’s work no one enjoys, but where you can’t afford to make mistakes. Missing one good candidate can cost a client.”
The solution seemed obvious: AI. “Every week two vendors would call. ‘Our AI screens CVs in seconds.’ ‘Save 80% of your time.’ It sounded perfect.”
But when she asked follow-up questions, a different story emerged.
“How do we know it doesn’t discriminate?”
At the first vendor demo, she asked the question she would repeat at every subsequent demo: “How do we know your AI doesn’t discriminate on age, gender, or background?”
“The salesperson started stammering about ‘advanced algorithms’ and ‘trained on millions of CVs’. But he couldn’t explain how the system reached a decision. And he certainly couldn’t explain what would happen if a rejected candidate asked why.”
This wasn’t a theoretical problem. The GDPR gives applicants the right to know how automated decisions about them are made. And equal treatment legislation prohibits discrimination in recruitment and selection—even if an algorithm does it.
“Our legal department was clear: if we can’t explain why someone was rejected, we can’t use the tool.”
“Every AI vendor promised 80% time savings. None could explain how we would stay compliant.”
The question behind the question
When this agency came to us, we didn’t start with technology. We started with the question: what do you actually want to solve?
The answer was more nuanced than “screen CVs with AI”:
- Time savings, but not at the expense of quality
- Consistency in how CVs are evaluated
- Explainability for candidates and for the legal department
- Control remaining with recruiters—AI as tool, not as replacement
“You asked questions the vendors didn’t ask. Not ‘which AI tool do you want?’ but ‘what are you actually allowed to automate?’”
Suggestions instead of decisions
The core of our solution: the AI doesn’t make decisions. The AI makes suggestions that recruiters must confirm.
How it works:
- CV comes in and is automatically parsed
- AI compares the CV with the job requirements and gives a match score
- Each score includes an explanation: “Match on Python (5 years experience requested, 7 years found)” or “Mismatch on location (remote requested, only on-site available)”
- Recruiter sees the suggestion and decides themselves to proceed or reject
- Every decision is logged with reason
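The steps above can be sketched in a few lines. This is an illustrative reconstruction, not the production code: the field names, the scoring rule (fraction of requirements met), and the years-of-experience representation are all assumptions.

```python
from dataclasses import dataclass

@dataclass
class CriterionResult:
    criterion: str
    matched: bool
    explanation: str  # human-readable reason shown to the recruiter

@dataclass
class Suggestion:
    candidate_id: str
    score: float
    reasons: list

def score_cv(candidate_id, cv_skills, requirements):
    """Compare parsed CV data against explicit job requirements.

    `cv_skills` maps skill -> years of experience found in the CV;
    `requirements` maps skill -> minimum years requested.
    (Illustrative schema, not the production parser output.)
    """
    reasons = []
    hits = 0
    for skill, min_years in requirements.items():
        years = cv_skills.get(skill)
        if years is not None and years >= min_years:
            hits += 1
            reasons.append(CriterionResult(skill, True,
                f"Match on {skill} ({min_years} years experience requested, "
                f"{years} years found)"))
        else:
            found = f"{years} years found" if years is not None else "skill not found"
            reasons.append(CriterionResult(skill, False,
                f"Mismatch on {skill} ({min_years} years experience requested, "
                f"{found})"))
    score = hits / len(requirements) if requirements else 0.0
    return Suggestion(candidate_id, score, reasons)

def rank_candidates(suggestions):
    # Crucial: sort only, never filter. Low scorers go to the bottom
    # of the list, but every CV still reaches a recruiter.
    return sorted(suggestions, key=lambda s: s.score, reverse=True)
```

The deliberate design choice sits in `rank_candidates`: there is no threshold anywhere, so the system cannot reject a candidate on its own.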
The crucial difference: the AI never automatically rejects. Even a CV with a low score reaches the recruiter—just at the bottom of the list.
“This was the breakthrough. Our lawyers accepted it because we could demonstrate that a human always makes the final decision. And we could explain every rejection.”
“The AI doesn’t say ‘no’. The AI says ‘maybe look at this one last, and here’s why’.”
No black box, but a glass box
Transparency was a hard requirement. We built the system so that every step is explainable:
Scoring based on explicit criteria
No mysterious “fit percentage”; instead, concrete matches on skills, experience, and availability. The criteria come directly from the job description.
No training on historical data
Many AI recruitment tools are trained on who was hired in the past, which bakes existing bias into the model. Our system compares only against the stated requirements, not against who “typically” gets hired.
Audit trail for every candidate
For every applicant you can see what score the CV received, the explanation that went with it, who reviewed it, and what the final decision was. Exactly what a GDPR access request requires.
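The audit trail can be as simple as an append-only table. A minimal in-memory sketch, where a Python list stands in for the PostgreSQL table and the field names are illustrative:

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class AuditEntry:
    candidate_id: str
    score: float
    explanation: str   # the reasoning shown to the recruiter
    reviewed_by: str   # the recruiter who made the final call
    decision: str      # "proceed" or "reject"
    decided_at: str    # ISO 8601 timestamp, UTC

AUDIT_LOG = []  # stands in for the PostgreSQL table

def log_decision(candidate_id, score, explanation, reviewed_by, decision):
    entry = AuditEntry(candidate_id, score, explanation, reviewed_by,
                       decision, datetime.now(timezone.utc).isoformat())
    AUDIT_LOG.append(entry)
    return entry

def gdpr_export(candidate_id):
    """Everything we can show a candidate who asks 'why was I rejected?'"""
    return json.dumps([asdict(e) for e in AUDIT_LOG
                       if e.candidate_id == candidate_id], indent=2)
```

Because every entry records both the AI's explanation and the human reviewer, a GDPR request becomes a single query rather than a scramble.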
Regular bias checks
We periodically analyze the data: are certain groups systematically scored lower? Do recruiters pass them through more or less often? The reports go to HR and legal.
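A periodic bias check like the one described could look like the sketch below. The record layout and the 80% threshold (a common rule of thumb for adverse-impact screening) are assumptions; a production check would also test statistical significance.

```python
from statistics import mean

def bias_report(records):
    """Group-wise score and pass-through statistics.

    `records` is a list of dicts with keys 'group', 'score', and
    'proceeded' (illustrative field names, not a production schema).
    """
    by_group = {}
    for r in records:
        by_group.setdefault(r["group"], []).append(r)
    report = {}
    for group, rows in by_group.items():
        report[group] = {
            "avg_score": mean(row["score"] for row in rows),
            "pass_rate": sum(row["proceeded"] for row in rows) / len(rows),
        }
    # Four-fifths rule of thumb: flag any group whose pass-through
    # rate falls below 80% of the best-performing group's rate.
    top = max(g["pass_rate"] for g in report.values())
    for stats in report.values():
        stats["flagged"] = top > 0 and stats["pass_rate"] < 0.8 * top
    return report
```

A flagged group does not prove discrimination, but it tells HR and legal exactly where to look.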
The tech under the hood
For the technically interested—this is what we built:
- FastAPI backend for fast CV processing and the APIs to their ATS
- LangChain for structured prompts that generate explainable output
- OpenAI API (GPT-4) for the actual CV analysis
- PostgreSQL for the audit trail and reports
- React frontend integrated into their existing recruiter dashboard
- AWS hosting with data in EU region (GDPR requirement)
The integration with their existing Applicant Tracking System meant recruiters didn’t have to switch tools. The AI suggestions just appear in their familiar interface.
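The "structured prompts that generate explainable output" boil down to one enforcement rule: the model may not return a score without reasons. A minimal sketch of that validation step, with an illustrative JSON schema rather than the production format:

```python
import json

# Fields the model must return for every criterion. A bare "fit
# percentage" with no reasons is rejected. (Illustrative schema.)
REQUIRED_FIELDS = {"criterion", "matched", "explanation"}

def parse_model_output(raw):
    """Validate the LLM's JSON so every score ships with its reasons."""
    data = json.loads(raw)
    reasons = data.get("reasons", [])
    if not reasons:
        raise ValueError("score without explanation is not allowed")
    for item in reasons:
        missing = REQUIRED_FIELDS - item.keys()
        if missing:
            raise ValueError(f"unexplained output, missing: {sorted(missing)}")
    return data
```

Rejecting unexplained output at the parsing layer means a compliance failure shows up as a loud error in development, not as a silent black-box score in production.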
Results
The results speak for themselves, even with the figures anonymized:
- Significant time savings on initial screening per recruiter
- Shorter time-to-hire thanks to faster initial responses
- Passed external audit by a specialized firm
- Adoption by partners in their network who wanted the same solution
But what they’re most proud of: “We haven’t had a single complaint from a candidate about how they were evaluated. Because we can explain it.”
“This is AI as it should be: it makes us better at our job, not obsolete. And we can sleep at night because we know it’s fair.”
Lessons learned
Compliance first, technology second
Don’t start with “which AI tool do we buy?” but with “what are we allowed to automate?” That question determines the solution direction.

Explainability is not a nice-to-have
In recruitment it’s a hard requirement. But it holds in other domains too: if you can’t explain why the AI does something, you can’t justify it.

AI as assistant, not as decision-maker
Keeping the human in the loop is not only ethically right; it also makes adoption easier. Recruiters felt helped, not replaced.

Bias is everywhere, even in “neutral” data
Training on historical hiring data reproduces historical bias. Sometimes the best AI solution is one that doesn’t learn from the past.

Integration beats a standalone tool
Nobody wants an extra tool. The AI suggestions appear in the system recruiters already use. That made adoption self-evident.