
Your Excel test is measuring the wrong thing

Written by
Oliver Savill

Picture two candidates for the same analyst role. One scores 85% on your Excel test and can recite XLOOKUP syntax in their sleep. The other scores 60%, but spots patterns in a dataset faster than anyone on your current team.

Hire the first, and you've hired a spreadsheet. Hire the second, and you've hired an analyst.

The instinct to screen for Excel skills is fair — data-heavy roles need people who can handle data-heavy tools. But an Excel test answers a surprisingly narrow question: does this person know this particular software, right now, at this particular moment? Judgement, reasoning, the ability to ask the right question of a dataset — all of it sits outside the test.

Excel testing in hiring: the honest picture


Ben Schwencke covers what Excel tests genuinely offer in a selection process, and what most hiring teams should be prioritising instead.

What Excel tests actually measure

Excel tests measure current familiarity with Excel. That's it. Formulas, pivot tables, formatting, the usual drill. It's a learnable skill, which means someone who's been using Excel daily for three years will beat someone who hasn't opened it since university — regardless of which of the two is the sharper thinker.

Our own Excel skills test sits at beginner-to-intermediate level, which is exactly the right pitch for the job it does: gatekeeping roles where basic proficiency is a genuine week-one requirement. It is not, and was never designed to be, a measure of analytical potential.

Where they genuinely help

There is a real case for Excel testing in the right spot. If a role demands unsupported spreadsheet work from day one, and your onboarding will not cover the basics, a screening test filters out candidates who'll struggle before they've started. These tests are cheap to run, fast to score, consistent across applicants, and they scale effortlessly to volume hiring. For the right role, that is meaningful.

Where they quietly go wrong

The Excel skills gap closes fast. A bright candidate with no Excel background reaches competence within weeks. Which leads to an uncomfortable implication: your test is probably filtering out people on the basis of a gap that would have disappeared before their first performance review.

Graduate and early-careers candidates get hit hardest. High cognitive ability, limited workplace software exposure — exactly the profile an Excel test punishes for the wrong reasons.

There is also a ceiling effect. Once past a basic threshold of proficiency, Excel scores stop predicting anything interesting about analytical performance. The gap between 70% and 90% tells you almost nothing about who will make better decisions with the data underneath.

And then there's Copilot

Microsoft 365's AI tools now write formulas, clean datasets, and draft pivot tables from a plain-language prompt. The routine Excel work that used to require deliberate skill is being automated at speed.

What AI cannot do is interrogate the data. It cannot tell you which question to ask, whether an answer makes sense, or what any of it means for the business. That judgement layer is exactly what cognitive ability tests measure. Excel is not becoming irrelevant — but the hierarchy is shifting. Reasoning is rising. Tool fluency is not.

What actually predicts performance

The research here is clear. Cognitive ability is the single strongest predictor of job performance, and for data-heavy roles, numerical reasoning is the specific ability that matters most. It measures how well a candidate interprets quantitative information, spots patterns, and draws accurate conclusions under pressure. That is the skill that separates a strong analyst from a mediocre one — and it has nothing to do with whether they can freeze a pane.

Candidates who score well on numerical reasoning pick up Excel quickly. They also pick up whatever replaces Excel, and whatever comes after that. You are not hiring for familiarity with a software package; you are hiring for the capacity to do the analytical work.

| Assessment | What it measures | Best suited for |
| --- | --- | --- |
| Excel skills test | Current Excel knowledge | Roles needing day-one unsupported Excel use |
| Numerical reasoning test | Analytical thinking and data interpretation | Any role involving analysis, reporting, or commercial judgement |
| Inductive reasoning test | Pattern recognition and abstract problem-solving | Early careers, technical and STEM roles |

The better question

If you're running an Excel test as your main screen, you're asking: can they use Excel right now? That is a useful question for a narrow set of roles. For most data-heavy hires, the question worth answering is: can they think analytically, learn quickly, and make good decisions from data?

Numerical reasoning tests answer it. They are better predictors of performance, harder to coach for, and fairer across candidates at different stages of their careers. Pair one with an Excel test, or replace the Excel test entirely, and you will get a sharper read on who will actually thrive once they are in the role.

The Excel gap closes fast. The reasoning gap does not.

Primary author

Oliver Savill

Director and Founder of Test Partnership. Over 10 years experience in the psychometric testing industry.