How to Show Impact as a Data Analyst
Because “delivered insights” won’t get you hired. Here’s how to prove your work actually changed something.
Today we’re looking at how to show impact as a data analyst.
Before that, a quick note: based on feedback, I’ll be making changes to the round-up section of the newsletter. Going forward, I’ll focus more on longer-form pieces where you can learn something - less news, more depth. I’ll share more about this next week.
But for now, let’s tackle an issue most analysts eventually face - especially when job-hunting:
How do you show that your work actually made a difference?
Proving impact
There’s a special kind of paralysis that hits when you're updating your resume and you reach the “what impact did I have?” part.
You scroll through a year’s worth of work: dashboards built, ad hoc requests answered, maybe a tidy churn analysis. And you start asking yourself:
Did any of this actually matter? Did anything change because of my analysis?
The problem is not a lack of skill. In many cases, the difficulty lies in how data work is structured within organisations: fragmented, reactive and often removed from the moment of decision. Analysts are asked to surface insights but are rarely given the feedback to understand what, if anything, was done with them.
This is the central tension: the business wants impact but the system often fails to track it.
A familiar frustration
Consider the typical scenario. An analyst investigates a decline in customer retention. They discover a pattern - let’s say a sharp drop in engagement shortly after onboarding. They present the findings to the product team, who seem interested. There is no outright rejection, no disagreement. Just a polite thank you and the usual refrain: “this is useful.”
Weeks pass. The analyst moves on to another project. There is no clear follow-up, no definitive action taken. And when performance metrics shift - up or down - there is no visibility into whether the change had anything to do with the original work.
Fast-forward to job application season. The analyst tries to convert the churn analysis into a resume bullet. But without a clear outcome, the line reads flat:
“Analysed customer churn across key segments.”
The result? A resume that lists outputs, not outcomes; tasks, not influence.
Why it happens
There are structural reasons for this disconnect between analysis and action. In most organisations:
Analysts hand off insights but do not implement changes themselves.
Product and business teams may take action but without recording which analysis informed the decision.
Attribution is rarely tracked outside of marketing spend.
There is often no control group, no baseline and no mechanism for measuring the effect of a specific analytical recommendation.
The result is that impact becomes anecdotal. Analysts might have a gut feeling their work is useful but struggle to prove it.
What counts as impact?
Not all forms of impact are headline-grabbing. You do not need to have driven a million-dollar increase in revenue or redesigned a core product flow. What matters is establishing a credible link between analysis and change. That phrase is worth repeating: a credible link between analysis and change.
That link might look like:
A product decision influenced by data
A broken process flagged and corrected
A test refined to avoid invalid results
A recurring decision made faster with fewer meetings (God, wouldn’t that be amazing!)
You do not need full ownership of the outcome. Influence is sufficient, but you need to document it.
How to track it
The question is not only “what did you do?” but “what happened next?” This is rarely part of formal analytics workflows. But it can - and should - be a personal habit.
1. Maintain a decision log
Track key pieces of work in a single place:
What question was asked
What was recommended
Who received it
What happened next (if known)
This does not need to be comprehensive. But over time, it becomes an invaluable source of resume examples, interview stories and performance reviews.
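If it helps to make this concrete, here is a minimal sketch of a decision log kept as a plain CSV and updated from a tiny Python script. Everything in it - the file name, the columns, the example entry - is illustrative, not a prescribed format; a spreadsheet or a notes doc works just as well.

```python
# Minimal sketch of a personal decision log kept as a CSV file.
# The file name, columns and example entry are illustrative, not a standard.
import csv
from datetime import date
from pathlib import Path

LOG_PATH = Path("decision_log.csv")
FIELDS = ["date", "question", "recommendation", "audience", "outcome"]

def log_decision(question, recommendation, audience, outcome="unknown"):
    """Append one piece of work to the log, writing headers on first use."""
    is_new = not LOG_PATH.exists()
    with LOG_PATH.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new:
            writer.writeheader()
        writer.writerow({
            "date": date.today().isoformat(),
            "question": question,
            "recommendation": recommendation,
            "audience": audience,
            "outcome": outcome,  # come back and update this when you learn more
        })

# Example entry, based on the churn scenario above (details are made up):
log_decision(
    question="Why is retention dropping?",
    recommendation="Rework onboarding to address the post-onboarding engagement drop",
    audience="Product team",
)
```

The point is not the tooling - it’s that the “outcome” column exists at all, and that you come back to fill it in.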
2. Revisit past work
Set a reminder for yourself to revisit one or two major analyses. Was the recommendation implemented? Was the work used?
Even if nothing changed, this is data. It helps build judgment about which types of work tend to influence outcomes and which do not.
3. Ask follow-up questions
A simple message to a stakeholder - “Did we end up testing that idea?” - often yields surprisingly clear answers. Analysts are usually more removed from implementation than they realise.
4. Create before vs after snapshots
When recommending a change, capture a simple view of the “before” state: current metrics, workflows or conditions. Later, when the organisation shifts course, you’ll have a clear frame of reference for what changed and how your work fit into the timeline.
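Again, this can be as lightweight as a screenshot or a saved query result. If you prefer something scriptable, here is one possible sketch - the metric names, values and folder layout below are placeholders, not a recommended schema:

```python
# Minimal sketch of a "before" snapshot: dump the metrics you care about to a
# timestamped JSON file before a recommended change ships. Metric names and
# values below are placeholders.
import json
from datetime import datetime, timezone
from pathlib import Path

def snapshot_metrics(metrics, label):
    """Write metrics to snapshots/<label>_<timestamp>.json and return the path."""
    out_dir = Path("snapshots")
    out_dir.mkdir(exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    path = out_dir / f"{label}_{stamp}.json"
    path.write_text(json.dumps(metrics, indent=2))
    return path

# Example: capture the pre-change onboarding metrics (values illustrative).
snapshot_metrics(
    {"week1_engagement_rate": 0.42, "retention_30d": 0.61},
    label="onboarding_before",
)
```

Run it once before the change and once after, and the “what happened next?” question mostly answers itself.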
Writing about it clearly
The difference between an average and a strong resume often comes down to whether the work is described in terms of input or outcome.
Consider the following examples, reusing the churn analysis from earlier:
Input: “Analysed customer churn across key segments.”
Outcome: “Analysed customer churn across key segments; surfaced a post-onboarding engagement drop that shaped the product team’s onboarding changes.”
Even when metrics are unavailable, outcomes can be framed in terms of:
Time saved
Risks avoided
Decisions accelerated
The key is to connect the analysis to something the business cares about.
Preparing for interviews
Hiring managers love this question:
Tell me about a time your work influenced a decision.
It’s not a trap. But it does require more than a technical walkthrough or a dashboard tour. Saying “I built a churn dashboard for the product team” isn’t the answer; it’s the setup for the answer. The strongest answers follow a clear structure, like the STAR format:
Situation - What business problem were you trying to solve?
Task - What was your role in addressing it?
Action - What did you analyse and what did you recommend?
Result - What changed? Was your insight used? Did it lead to a better decision?
Even small effects like faster decisions, better prioritisation or fewer wasted cycles carry weight when they’re clearly explained. The key is connecting your work to a business effect. Practice a few examples in this format so you’re not winging it mid-interview. That 90-second story is often what gets you the offer.
Why it matters
Even if you’re not updating your resume, tracking impact is worth doing. It’s how analysts stop being just data support and start becoming trusted partners. When you get in the habit of following your work through to the outcome - what changed, what didn’t, what landed - you start building better judgment. You learn which projects actually matter. You get better at communicating with stakeholders. And over time, your work gets more influence because people know you’re thinking beyond the query.
And yes, it makes interviews and resume writing 10x easier. But even inside your current role, knowing the real-world effect of your work is what separates the folks who answer questions from the ones who shape decisions.
The quiet advantage
If you’re keeping track of that work, even just for yourself, you’re building something useful: a record of where you’ve made a difference.
So when someone asks what you’ve done that mattered, you won’t be scrambling to remember or filling the silence with buzzwords.
You’ll have real answers, drawn from real outcomes.
If you are already subscribed and enjoyed the article, please give it a like and/or share it with others - really appreciate it 🙏
Keep learning
The Alignment Trap: When Stakeholders Want Data but Not the Truth
When “can you pull the data?” really means “can you back up my story?”
This post dives into the Alignment Trap and how analysts can escape it.
When the Ratio Lies: The Denominator Problem Explained
20% conversion rate? Great. But 20% of what?
This one's about the Denominator Problem, because context changes everything.