From Process to Performance: Making Metrics Work for You
How often do you review your KPIs? And how often do they surface real issues?

By Stefani Markov and Orlin Markov

Stefani Markov - Lean Six Sigma Black Belt, PMP, MOS: Expert (Microsoft); Founder & CEO
Orlin Markov - Lean Six Sigma Black Belt, PMP
2/3/26, 7:00 AM
Completion rates look good. Approvals are on time. And yet, the same issues appear again next month. Everything is green, yet when you ask what actually happened, the story is rarely just "OK".
If a metric moves and nothing changes in how we act, it isn’t managing performance. It’s just confirming that a step was completed.
We stare at the same Key Performance Indicators (KPIs) week after week, month after month, until they become little more than static numbers on a dashboard. Somewhere along the line, the why gets lost in the routine of the what.
Like many of us, I periodically find myself in this very situation. I’m so accustomed to preparing and reviewing a specific set of metrics that I forget the fundamental reasons for their existence. This is why I’ve learned to take a deliberate step back and ask a crucial set of questions:
What is the purpose of this metric?
Why do I measure it?
How does this KPI help us achieve our goals?
If it moves tomorrow, do we know exactly what to do differently?
The answers to these questions are what transform simple data points into drivers of performance. This shift - from process to performance - happens when metrics are deliberately linked to strategic objectives and decision-making, a linkage strongly supported by methodologies like Lean Six Sigma and the discipline of SMART goal setting.
The Lean Six Sigma Lens: Purpose Before Measurement in RtR
Lean Six Sigma (LSS) focuses on minimizing waste while reducing variation and defects in business processes. At its core is a simple principle: metrics must reflect value, not just activity.
That is why LSS starts with Define before Measure. Before selecting KPIs, thresholds, or dashboards, we must first reconfirm the actual purpose of the process or control we are measuring. Otherwise, we risk optimising the measurement rather than the outcome.
In the Record-to-Report (RtR) cycle, this question is especially important. What is the goal we are truly trying to achieve?
Is it to maintain a clean and correct balance sheet?
Is it to surface and resolve old or unclear items?
Is it to reduce future close effort and audit risk?
Is it to demonstrate that reconciliations were 100% completed and approved on time?
These objectives are not the same - and they do not require the same metrics.
Timely approval and completion rates are often treated as proxies for quality. But a reconciliation can be approved quickly and still not be meaningfully reviewed. A signature confirms that a step occurred; it does not confirm that professional judgement was applied. When metrics reward speed and completion alone, they can unintentionally encourage tick-the-box behaviour rather than critical review.
From a Lean perspective, this is another form of waste: effort spent proving compliance, while the control’s real purpose - detecting, understanding, and preventing misstatement - is weakened.
Reconfirming the goal allows us to design metrics that reinforce the right behaviour. If the objective is resolution, metrics should surface persistence and recurrence. If the objective is balance sheet integrity, metrics should be risk-weighted and focused on materiality. And if the objective includes review quality, metrics must distinguish between approval and actual challenge. If a metric mainly proves how well documentation was prepared or how fast something was signed off - but not whether an issue was understood or resolved - it is measuring effort, not performance.
💡 Example 1: The Journal Entry (JE) Process
Consider the process of inputting and approving Journal Entries.
Wrong Metric (Process Focus):
Total number of JEs processed per week
This metric incentivises volume over quality. A high JE count may indicate upstream process issues, poor automation, or manual workarounds. It tells us something is happening, but nothing about whether it’s happening well.
Lean Six Sigma Metric (Performance / Value Focus):
Percentage of JEs requiring rework or correction
Cycle time for high-risk JEs (submission to posting)
Rework percentage exposes defects and variation. Cycle time links JE quality to a broader RtR outcome: Days to Close. Together, these metrics shift focus from activity to value - and they influence behaviour accordingly.
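These two measures are straightforward to compute from a JE log. A minimal Python sketch, assuming hypothetical record fields (`high_risk`, `submitted`, `posted`, `reworked`) that your ERP export may name differently:

```python
from datetime import datetime

# Hypothetical JE records; field names are illustrative, not from any specific system.
journal_entries = [
    {"id": "JE-001", "high_risk": True,  "submitted": datetime(2026, 1, 2), "posted": datetime(2026, 1, 3), "reworked": False},
    {"id": "JE-002", "high_risk": True,  "submitted": datetime(2026, 1, 2), "posted": datetime(2026, 1, 6), "reworked": True},
    {"id": "JE-003", "high_risk": False, "submitted": datetime(2026, 1, 4), "posted": datetime(2026, 1, 4), "reworked": False},
]

def rework_rate(entries):
    """Share of JEs that needed rework or correction (a defect signal)."""
    return sum(e["reworked"] for e in entries) / len(entries)

def avg_high_risk_cycle_days(entries):
    """Mean submission-to-posting time for high-risk JEs, in days."""
    high_risk = [e for e in entries if e["high_risk"]]
    return sum((e["posted"] - e["submitted"]).days for e in high_risk) / len(high_risk)

print(f"Rework rate: {rework_rate(journal_entries):.0%}")
print(f"High-risk cycle time: {avg_high_risk_cycle_days(journal_entries):.1f} days")
```

Even this simple split - defects on one axis, cycle time on the other - makes it visible when volume is hiding quality problems.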
💡 Example 2: Balance Sheet Reconciliations - When Lagging Metrics Still Matter
Balance sheet reconciliations are inherently after-the-fact controls. By definition, they confirm what has already posted. This often leads to the assumption that reconciliation metrics can only ever be lagging - and therefore limited in value.
The real risk with reconciliations is not that they are lagging, but that they are easy to make look clean without actually resolving the issue. Items can be netted, reclassified, parked, or rolled forward, giving the appearance of control while the underlying problem persists.
Trying to prevent this through highly detailed tracking often turns measurement itself into waste.
Traditional Metrics (Process / Compliance Focus)
% of reconciliations completed by deadline
% of reconciliations approved on time
Number of overdue or missing reconciliations
These metrics answer one question: Was the control performed?
They do not tell us whether issues were challenged, understood, or resolved. A reconciliation can be approved on time every month and still carry the same aged items indefinitely.
Performance-Oriented Reconciliation Metrics (Without Heavy Analysis)
Rather than trying to prove that every reconciling item was “truly cleaned,” effective metrics focus on signals that make superficial fixes and weak review visible.
Persistence of Aged Items
% of aged reconciling balances that remain unchanged month-over-month. Persistent items indicate that problems are being moved rather than resolved.
Account-Level Recurrence
% of accounts with reconciling differences across consecutive periods. Most recurring issues sit in the same accounts. Measuring at this level applies the 80/20 rule without item-level analysis.
First-Time Clean Reconciliation Rate (Leading Indicator)
% of accounts reconciled with no reconciling items on first submission. Although reconciliations are lagging controls, this metric reflects upstream quality and predicts future effort.
Reappearance After Clearance
% of cleared reconciling balances that reappear within one or two periods. Items that return often point to temporary fixes rather than real resolution.
Review Outcome Indicator
% of reconciliations where review resulted in a question, correction, or follow-up. A review that never challenges anything is unlikely to be a review at all.
These metrics do not require line-item forensics or extensive commentary. They are designed to trigger investigation and corrective action - not to demonstrate perfection.
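Three of these signals need nothing more than two month-end snapshots. A minimal Python sketch, assuming a hypothetical data shape (account number mapped to a set of open reconciling item IDs, with IDs stable across months):

```python
# Hypothetical month-end snapshots: account -> set of open reconciling item IDs.
# Item IDs are assumed stable across months, so persistence is just set overlap.
prior   = {"1100": {"A", "B"}, "2400": {"C"}, "3100": set()}
current = {"1100": {"A", "B"}, "2400": {"D"}, "3100": set()}

def persistence_rate(prior, current):
    """% of prior-month reconciling items still open this month."""
    prior_items = set().union(*prior.values())
    still_open = prior_items & set().union(*current.values())
    return len(still_open) / len(prior_items) if prior_items else 0.0

def recurring_accounts(prior, current):
    """Accounts with reconciling differences in both consecutive periods."""
    return [a for a in current if prior.get(a) and current[a]]

def first_time_clean_rate(current):
    """% of accounts with no reconciling items on first submission."""
    return sum(1 for items in current.values() if not items) / len(current)
```

In this toy data, account 1100 carries the same two items both months (persistence), accounts 1100 and 2400 both recur, and only one of three accounts is first-time clean. No item-level commentary is needed to surface any of that.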
When Measurement Becomes the Waste - Again
Balance sheet reconciliations are especially vulnerable to over-measurement: excessive attributes, manual aging schedules, long explanations, and detailed trackers that no one actively uses.
From a Lean Six Sigma perspective, this is a warning sign. Metrics should enable better decisions, not create defensive documentation. In practice, simple, directionally correct indicators are far more effective than perfectly engineered KPIs that consume time but fail to change outcomes.
The goal isn’t measurement purity. It’s better decisions and upstream fixes.
Setting the Target: Metrics Driven by SMART Goals in Finance
A KPI only becomes meaningful when anchored to a clearly defined goal. This is where the SMART framework becomes essential.
When you ask, “How does this KPI help me achieve the set goals?”, you are testing whether the metric is both Relevant and Measurable - and whether movement in the KPI leads to a specific response.
This also highlights the importance of distinguishing between:
Lagging indicators (e.g. post-close reconciling items), and
Leading indicators (e.g. first-time clean reconciliations or JE rework rates).
If all your KPIs are lagging, you’re managing history - not performance.
Before adding or refining a KPI, ask:
If this metric moves tomorrow, who decides what to do - and what decision does it trigger?
If the answer isn’t clear, the problem isn’t the target. It’s the metric.
The Takeaway: From Data Viewer to Performance Driver
To move metrics from static reporting into performance management:
Define the Value: Start with what truly matters - timely, accurate, and compliant reporting.
Align to SMART Goals: Ensure every KPI tracks progress toward a clear outcome, not just activity.
Design the Decision Path: A KPI should have predefined thresholds, responses, and escalation.
Use Metrics to Control, Not Just Report: As in the Control phase of LSS, metrics should help prevent repeat issues - not just document them.
Assign Ownership and Review Cadence: Without clear ownership and a defined review rhythm, even the best KPIs fade into background noise.
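A decision path can be as simple as a rule table: each KPI carries a threshold, an owner, and a predefined response, so a breach always triggers a named action. A minimal Python sketch; the KPI names, thresholds, owners, and responses below are illustrative assumptions, not recommendations:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class KpiRule:
    name: str
    breached: Callable[[float], bool]  # returns True when the reading needs action
    owner: str
    response: str

# Hypothetical decision paths - thresholds and owners would come from your own goals.
rules = [
    KpiRule("JE rework rate", lambda v: v > 0.05, "GL Manager", "Root-cause top rework drivers"),
    KpiRule("Aged-item persistence", lambda v: v > 0.30, "RtR Lead", "Escalate accounts unchanged 3+ months"),
    KpiRule("First-time clean rate", lambda v: v < 0.80, "Process Owner", "Review upstream posting quality"),
]

def evaluate(readings: dict) -> list:
    """Return (owner, response) pairs for every breached KPI."""
    return [(r.owner, r.response) for r in rules if r.breached(readings[r.name])]

actions = evaluate({"JE rework rate": 0.08,
                    "Aged-item persistence": 0.20,
                    "First-time clean rate": 0.85})
```

The point is not the code but the contract it encodes: if no rule exists for a KPI, the KPI has no decision path - and probably should not be on the dashboard.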
By challenging why you measure what you measure - and how those metrics influence behaviour - you ensure that your RtR KPIs don’t just confirm compliance, but actively drive better performance, lower risk, and more sustainable processes.
Ready? Let's talk.