Why data quality breaks when accountability is unclear

26/02/2026

In many organisations, data quality is discussed as if it were primarily a technical matter. Systems, interfaces, and legacy applications are blamed, and the conversation quickly turns to tooling, architecture and data platforms. 

Robust data architecture is a prerequisite for decision making. Without proper integration, governance frameworks, data models and quality controls, reliable information cannot exist. Yet in practice, persistent data quality problems rarely originate in technology alone. They emerge where strategy, structure and ownership are unclear.  

When definitions drift away from strategy 

When numbers differ between reports, the instinct is often to ask IT to “fix the data.” But when examined closely, the issue is frequently not corruption or system failure. It is ambiguity. 

Different teams apply slightly different definitions. KPIs evolve without formal alignment. New products, services or pricing models are introduced without updating the underlying data logic. Data may be technically correct within each domain, yet inconsistent across domains. 

This is not irrational behaviour. Departments optimise locally. They adapt metrics to support their operational reality. Over time, these adjustments become embedded practice. 

The root cause is often strategic. If the organisation has not clearly defined which metrics are critical for steering the business, and how they connect to strategic priorities, definitions start drifting. Data quality deteriorates because the operating model does not anchor what truly matters. 

Data quality therefore begins with clarity about what the organisation is trying to achieve and how performance should be measured consistently across functions. 

The same metric, multiple versions 

In many companies, the same metric exists in several versions. Revenue may exclude certain items in one report and include them in another. Customer numbers are counted differently across departments. Operational performance indicators are adapted locally to reflect specific process realities. 

We worked with a manufacturing company where on-time delivery was tracked in three different ways. Logistics measured it from shipment confirmation to customer receipt. Sales referred to the original promised date. Operations tracked production completion against planning. Each number was technically correct, but none of them aligned. 

When management asked for the real on-time delivery performance, discussions stalled. The issue was not system failure. It was the absence of cross-functional alignment on which definition supported strategic decision making. 
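The divergence is easy to reproduce. The sketch below, with invented order data and illustrative field names (none taken from a real system), shows how the three definitions described above can all be computed correctly from the same records and still disagree:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical order record; field names are illustrative, not from any real system.
@dataclass
class Order:
    promised_date: date        # date originally promised to the customer (Sales view)
    planned_completion: date   # production completion target (Operations view)
    production_done: date      # actual production completion
    shipped: date              # shipment confirmation
    received: date             # customer receipt (Logistics view)

orders = [
    Order(date(2025, 3, 10), date(2025, 3, 8),  date(2025, 3, 9),
          date(2025, 3, 9),  date(2025, 3, 11)),
    Order(date(2025, 3, 20), date(2025, 3, 14), date(2025, 3, 14),
          date(2025, 3, 16), date(2025, 3, 18)),
    Order(date(2025, 3, 25), date(2025, 3, 22), date(2025, 3, 22),
          date(2025, 3, 25), date(2025, 3, 28)),
]

def otd(orders, is_on_time):
    """Share of orders counted as on time under a given definition."""
    return sum(is_on_time(o) for o in orders) / len(orders)

# Three coexisting definitions, each locally reasonable:
logistics  = otd(orders, lambda o: (o.received - o.shipped).days <= 3)
sales      = otd(orders, lambda o: o.received <= o.promised_date)
operations = otd(orders, lambda o: o.production_done <= o.planned_completion)
```

Each function is a faithful implementation of its department's rule, yet the three results differ. No amount of data cleansing reconciles them; only an agreed definition does.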

This illustrates how data architecture, operating model design and leadership accountability intersect. Without clear strategic anchoring and defined ownership, even technically correct systems produce fragmentation. 

When shared interpretation becomes critical 

The issue becomes visible when decisions depend on shared interpretation. Executive meetings are then consumed by reconciling numbers before discussing what action to take. We have seen management teams spend significant time debating whether customer growth was 4.2 percent or 5.8 percent before addressing the strategic question of whether to invest in acquisition or retention. 
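The same mechanism produces competing growth figures. The toy example below, with invented records (the numbers do not reproduce the 4.2 and 5.8 percent from the meeting described above), shows how two counting rules for "customer" yield two defensible growth rates from identical data:

```python
# Hypothetical customer records; the definitions and numbers are illustrative.
last_year = [
    {"id": 1, "paying": True}, {"id": 2, "paying": True},
    {"id": 3, "paying": False},  # trial account
]
this_year = [
    {"id": 1, "paying": True}, {"id": 2, "paying": True},
    {"id": 4, "paying": True},
    {"id": 5, "paying": False}, {"id": 6, "paying": False},
]

def growth(before, after, counts):
    """Year-on-year growth under a given counting rule."""
    b = sum(1 for c in before if counts(c))
    a = sum(1 for c in after if counts(c))
    return (a - b) / b

all_accounts = growth(last_year, this_year, lambda c: True)         # counts trials
paying_only  = growth(last_year, this_year, lambda c: c["paying"])  # paying customers only
```

Both figures are arithmetically correct; they answer different questions. Deciding which question matters for steering the business is a leadership decision, not a data extraction.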

Analysts are asked to explain discrepancies instead of exploring options. Confidence in data erodes. People rely on personal extracts and local calculations. 

At this point, the data platform is not the sole issue. The deeper problem is unclear ownership of definitions and insufficient governance linking metrics to decision forums. 

Technology can consolidate and validate data. It cannot decide which definition is strategically relevant. That requires leadership alignment. 

Data ownership is not a job title 

Many organisations formally appoint data owners. This is an important step in strengthening governance. Yet the problem rarely disappears automatically. 

Data owners are often positioned within IT or a central data organisation. They ensure technical consistency and data lineage. However, they are not always aware of the precision required for specific business decisions. 

A financial services organisation had appointed data owners for all major domains. When a pricing error caused significant financial impact, it turned out that a key product margin logic had not been updated for over a year. The data owner knew the field was outdated. What was missing was clarity about how critical that field was for pricing decisions. 

Ownership exists on paper. It must also exist in practice, embedded in decision processes. 

Data quality improves when ownership is tied to strategic relevance and reinforced by leadership behaviour. 

The Excel compensation layer 

In parallel, business teams often compensate for data limitations. Figures are adjusted in spreadsheets. Definitions are slightly modified to fit local needs. Known inconsistencies are corrected manually. 

Because these adjustments happen informally, there is limited feedback to the data platform. Errors remain in core systems. Workarounds become institutionalised. 

A controller exports ERP results monthly, corrects known issues in Excel, adds calculated fields and redistributes the file. Everyone knows this happens. No one documents it. The ERP logic remains unchanged. 

As long as operations continue smoothly, this layer of human correction masks structural weaknesses. 

Here, culture plays a decisive role. If leadership tolerates silent correction instead of structured escalation, fragmentation persists. If discrepancies are surfaced openly and connected to governance, improvement becomes possible. 

Why data quality programmes disappoint 

Large data quality initiatives often focus on cleansing, migration and tooling. These are essential investments. Strong data platforms, master data management and governance frameworks are critical foundations. 

However, without clarifying strategic definitions, decision ownership and feedback loops between business use and data governance, quality deteriorates again after the project ends. 

We have seen organisations invest heavily in data platforms and quality tools. The technology functions as designed. Six months later, inconsistencies reappear. The tools did not fail; the organisation never aligned on who decides what a customer is, who approves product hierarchy changes, or who ensures new services update the reference data. 

Aligning strategy, data and leadership 

Sustainable data quality rests on three interconnected building blocks. 

First, strategic clarity. Organisations must define which metrics are critical for steering the business and ensure that definitions are aligned across functions. 

Second, robust data capability. Data integration, architecture, governance and tooling must support consistent definitions, controlled changes and transparent lineage. 

Third, leadership ownership. Someone must be accountable not only for maintaining the system, but for ensuring that definitions remain aligned with decision needs and that discrepancies are addressed structurally rather than compensated informally. 

When these elements align, trust increases. When they remain disconnected, technology alone cannot prevent fragmentation. 

Data quality is therefore not merely a technical challenge, nor solely a cultural issue. It is an organisational design question that sits at the intersection of strategy, data governance and leadership behaviour. 

It starts with a simple but powerful question: who is responsible for ensuring that this number means what we believe it means, today and tomorrow? 

Webinar: “Why organisations struggle to decide, even when data is abundant” 

In the webinar “Why organisations struggle to decide, even when data is abundant”, we explore why dashboards, strong data platforms and advanced analytics do not automatically translate into decisive action. 

We step back from tools and examine the structural conditions required to move from information to choice. The conversation connects strategic clarity, data capability and leadership ownership, and invites reflection on why decision pressure often increases in data-rich environments. 

Register here