Is poor quality data jeopardising my analytics investment?

19th May 2020

Local Authorities and public service providers continue to invest heavily in analytical tools, but poor data is preventing the promised returns. An-Chan Phung, Chief Innovation Officer, Civica MDM, suggests master data management is the answer.

According to a recent report by the Guardian, over one third of local authorities have already invested in predictive algorithms to support better decisions in their revenues and benefits processes in the face of Welfare Reform.

As citizens' finances are squeezed by increasing household costs and rising unemployment during the COVID-19 pandemic, the ability to produce trusted, actionable insights that maximise incomes, mitigate fraud and enable focussed investment in frontline services with demonstrable outcomes is more critical than ever. Even today, the potential financial benefits of such insights are clear:

  • Council tax debt currently stands at £3 billion
    Driving the need for a complete and accessible ‘single view of debt’, with accurate contact records maintained to focus recovery efforts and mitigate future debt
  • Fraud costs local government £2.1 billion per year
    With trusted insights on housing, single person discount (SPD) and blue badge abuse offering the most significant opportunities to drive recoveries and ensure prevention
  • Late interventions cost the taxpayer £17 billion per year
    Taking an ‘invest to save’ approach, using insights to ensure on-time interventions with proven outcomes, reducing costs downstream.

However, despite their sizeable investments and the scale of the potential reward, many have found the results of their predictive models to be unreliable and their reports untrustworthy; business adoption has been underwhelming, and the investments have failed to deliver the forecast returns.

Rather than blame the BI/predictive tools – often the easy scapegoat – many organisations are calling out a more fundamental issue: the data simply isn’t complete or accurate enough to drive reliable results. At Hackney Council, “issues of variable data quality meant that the system wasn’t able to provide sufficiently useful insights”.

When seeking to make critical decisions that could dramatically impact a citizen’s health and welfare, taking inappropriate action on unreliable analytical outputs can have life-changing consequences. Inaccuracies in the data often prevent the entire output from being actionable and lead to a lack of confidence in the data assets.

Is my data undermining the value of my analytics deployment?

With many organisations feeling they’ve fallen foul of the age-old adage “garbage in = garbage out”, what are the common symptoms that poor data is the root cause of a faltering analytics deployment?

  • Results from predictive models are unreliable – trusted predictive insights rely on quality data. Feed them anything other than accurate, comprehensive records and you can expect low levels of confidence in the results, especially for propensity models, and a lot more time spent talking about results than acting on them
  • Reports are incomplete and aren’t trusted – riddled with duplicates, quality and consistency issues, and often missing vital information we know we hold
  • Information is difficult to understand - it doesn’t conform to a common business terminology
  • Reporting is inconsistent across departments and teams – we’re not able to effectively communicate and collaborate
  • Misalignment between strategy and execution – the exec has a strategy, management have a KPI pack, operations have system reports, data scientists are building predictive models… but they’re not aligned and driving towards the same goal.

How can I fix my data and rescue my analytics deployment?

Thankfully it’s not too late: if poor-quality data is jeopardising your analytics investments, then Master Data Management (MDM) could be the key to quickly getting back on track.

Drilling into the issues in the data, below are some of the most commonly cited challenges that MDM can directly help you address:

Resolving duplicates: Accurately matching and merging records using robust and trusted algorithms (including synonym, phonetic and misplaced value matching).
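
As a very simplified illustration of the kind of matching involved, the Python sketch below flags candidate duplicates using exact, phonetic and fuzzy surname comparisons. The Soundex-style encoder, field names and similarity threshold are illustrative assumptions, not Civica’s actual matching algorithms.

```python
from difflib import SequenceMatcher

def soundex(name: str) -> str:
    """Tiny Soundex-style phonetic encoder, used to catch spelling variants."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4", **dict.fromkeys("mn", "5"), "r": "6"}
    letters = "".join(ch for ch in name.lower() if ch.isalpha())
    if not letters:
        return ""
    encoded, previous = letters[0].upper(), codes.get(letters[0], "")
    for ch in letters[1:]:
        code = codes.get(ch, "")
        if code and code != previous:
            encoded += code
        previous = code
    return (encoded + "000")[:4]

def likely_duplicates(a: dict, b: dict, threshold: float = 0.85) -> bool:
    """Flag two person records as candidate duplicates for merge review."""
    x, y = a["surname"].lower(), b["surname"].lower()
    if x == y:                                   # exact match
        return True
    if soundex(x) == soundex(y):                 # phonetic match, e.g. Smith ~ Smyth
        return True
    return SequenceMatcher(None, x, y).ratio() >= threshold   # fuzzy match for typos

print(likely_duplicates({"surname": "Smith"}, {"surname": "Smyth"}))   # True
```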

Delivering more complete records: Integrating fragmented records from all data sources (e.g. Council Tax, Social Care, Housing, CRM...) to create one complete record for each individual.
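
The sketch below shows, in simplified form, what consolidating fragments from several source systems can look like once records have been matched to a shared identifier; the source and field names are illustrative assumptions.

```python
from collections import defaultdict

# Fragments of the same person held in different line-of-business systems,
# already matched to a shared person_id (systems and fields are illustrative).
council_tax = [{"person_id": 101, "name": "J. Smith", "address": "1 High St"}]
social_care = [{"person_id": 101, "name": "Jane Smith", "date_of_birth": "1984-03-02"}]
housing     = [{"person_id": 101, "phone": "020 7946 0000"}]

consolidated = defaultdict(dict)
for source in (council_tax, social_care, housing):
    for record in source:
        person = consolidated[record["person_id"]]
        for field, value in record.items():
            person.setdefault(field, value)   # keep the first value seen for now

print(dict(consolidated[101]))
# {'person_id': 101, 'name': 'J. Smith', 'address': '1 High St',
#  'date_of_birth': '1984-03-02', 'phone': '020 7946 0000'}
```

Which value should win when two sources disagree – here simply the first seen – is the job of the survivorship rules described further down.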

Identifying and addressing data quality issues: common examples include the following, illustrated in the sketch after this list:

  • Invalid data entries: That £1m repair job somebody logged, or that set of records whose last names look like telephone numbers
  • Placeholder values: It’s surprising how many people are born on 1/1/1900 or whose telephone number is 01010101010
  • Incorrectly held data: E.g. those NI numbers that probably should have been removed or secured
  • Discrepancies: It’s possible, but unlikely, that there are many people whose first name is “Thomas” and title is “Miss”; it’s also unlikely someone “under 20” would have an “@aol.com” email address.
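
As a rough illustration of how checks like these can be automated, the sketch below applies a handful of simple validation rules; the field names, placeholder lists and rules are examples only.

```python
import re
from datetime import date

PLACEHOLDER_DOBS = {date(1900, 1, 1)}
PLACEHOLDER_PHONES = {"01010101010", "00000000000"}

def quality_issues(record: dict) -> list:
    """Return a list of data quality flags for a single person record."""
    issues = []
    if record.get("date_of_birth") in PLACEHOLDER_DOBS:
        issues.append("placeholder date of birth")
    if record.get("phone") in PLACEHOLDER_PHONES:
        issues.append("placeholder telephone number")
    if re.fullmatch(r"[\d\s]+", record.get("surname", "x")):
        issues.append("surname looks like a telephone number")
    if record.get("title") == "Miss" and record.get("forename") == "Thomas":
        issues.append("title/forename discrepancy")
    return issues

print(quality_issues({"forename": "Thomas", "title": "Miss", "surname": "01632 960983",
                      "date_of_birth": date(1900, 1, 1), "phone": "01010101010"}))
```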

Improving record accuracy and completeness: Having addressed data quality issues (see above), Survivorship Rules intelligently build a person’s Golden Record from the most trusted and up-to-date sources.
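
A minimal sketch of the idea, assuming an illustrative trust ranking of sources: for each attribute, the value from the most trusted, most recently updated fragment survives into the Golden Record.

```python
SOURCE_TRUST = {"Social Care": 3, "Council Tax": 2, "CRM": 1}   # illustrative ranking

fragments = [
    {"source": "CRM",         "updated": "2019-06-01", "phone": "020 7946 0000"},
    {"source": "Council Tax", "updated": "2020-02-14", "phone": "020 7946 0857", "address": "1 High St"},
    {"source": "Social Care", "updated": "2020-04-30", "address": "Flat 2, 1 High Street"},
]

def golden_record(fragments: list) -> dict:
    """Build one record per person, taking each field from the best-ranked fragment."""
    golden = {}
    ranked = sorted(fragments, key=lambda f: (SOURCE_TRUST[f["source"]], f["updated"]), reverse=True)
    for fragment in ranked:
        for field, value in fragment.items():
            if field not in ("source", "updated"):
                golden.setdefault(field, value)
    return golden

print(golden_record(fragments))
# {'address': 'Flat 2, 1 High Street', 'phone': '020 7946 0857'}
```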

Verifying and enriching records: further increasing accuracy and completeness (see the sketch after this list) by:

  • Validating records against external sources such as the myaccount profile
  • Augmenting our view of a person or property with third-party reference sets, e.g. the NLPG or credit reference agency datasets.
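
The sketch below illustrates the enrichment idea with a tiny, made-up gazetteer extract: an address is normalised, verified against the reference set and augmented with its property identifier (UPRN). A real NLPG or credit reference agency integration would of course be far more involved.

```python
# Local extract of a property gazetteer, keyed by a crudely normalised address
# (invented for the sketch; a real NLPG lookup would be a proper service call).
GAZETTEER = {
    "1 high st anytown": {"uprn": "100023456789", "ward": "Central"},
}

def normalise(address: str) -> str:
    return " ".join(address.lower().replace(",", " ").split())

def enrich(record: dict) -> dict:
    reference = GAZETTEER.get(normalise(record["address"]))
    if reference:
        record.update(reference)    # verified: augment with UPRN and ward
    else:
        record["uprn"] = None       # unverified: flag for data steward review
    return record

print(enrich({"name": "Jane Smith", "address": "1 High St, Anytown"}))
# {'name': 'Jane Smith', 'address': '1 High St, Anytown', 'uprn': '100023456789', 'ward': 'Central'}
```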

Discovering relationships: Finding and building the relationships between records, including the “household view” that links people and locations together – critical to safeguarding and fraud prevention initiatives.
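
As a simple illustration, the sketch below groups person records that share a property identifier into households; the identifiers are carried over from the hypothetical enrichment step above.

```python
from collections import defaultdict

people = [
    {"person_id": 101, "name": "Jane Smith", "uprn": "100023456789"},
    {"person_id": 102, "name": "Tom Smith",  "uprn": "100023456789"},
    {"person_id": 103, "name": "Alex Jones", "uprn": "100023400001"},
]

households = defaultdict(list)
for person in people:
    households[person["uprn"]].append(person["name"])   # everyone sharing a property

print(dict(households))
# {'100023456789': ['Jane Smith', 'Tom Smith'], '100023400001': ['Alex Jones']}
```

A household containing more than one adult where a single person discount is claimed would, for example, be an obvious candidate for review.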

Ensuring integrity: Automating manual data processes and building the links between records to enable trusted drill-to-detail or roll-up.
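
One small, illustrative integrity check: every transaction should link back to a known golden record, otherwise it silently drops out of roll-ups and cannot be drilled into.

```python
golden_ids = {101, 102, 103}   # person_ids present in the mastered dataset

transactions = [
    {"txn_id": 1, "person_id": 101, "amount": 120.00},
    {"txn_id": 2, "person_id": 999, "amount": 85.00},   # orphaned: no matching person
]

# Orphaned records break drill-to-detail and silently drop out of roll-ups.
orphans = [t for t in transactions if t["person_id"] not in golden_ids]
print(orphans)   # [{'txn_id': 2, 'person_id': 999, 'amount': 85.0}]
```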

Driving conformity: Ensuring information conforms to agreed business standards and definitions (e.g. enforcing a single set of agreed Gender codes).
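
A minimal sketch of conformance, assuming an illustrative set of agreed codes: each system’s local values are mapped onto the single reference set before reporting.

```python
# Map each source system's local codes onto one agreed reference set (illustrative values).
GENDER_CONFORMANCE = {
    "m": "Male", "male": "Male", "1": "Male",
    "f": "Female", "female": "Female", "2": "Female",
}

def conform_gender(raw_value) -> str:
    return GENDER_CONFORMANCE.get(str(raw_value).strip().lower(), "Unknown")

print([conform_gender(v) for v in ("M", "female", 2, "")])
# ['Male', 'Female', 'Female', 'Unknown']
```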

Maintaining consistency: Synchronising changes in records across departments and service partners, making sure everyone has up-to-date information.

What does this mean for my information consumers?

MDM delivers a trusted data foundation for analytics projects, ensuring you achieve:

  • Reliable outcomes from predictive models that can be applied with confidence
  • Trusted KPI reports, built on complete and accurate information with the integrity to enable trusted drill-to-detail
  • Breaking down departmental silos and driving collaboration around a shared dataset
  • Better alignment of Exec Strategy and Operational Execution – making your organisation more agile and responsive to changing conditions
  • Ensuring information and KPIs conform to common business definitions, which are known and understood
  • Automating data integration and accelerating analytics delivery, driving time & cost savings
  • Maximising business adoption of analytical tools and ownership of the data that powers them, driving enterprise data quality improvement.

More Information:

If you feel data quality is undermining your analytics investments, we’d be pleased to health-check your data and show you what it could look like. Get in touch.

Find out how better data helped the London Borough of Ealing save £1.3m in its first year of combatting SPD fraud. Read the case study.