How to lay solid data foundations for AI
This is a guest blog post by Jamie Hutton, CTO of Quantexa.
A Salesforce study has found that around 33% of executives view AI as overhyped. But this scepticism may be misdirected. At Quantexa, we are finding that the real challenge isn't with AI itself, but with the foundation on which it thrives: data. Without addressing fundamental data issues, even the most advanced AI technologies cannot deliver on their transformative promise.
Moving from demystification to operational success and growth
The odds are stacked against CTIOs, CTOs, CDOs, and Chief Architects across public and private organisations. They’re under pressure from management, the board, civil servants and/or politicians to deliver tangible business outcomes but are understandably focused on getting the fundamentals right: system architecture, data unification, data quality, and more. There’s a temptation to wait until these challenges are solved before deploying AI and decision intelligence systems. But data will never be perfect, and substantial value is being lost by waiting.
Organisations need to think beyond traditional data management approaches. Modern data strategies require unifying disparate information, putting it in proper context, and enabling better decision-making through technologies like entity resolution, knowledge graphs, generative AI, and machine learning systems that improve over time.
Without trusted data foundations, even the most sophisticated AI systems will falter. This challenge is particularly acute in the public sector, where organisations work with outdated systems and siloed information while balancing the safety and security of citizens, all under immense pressure to deliver on time and on budget.
Many organisations don’t trust data-led decisions because they lack confidence in their data sources. Information silos, duplicate records, inconsistent formatting, and disconnected systems create a fragmented view that prevents accurate analysis. According to Gartner, poor data quality costs organisations an average of $12.9m annually, with decision-making being the most impacted.
Joining real world data together
The fundamental challenge is that most organisations cannot effectively join their data into a trusted foundation accessible by different parts of the business or departments. Across industries, organisations accumulate vast amounts of structured and unstructured data, both internally and externally, all referring to real-world people, businesses, and places. The key is developing the ability to resolve these references into a complete, unified view.
Entity resolution addresses this challenge across sectors. When the same person appears under different names, nicknames, abbreviations, or due to data entry errors, traditional systems often create multiple profiles. Entity resolution techniques identify them as a single individual, enabling more accurate analysis and pattern recognition.
Unlocking data value with entity resolution
Unifying data with entity resolution is industry-agnostic. It’s as suitable for helping a global bank, such as our customer HSBC, to understand its customers and counterparties as it is for a government creating digital identities for its citizens. At the core of both ambitions is the problem of eliminating duplicate records and combining multiple records for the same entity.
By resolving these varying references to specific entities, and bringing in contextual data to understand the connections between them, AI tools can help public and private sector organisations identify patterns that are invisible to traditional analysis.
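To make the idea concrete, here is a minimal sketch of batch entity resolution: records whose names match fuzzily and whose dates of birth match exactly are clustered into one entity via union-find. The records, the similarity threshold, and the matching rule are all invented for illustration; production systems like Quantexa's use far richer matching than this.

```python
from difflib import SequenceMatcher

# Hypothetical records: the same person under a nickname, a spelling
# variant, and a data-entry error.
records = [
    {"id": 1, "name": "Jonathan Smith", "dob": "1980-04-12"},
    {"id": 2, "name": "Jon Smith",      "dob": "1980-04-12"},
    {"id": 3, "name": "Jonathon Smyth", "dob": "1980-04-12"},
    {"id": 4, "name": "Maria Garcia",   "dob": "1975-09-30"},
]

def similar(a, b, threshold=0.75):
    """Crude name similarity; real systems use much richer features."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Union-find structure to cluster records that match pairwise.
parent = {r["id"]: r["id"] for r in records}

def find(x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path compression
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

for i, r1 in enumerate(records):
    for r2 in records[i + 1:]:
        # Match on fuzzy name similarity plus an exact date of birth.
        if r1["dob"] == r2["dob"] and similar(r1["name"], r2["name"]):
            union(r1["id"], r2["id"])

clusters = {}
for r in records:
    clusters.setdefault(find(r["id"]), []).append(r["id"])

print(sorted(clusters.values()))  # records 1-3 resolve to one entity
```

Note that union-find makes the matching transitive: even if two spellings are too far apart to match each other directly, they still resolve to one entity if each matches a common variant.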
Unlike traditional master data management approaches, entity resolution doesn’t require a data transformation exercise to force information into a specific format, and it can run continuously. For example, if an existing customer signs up for a new product, or a patient registers with a new GP practice, an organisation without continuous entity resolution may treat them as a new customer rather than correctly linking them to their existing profile.
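The continuous variant can be sketched as resolution at write time: each incoming record is matched against existing profiles as it arrives, rather than being deduplicated by a later batch job. The profile store, matching rule (exact normalised name plus date of birth), and identifiers below are all illustrative assumptions.

```python
# Hypothetical profile store, keyed by an internal entity id.
profiles = {
    "E1": {"names": {"Jonathan Smith", "Jon Smith"}, "dob": "1980-04-12"},
}

def normalise(name):
    """Lower-case and collapse whitespace before comparing names."""
    return " ".join(name.lower().split())

def resolve(record):
    """Return the id of a matching profile, or create a new one."""
    for pid, profile in profiles.items():
        same_dob = record["dob"] == profile["dob"]
        known_name = normalise(record["name"]) in {
            normalise(n) for n in profile["names"]
        }
        if same_dob and known_name:
            profile["names"].add(record["name"])  # enrich the profile
            return pid
    pid = f"E{len(profiles) + 1}"
    profiles[pid] = {"names": {record["name"]}, "dob": record["dob"]}
    return pid

# An existing customer signing up for a new product is linked to their
# existing profile, not duplicated as a new entity.
assert resolve({"name": "jon smith", "dob": "1980-04-12"}) == "E1"
# A genuinely new person creates a new profile.
assert resolve({"name": "Maria Garcia", "dob": "1975-09-30"}) == "E2"
```

The design point is when resolution happens, not how: the same matching logic run continuously keeps the unified view current, instead of letting duplicates accumulate between batch runs.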
Context is everything
Increasingly, digital leaders are under pressure to explain and justify their decisions. No decision stands on its own, which is why decisions should be evaluated in context, beyond the scope of any single individual or event.
Using our work in financial services as an example, contextual monitoring allows organisations to secure a wider, enriched view of the customer associated with any given transaction. A bank’s monitoring system may flag a transaction above £50,000 because it represents a change in behaviour. However, understanding the context around this payment may reveal that it’s a deposit for a house purchase, which dramatically changes the risk profile.
With contextual monitoring, the bank can see connections that reveal crucial information: the source of funds, the businesses they are associated with, any negative news mentions, or links to other people on a watch list. This capability helps build a solid data foundation from which enterprises can deliver more detailed analysis and ultimately improve decision-making.
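A toy sketch of this idea: a simple threshold rule flags a large transaction, then connected context from a small entity graph adjusts the risk either way. All of the entities, labels, hop depth, and score weights below are invented for illustration, not Quantexa's actual model.

```python
THRESHOLD = 50_000  # illustrative large-payment rule

# Adjacency list: each entity maps to its directly connected context.
graph = {
    "customer:alice":   {"account:alice", "solicitor:firm_x"},
    "account:alice":    {"txn:1"},
    "solicitor:firm_x": {"flag:conveyancing"},
    "customer:bob":     {"account:bob", "watchlist:sanctions"},
    "account:bob":      {"txn:2"},
}

def neighbourhood(node, depth=2):
    """Collect everything within `depth` hops of a node."""
    seen, frontier = {node}, {node}
    for _ in range(depth):
        frontier = {n for f in frontier for n in graph.get(f, ())} - seen
        seen |= frontier
    return seen

def risk(customer, amount):
    # Naive rule: a large payment alone scores 1.0.
    score = 1.0 if amount > THRESHOLD else 0.0
    context = neighbourhood(customer)
    if "flag:conveyancing" in context:
        score *= 0.2  # likely a house deposit: de-escalate
    if any(n.startswith("watchlist:") for n in context):
        score += 2.0  # link to a watch list: escalate
    return score

print(risk("customer:alice", 60_000))  # large payment, benign context
print(risk("customer:bob", 60_000))    # large payment, watch-list link
```

The same £60,000 payment scores very differently for the two customers, which is the point: the rule alone cannot separate a house deposit from a watch-list-linked transfer, but the surrounding graph can.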
The insights generated through contextual analysis can be shared between public and private sectors, creating a multiplier effect. For example, findings from the Cabinet Office’s COVID bounce back loan fraud investigation can benefit the banking sector and international counterparts pursuing similar objectives with their pandemic finance schemes.
This collaborative approach embodies the cross-economy collaboration championed in the UK’s AI Opportunities Action Plan, demonstrating how shared data foundations amplify benefits across sectors. Another example is HMRC’s “unique customer record” project, which creates a comprehensive view of taxpayers by resolving entity information across numerous systems and datasets.
For the UK to realise its AI ambitions, both public and private sectors must prioritise building robust data foundations. Entity resolution, contextual intelligence, and graph analytics provide the critical infrastructure needed to overcome data quality challenges without requiring massive system replacements.
With these foundations in place, AI can deliver on its promise of transformation and growth, moving beyond the hype to generate tangible value for organisations and citizens alike.
Jamie Hutton is a co-founder and the CTO of UK “decision intelligence” company Quantexa. He is also an advisor to the government’s AI Opportunities Action Plan.