
CIO interview: David Jack, chief technology and product officer, Dunnhumby

Analytics pioneer’s CTPO is working to accelerate and scale up the company’s platforms, introduce new products and migrate to the cloud

This will be a busy year for UK customer data specialist Dunnhumby as the company seeks to launch a range of new products to fend off competition while driving a large cloud computing implementation. 

The firm has a broad technology agenda, covering everything from C-level consulting to a large range of products that include applied data science tools and fully automated real-time data science platforms that handle billions of bid requests a day for retailers, consumer goods companies and firms from other sectors.

To modernise the foundations supporting its product portfolio, there are some large IT programmes under way at Dunnhumby, such as the ongoing decommissioning of datacentres and big-ticket traditional hardware that will see the firm’s analyst platforms being entirely provisioned in the cloud. 

Dunnhumby faces fierce competition in the analytics space, so it also plans to bring to market many of the products it has been working on this year. Chief technology and product officer (CTPO) David Jack joined the firm in May 2018, with a remit covering all these fronts and more – such as venture capital investment and skills partnerships with universities. 

“My first areas of attention over the last six or seven months have been about the platforms that we build our products on and use to interface to our client systems,” Jack tells Computer Weekly. “I have also been accelerating the scaling of these platforms and driving down the economic cost of operating them.

“That means we can exploit more science, innovate and execute faster, and bring more value to our clients. And that’s really exciting for us.”

Dunnhumby recognises it is up against large players with serious firepower and needs to remain at the top of customers’ minds, so it will be releasing a number of new products in 2019. According to Jack, these will be either completely new offerings or existing products delivered to markets where they are not already present.

“We have accumulated over 1,000 years of corporate memory about how to apply data insights and data modelling”

David Jack, Dunnhumby

“We have a lot of ‘almost products’ that are well executed, easy to deploy and mature from the point of view of both development and operation,” he says. 

“A number of these offerings are nearly there in terms of seeing the light of day. I’ve spent the last few months sifting through those and figuring out which ones we should be accelerating into the markets.”

Another key area of focus for Jack is to evolve Dunnhumby’s concept of global code lines, whereby computational models are rewritten in the latest technologies and then made available to customers across multiple markets. 

“This spring, we will be demonstrating a new look and feel and a new sense of what we mean when we say we are a customer data science platform,” he says. “The idea is that different product propositions feel connected and, as clients adopt one product, they will start seeing the journey to more of the products in our inventory.”

According to Jack, Dunnhumby has “the right to win” when it comes to competition – partly because of its history of nearly three decades serving some of the most demanding retailers in the world, including its parent, Tesco.

“We have accumulated over 1,000 years of corporate memory about how to apply data insights and data modelling to bringing value to retailers over our nearly 30-year history,” he says. “Very few competitors can point to that level of intellectual capital.”

Cloud evolution 

Dunnhumby has been evolving in its cloud adoption and will be moving forward with a large migration of its analytics product sets this year. 

According to Jack, the project consists of taking existing products as they are, putting wrappers around them and migrating them into the cloud, as well as developing some new products that are cloud-native. 

“The cloud migration is driven by the economic advantage, but there is also the point that in data and computational science, cloud provisioning is not just a case of sounding sexy – it is phenomenally helpful to run science models at huge scale for reduced cost,” he says. 

“It would be almost unthinkable in a traditional infrastructure, processing thousands and thousands of instances of insight at any given time, then collapsing that infrastructure when it is no longer needed.”
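The burst-and-collapse pattern Jack describes can be sketched in a few lines of Python. The snippet below is purely illustrative, not Dunnhumby’s implementation: a local worker pool stands in for ephemeral cloud instances, and run_insight() is a hypothetical placeholder for a single science-model run.

```python
# Conceptual stand-in for elastic cloud compute: fan out many independent
# "insight" computations, then release all capacity when the work is done.
# In a real deployment the pool would be ephemeral cloud instances; this
# local thread pool is illustrative only.
from concurrent.futures import ThreadPoolExecutor

def run_insight(model_id: int) -> str:
    # Placeholder for one science-model run against one slice of data.
    return f"model-{model_id}: done"

# "Spin up" a large pool and process every job in parallel...
with ThreadPoolExecutor(max_workers=64) as pool:
    results = list(pool.map(run_insight, range(1000)))
# ...then leaving the block "collapses" the pool: no idle capacity remains.
print(len(results), "insight runs completed")
```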

But in the short to medium term, the company will have to live with the fact that there will also be non-cloud clients – for example China, where Dunnhumby has its platform hosted in a Shanghai datacentre because of regulations. 

When it comes to the cloud data that the company migrates, there are three buckets. The first is data brought in from clients, which is ingested, conformed, mapped to a common data model, and then made available in Dunnhumby’s data lake.


The second bucket is providing a data analytics platform on which all the data can be used as though it conforms to one model, and which powers the insight tools that come with it. The third bucket is when all products are natively hosted in a cloud environment. 

“Considering those three blocks, the first is lifting and shifting our data, [running] ingestion processes and putting them into the cloud,” says Jack. “For a large majority of the clients we intend to migrate, we’re already there – that’s what we’ve been running in the last year.

“On the second piece, which is the related analyst tools and platforms, we run that entirely in parallel – and that’s a huge way through as well. The third piece, which is lifting our current product set and putting it into the cloud, and new products being developed for cloud delivery, that’s at a much earlier stage.”
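Taking the first of those blocks as a concrete example, the sketch below shows what ingesting and conforming a client feed to a common data model might look like. Every name here – the common fields, the CLIENT_FIELD_MAP and the file paths – is an illustrative assumption, not Dunnhumby’s actual schema or pipeline.

```python
# Minimal sketch of the "first bucket": ingest a client CSV extract,
# conform it to a hypothetical common data model, and land the result
# in a data-lake directory as JSON lines.
import csv
import json
from pathlib import Path

# Hypothetical common data model every client feed is mapped onto.
COMMON_FIELDS = ("transaction_id", "store_id", "product_id", "quantity", "unit_price")

# Per-client mapping from the client's column names to the common model.
CLIENT_FIELD_MAP = {
    "txn_ref": "transaction_id",
    "shop": "store_id",
    "sku": "product_id",
    "qty": "quantity",
    "price": "unit_price",
}

def conform_record(raw: dict) -> dict:
    """Rename client-specific columns and keep only common-model fields."""
    renamed = {CLIENT_FIELD_MAP.get(k, k): v for k, v in raw.items()}
    return {field: renamed.get(field) for field in COMMON_FIELDS}

def ingest(client_csv: Path, lake_dir: Path) -> None:
    """Read a client extract and write conformed records to the lake."""
    lake_dir.mkdir(parents=True, exist_ok=True)
    out_path = lake_dir / f"{client_csv.stem}.jsonl"
    with client_csv.open(newline="") as src, out_path.open("w") as dst:
        for raw in csv.DictReader(src):
            dst.write(json.dumps(conform_record(raw)) + "\n")

if __name__ == "__main__":
    ingest(Path("client_feed.csv"), Path("lake/transactions"))
```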

When it comes to suppliers, one of the products Dunnhumby recently acquired in North America is currently hosted on Microsoft Azure, and the company’s data platform is hosted on Google Cloud Platform. 

Another big partner is Oracle, as the firm has big Exadata setups in its traditional datacentres that power many client production systems. On the business system platforms front, the company is 75% of the way through deploying Oracle Fusion across its global estate.  

Maintaining focus 

With so much on his plate, Jack’s number one challenge is the classic task of winning the hearts and minds of the organisation. The added complexity here is that the IT chief needs to keep a demanding 1,200-strong technical team interested in the job while not having the answers to everything. 

“That is challenging, especially when it gets hard and when we’ve got to maintain our current business growing at the rate we’re growing and changing as much as we’re changing,” he says. “That’s the thing that keeps me awake at night. 

“To work around that, one of the things I immediately set about doing was to strengthen my technical leadership by injecting some additional experience and skills and having some of my amazing people focus on a smaller number of things.”

To illustrate his point of focusing on a more reduced scope, Jack cites the company’s head of data science, who now focuses on “lifting and enhancing the company’s science to a new level”, rather than covering a huge range of responsibilities. 

Focus is also the personal watchword for Jack as a leader as he tackles the challenge of managing the broad agenda of technical evolution at the data specialist company.

“Focusing on the real prize, which is the fast delivery of that 1,000 years of cumulative knowledge to our existing client base and our new clients, and not getting lost in every single challenge along the way, is the way that we will win,” says Jack.
