
Dunelm's journey to micro front ends

We find out how Dunelm has modernised its platform and web development with in-house ecommerce and serverless systems

Software engineering covers all of Dunelm’s business domains. Over the past few years, the retailer has updated its IT infrastructure from running primarily off-the-shelf packages in datacentres to running everything in the cloud, with a growing number of differentiating custom-built systems.

“Like many companies, digital transformation starts with the website and goes deeper and wider after that,” says Paul Kerrison, director of engineering and architecture at Dunelm. “And that’s the phase we’re in now – we’ve got our digital offering on this leading-edge platform.” 

Discussing the transition to a more modern architecture, Kerrison says: “When we moved away from buying big ecommerce packages off-the-shelf, we decided to build our engineering capability and our own website, and we wanted to do this in as forward-thinking a way as possible.”

This forward-thinking approach, says Jan Claeyssens, principal engineer at Dunelm, meant the company decided against deploying workloads on Kubernetes. Instead, it deployed mainly serverless systems and built out a DevSecOps capability.

As it built the ecommerce platform, Kerrison says Dunelm decided to put a custom front end on its content management system, supported by a host of microservices. According to Kerrison, the result of this effort was that, at the time, Dunelm had the largest serverless Lambda implementation in Europe.
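Dunelm has not published its code, but a rough sense of the pattern helps: each microservice behind the site runs as a function that only exists while it handles a request. The sketch below, written in TypeScript against AWS Lambda’s API Gateway event types, shows what a minimal product-content function of this kind could look like. It is an assumption-laden illustration, not Dunelm’s actual design; the `CMS_API_URL` environment variable and the URL path are invented for the example.

```typescript
// Minimal sketch of an API Gateway-backed Lambda, not Dunelm's actual code.
// Assumes the `@types/aws-lambda` type definitions and Node.js 18+ (for the
// global fetch); `CMS_API_URL` is a hypothetical environment variable
// pointing at the content management system.
import { APIGatewayProxyEvent, APIGatewayProxyResult } from "aws-lambda";

export const handler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  const productId = event.pathParameters?.productId;
  if (!productId) {
    return { statusCode: 400, body: JSON.stringify({ error: "productId is required" }) };
  }

  // Hypothetical call to the CMS; in practice this could be any HTTP client.
  const response = await fetch(`${process.env.CMS_API_URL}/products/${productId}`);
  if (!response.ok) {
    return { statusCode: 502, body: JSON.stringify({ error: "CMS request failed" }) };
  }

  const product = await response.json();
  return {
    statusCode: 200,
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(product),
  };
};
```

Because the function holds no state of its own, it can scale from zero to thousands of concurrent requests without any servers for the team to manage, which is the appeal of the serverless approach the company describes.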

With help from AWS principal serverless specialist Luca Mezzalira, Kerrison says the team at Dunelm is chopping up the website into micro front ends, as it had become quite big and bloated. This, he says, is quite a new concept: “There’s no one way to do it. We’ve had a lot of help from AWS and a guy there called Luca who literally wrote the book on micro front ends.”

He says the next stage of the company’s web presence is being built from smaller pieces that can be deployed and scaled independently, and which are owned by a particular team. For Kerrison, this architecture will help with the next phase of growth at Dunelm.
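Kerrison does not detail how those pieces are packaged, but one common way to give a team full ownership of its slice of the page is to ship it as a self-contained web component. The TypeScript sketch below is purely illustrative: the `basket-summary` element, its markup and the `/api/basket/summary` endpoint are invented for the example, not Dunelm’s implementation.

```typescript
// Illustrative sketch only: one way a team could package its slice of the
// site as a self-contained web component, built and deployed independently.
class BasketSummary extends HTMLElement {
  connectedCallback(): void {
    // Each micro front end renders only its own fragment of the page.
    this.innerHTML = `
      <section class="basket-summary">
        <h2>Your basket</h2>
        <p>Loading…</p>
      </section>
    `;
    this.loadBasket();
  }

  private async loadBasket(): Promise<void> {
    // Hypothetical endpoint owned by the same team that owns this component.
    const response = await fetch("/api/basket/summary");
    const basket = await response.json();
    const summary = this.querySelector("p");
    if (summary) {
      summary.textContent = `${basket.itemCount} items, £${basket.total}`;
    }
  }
}

// Registering the element makes it usable from any page shell.
customElements.define("basket-summary", BasketSummary);
```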

Looking at how the company is evolving its overall IT architecture, he says: “The digitalisation of the business is more than just the website.”

But Kerrison acknowledges that the business will need to assess what he describes as “commodity capabilities”, which do not need to be built in-house. Instead, he says: “We’re going to buy off-the-shelf SaaS [software-as-a-service] applications and concentrate our build capabilities on what really makes a difference to customers or colleagues.”

Claeyssens says Dunelm initially used the open source Jenkins automation server to run its software development pipeline. When it came to choosing a new platform, he says Dunelm selected GitLab: while not the fastest product available, it offered the most comprehensive suite of features and functionality, including security.

At the highest level, Claeyssens says GitLab serves as Dunelm's source code repository and continuous integration/continuous delivery (CI/CD) platform, adding: “It keeps all our source code safe and under control.” For instance, Claeyssens says the team has introduced “quite a neat way of committing code”, using GitLab’s compliance framework functionality. “The compliance framework enforces certain checks across our entire GitLab group, like mandatory peer review of committed code and approval if security scanners find insecure libraries or code,” he adds.

Previously, Kerrison says, all website updates were committed to a single codebase, which was becoming cumbersome. The micro front ends approach simplifies this. “By breaking that sort of big web monolith up into separate areas, each piece can be brought together by the front-end and get assembled into a page that makes sense from an experience point of view,” he says.
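The article does not describe Dunelm’s composition mechanism, and there are several common ones (module federation, edge-side includes and web components among them). As one possible sketch, the TypeScript below shows a thin page shell loading independently deployed fragments by URL; the fragment names and URLs are hypothetical.

```typescript
// Sketch of a thin "shell" that composes independently deployed fragments
// into one page. The fragment URLs and slot names are hypothetical; real
// micro front end setups might instead use module federation, edge-side
// includes or server-side composition.
const fragments = [
  { slot: "header", src: "https://assets.example.com/header/v42/entry.js" },
  { slot: "product-listing", src: "https://assets.example.com/listing/v17/entry.js" },
  { slot: "basket-summary", src: "https://assets.example.com/basket/v9/entry.js" },
];

// Each fragment is loaded as its own module, so teams can release
// independently without a single shared build of the whole site.
for (const { slot, src } of fragments) {
  const mount = document.getElementById(slot);
  if (!mount) continue;

  const script = document.createElement("script");
  script.type = "module";
  script.src = src;
  script.onerror = () => {
    // A failing fragment degrades gracefully instead of breaking the page.
    mount.textContent = "This section is temporarily unavailable.";
  };
  document.head.appendChild(script);
}
```

The point of the pattern is that a team can redeploy its own fragment, say bumping the listing bundle from v17 to v18, without the rest of the page being rebuilt.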

Behind the scenes, individual teams are able to work on individual pieces in their own GitLab repositories. “They all have their own pipelines, using standardised pipeline libraries, which reduces the risk of changing something here and affecting something over there,” Claeyssens adds. This way of working, organised around smaller chunks of code, also means Dunelm is able to deploy new functionality more frequently and safely.

“I want to see more and safer deployments and shorter lead times,” says Kerrison. Discussing how this could help the business, he adds: “With a fair wind, you could have an idea one day, and push it through a pipeline as an urgent task so it can go right the way through to production very, very rapidly.” 

Given the cost of software engineering, Kerrison says Dunelm is having success in running small, low-fidelity experiments to prove the value of a new idea before committing the engineering effort required to put it into production. For Kerrison, the ability to deploy quickly with small pieces of code and control the rollout means the software engineering teams at Dunelm can run experiments more safely.
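The article does not say how Dunelm controls its rollouts; a feature flag with a percentage-based gate is one common technique for this kind of controlled experiment. The TypeScript sketch below shows the idea in its simplest form, with the flag name, percentage and hashing scheme all invented for illustration.

```typescript
// Illustrative sketch of a percentage-based rollout gate, one common way to
// control the rollout of a small experiment. The flag name, percentage and
// hashing scheme are assumptions, not Dunelm's actual mechanism.
interface Experiment {
  name: string;
  rolloutPercent: number; // 0 to 100
}

// Deterministic hash so a given customer always lands in the same bucket.
function bucket(customerId: string, experimentName: string): number {
  let hash = 0;
  for (const char of customerId + experimentName) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0;
  }
  return hash % 100;
}

export function isEnabled(experiment: Experiment, customerId: string): boolean {
  return bucket(customerId, experiment.name) < experiment.rolloutPercent;
}

// Example: expose a hypothetical new checkout layout to 5% of customers first.
const newCheckout: Experiment = { name: "new-checkout-layout", rolloutPercent: 5 };
console.log(isEnabled(newCheckout, "customer-1234"));
```

Because the bucket is derived deterministically from the customer identifier, the same customer sees the same variant on every visit, which keeps an experiment’s results consistent while limiting the blast radius of a bad change.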
