
Why real-time data is key for enterprise AI

Moving AI from experiment to production requires high-quality, real-time data streaming. Australian tech leaders from Confluent, Bendigo Bank, Telstra, and Coles share how they are turning systems of record into systems of action

Organisations are investing heavily to harness artificial intelligence (AI) to solve business problems, but putting the technology into production presents significant data quality and availability challenges.

This is especially true for applications requiring real-time data, such as fraud detection, according to Confluent’s chief technology officer (CTO) Stephen Deasy.

Speaking to Computer Weekly on the sidelines of the Confluent Data Streaming World Tour in Melbourne, Deasy noted that organisations are moving beyond experiments and trials this year. They are taking what they have learned about AI and applying it in high-impact areas to deliver customer value, leveraging the latest streamed data to achieve their goals.

Locally, there is a move to build on existing systems of record and engagement to create “systems of action,” which require much fresher data, added Greg Taylor, senior vice-president and Asia-Pacific general manager at Confluent.

What’s needed, he said, is to capture data where it is created and process it in real time so it can inform action, for example by automating parts of the business. Some Confluent customers in the region have achieved 60 to 70% automation, although Australian organisations are generally not that advanced, he observed.

Taylor said such systems will require business experts as part of the governance structure, for example to check for AI hallucinations, but they do enable continuous improvement based on data.

Being able to react quickly to signals can increase revenue and reduce fraud, and the ability to run the Confluent platform on-premises, in the cloud, or in a hybrid environment helps customers achieve such goals.

“AI is top of mind for customers,” said Deasy, adding that Confluent helps enterprises deliver real-time data directly to their AI models.

The company’s adherence to open standards and its investment in technology and support resonate with customers, he added, and the company often works with CTOs and systems architects to help accelerate the implementation process.

Technology and performance improvements for various workloads are coming, so Confluent can support what people are doing today as well as what they will be doing in the future, said Deasy.

Taylor observed that customers traditionally rely on vendors for new features and capabilities, but the big improvements in software engineering brought by generative AI mean they can potentially create their own features, and this gives them greater negotiating power.

“We see that regularly,” admitted Deasy, noting that this creates constant pressure on the company’s own software engineering teams to produce more, faster.

The Melbourne event also highlighted real-world deployments, featuring presentations from three major Australian enterprises:

Bendigo Bank

Sam Fursdon, principal AI engineer at Bendigo Bank, described how using the Confluent platform, including Confluent Flink, enabled the bank to significantly reduce the mainframe load generated by its open banking obligations and its mobile-only subsidiary, Up. Importantly, Confluent Flink integrated seamlessly with Bendigo’s continuous integration and continuous deployment (CI/CD) patterns, creating a “well-orchestrated deployment pipeline.”

By combining transaction and balance data in Confluent, the bank reduced mainframe application programming interface (API) calls by 50%. The overnight batch processing backlog was cleared, and batch runs now complete by 6am.

Furthermore, the average end-to-end latency between a transaction occurring and the information becoming available for consumption is now just 2.3 seconds during business hours. In practice, this means an ATM user can receive an app notification before the cash has even been dispensed.
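The pattern Fursdon describes, materialising transaction and balance data in the stream so consumers no longer call the mainframe for each lookup, can be sketched in plain Python. This is a simplified stand-in for what a Flink job would do with keyed state; the event shapes and field names are illustrative assumptions, not Bendigo Bank’s actual schema.

```python
# Minimal sketch of a keyed stream join: maintain per-account balances from a
# change stream and enrich each transaction with the latest known balance, so
# downstream consumers read one enriched event instead of making two mainframe
# API calls (one for the transaction, one for the balance).

class BalanceEnricher:
    def __init__(self):
        self.balances = {}  # keyed state: account_id -> latest balance

    def on_balance_update(self, event):
        """Apply a balance-change event to keyed state."""
        self.balances[event["account_id"]] = event["balance"]

    def on_transaction(self, event):
        """Enrich a transaction event with the latest known balance."""
        return {
            **event,
            "balance": self.balances.get(event["account_id"]),
        }


enricher = BalanceEnricher()
enricher.on_balance_update({"account_id": "A1", "balance": 250.0})
enriched = enricher.on_transaction(
    {"account_id": "A1", "amount": -50.0, "type": "atm_withdrawal"}
)
print(enriched)  # the current balance travels with the transaction event
```

In a real deployment this state would live inside a stream processor such as Flink, with the enriched events published back to a topic for any consumer, including the mobile app notification mentioned above, to read without touching the mainframe.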

Telstra

Telstra uses Confluent to improve its mobile network and customer experience. Quoting a colleague, Javed Bolim, Telstra’s technology product owner for observability, said: “You need to see it to action it.”

Events are captured continuously and streamed in real time for analysis. This enables early detection of issues before customers notice, provides richer context by correlating more signals, and, thanks to the system’s scalability, allows new use cases to be added without reengineering data flows.

Data streams are filtered, enriched, and stored in a database for multiple uses. These include service assurance at major events, such as Boxing Day test matches at the Melbourne Cricket Ground (MCG), and proof of value, ensuring customers get what they pay for. Bolim added that new products and features enabled by this capability are currently in the pipeline.
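The filter-and-enrich step described here can be illustrated with a small Python sketch. The event fields, severity threshold, and site lookup are assumptions for illustration, not Telstra’s actual schema.

```python
# Sketch of a filter-enrich stage over a stream of network events:
# drop low-severity noise, attach site context from a lookup table,
# and yield the enriched events for storage and downstream use.

SITE_CONTEXT = {"MCG": {"region": "VIC", "venue_type": "stadium"}}  # enrichment lookup

def process(events, min_severity=3):
    """Filter out events below the threshold, enrich the rest with site context."""
    for event in events:
        if event["severity"] < min_severity:
            continue  # filtered: below the alerting threshold
        yield {**event, "context": SITE_CONTEXT.get(event["site"], {})}

events = [
    {"site": "MCG", "severity": 5, "metric": "packet_loss"},
    {"site": "MCG", "severity": 1, "metric": "heartbeat"},
]
print(list(process(events)))  # only the severity-5 event survives, with context attached
```

Because filtering and enrichment happen once in the stream, each new use case, whether service assurance at a major event or proof of value reporting, can consume the same enriched data without its own pipeline.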

Bolim advised other IT practitioners to focus on developing the right skills among team members and securing business buy-in when difficult choices must be made. He also recommended exceeding intermediate goals and developing shared capabilities, such as providing self-service access for other internal teams.

Coles

Supermarket chain Coles faced challenges stemming from the use of dozens of disparate event-based systems across various parts of the organisation. “The stuff was everywhere,” said principal engineer Simon Bedford. This sprawl led to duplication, additional costs, operational friction, and the need for a security redesign.

Coles invested heavily in deploying Confluent as a true enterprise platform, incorporating tooling, monitoring, observability, data products, and discoverability. Despite the scale of the deployment, it did not require a large team, relying on just three to five people at various stages of the project.

The initiative provided an opportunity to start afresh and enforce architectural principles, including consistent naming conventions, clear ownership boundaries, and automated provisioning.

Governance must be baked in and automated rather than manual, Bedford advised. He noted that the platform is treated “like an internal software-as-a-service [SaaS],” with developers acting as the customers. Because developers will typically work around anything they find overly restrictive, it was important to get the user experience right, making education a critical part of the process. Ultimately, providing a strong developer experience, including self-service provisioning through GitOps and CI/CD integration, resulted in strong internal adoption.
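Bedford’s point about baking governance into automation rather than enforcing it manually suggests validating requests in the provisioning pipeline itself. Below is a hypothetical sketch of such a check; the naming scheme and request fields are assumptions, not Coles’ actual conventions.

```python
import re

# Hypothetical topic-naming convention: <domain>.<dataset>.v<N>, lowercase,
# dot-separated -- checked automatically in CI before a topic is provisioned,
# so governance is enforced by the pipeline rather than by manual review.
TOPIC_PATTERN = re.compile(r"^[a-z][a-z0-9_]*\.[a-z][a-z0-9_]*\.v\d+$")

def validate_topic_request(request):
    """Return a list of governance violations for a provisioning request."""
    errors = []
    if not TOPIC_PATTERN.match(request.get("name", "")):
        errors.append("topic name must match <domain>.<dataset>.v<N>")
    if not request.get("owner"):
        errors.append("every topic needs a clearly owned team")
    return errors

print(validate_topic_request({"name": "sales.orders.v1", "owner": "team-x"}))  # []
print(validate_topic_request({"name": "Orders", "owner": ""}))  # two violations
```

In a GitOps workflow of the kind Bedford describes, a check like this would run on every pull request that declares a new topic, rejecting non-compliant requests before anything is provisioned.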

“We put in a lot of work to make sure [observability and monitoring] are enterprise-grade,” he added, noting that cost attribution is now achieved through telemetry.

As a result, “chaos and complexity” have been replaced with “structure and efficiency,” and improved data discoverability has led to high levels of reuse across the business.

“The platform is now easier than the alternatives and is more reliable,” said Bedford. Because it is trusted, it is widely used, delivering business outcomes that include faster time to market, reduced integration costs, and improved customer responsiveness. “We’ve got high-quality data, [and] we want to look at how we can use it for AI.”
