
Data streaming matures, cultural shift is key

Confluent’s Tim Berglund warns that a cultural transformation towards creating valued data products is vital even as the technology for data streaming is maturing

The developer community is “getting its feet under [it]” with data streaming, but the transition will require organisations to create well-managed data products, Confluent’s vice-president of developer relations, Tim Berglund, told attendees at the company’s Data Streaming World event in Melbourne this month.

Noting that transitions such as the current move towards data streaming are difficult due to the steep learning curve, Berglund said that “if it seems like a lot of stuff is in transition, then it is”. However, he added that application development is stabilising and that “we’re starting to get this figured out”.

While Apache Kafka is the de facto standard for data streaming, Berglund cautioned that the ease with which it enables point-to-point integrations can create a mess, potentially complicating analytics and artificial intelligence (AI) projects. Instead, he advocated using the Kafka ecosystem to build discoverable, contextualised, trustworthy and reusable data products.
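As a rough illustration of what a schema-backed data product can look like in practice, the sketch below publishes records to a topic whose structure is enforced through a schema registry, using the confluent-kafka Python client. The topic name, schema and connection details are illustrative assumptions, not an example given at the event.

```python
# Minimal sketch: publishing to a schema-backed "data product" topic.
# Topic name, schema and endpoints are hypothetical placeholders.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import SerializationContext, MessageField

ORDER_SCHEMA = """
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "order_id", "type": "string"},
    {"name": "amount",   "type": "double"},
    {"name": "currency", "type": "string"}
  ]
}
"""

schema_registry = SchemaRegistryClient({"url": "http://localhost:8081"})
serializer = AvroSerializer(schema_registry, ORDER_SCHEMA)
producer = Producer({"bootstrap.servers": "localhost:9092"})

order = {"order_id": "A-1001", "amount": 42.50, "currency": "AUD"}
producer.produce(
    "orders.v1",
    value=serializer(order, SerializationContext("orders.v1", MessageField.VALUE)),
)
producer.flush()
```

Because the schema lives in the registry rather than in each consuming application, downstream teams can discover and reuse the topic without a point-to-point agreement with the producing team.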

This shift, he argued, is less a technology change than a cultural transformation in which individuals create properly formed data products as an act of organisational citizenship. He stressed that while modern infrastructure components such as data streaming, compute and governance are necessary, they are not sufficient without people caring about the data products they create.

“The customers have become producers,” he said, meaning the data a team creates is now consumed elsewhere in the organisation, which calls for a craft-like approach to building data products. Confluent plans to offer more guidance to help internal champions drive this change.

Berglund highlighted the need to “shift left” on governance and security, applying them close to the data source rather than attempting to clean up a data lake. Using a water analogy, he likened raw data to dirty water, cleansed data to potable water, and business-ready data to flavoured water.
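The sketch below illustrates the “shift left” idea in its simplest form: records are checked at the producer, before they enter the stream, with rejects routed to a quarantine topic rather than being cleaned up later in a data lake. The field names, rule and topic names are hypothetical.

```python
# Sketch of shift-left validation: quality checks run at the source,
# before records are published. Fields, rule and topics are invented.
import json
from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})

REQUIRED_FIELDS = {"customer_id", "event_time", "amount"}

def publish_if_clean(record: dict) -> bool:
    """Publish only records that pass source-side quality checks."""
    if not REQUIRED_FIELDS.issubset(record) or record["amount"] < 0:
        # Route rejects to a quarantine topic instead of the main stream.
        producer.produce("payments.quarantine", value=json.dumps(record).encode())
        return False
    producer.produce("payments.v1", value=json.dumps(record).encode())
    return True

publish_if_clean({"customer_id": "C42", "event_time": "2025-06-01T09:00:00Z", "amount": 19.99})
producer.flush()
```

In the water analogy, the check happens at the tap: only “potable” records reach the main topic, so downstream consumers never see the dirty water.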

He also noted that technologies such as Tableflow, which represents Kafka topics as Iceberg or Delta Lake tables, help unify the operational and analytical estates, a task that is as much about unifying people as it is about technology. “Kafka’s a big part of that,” he said, with the technology acting as a universal data substrate.
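As a hedged sketch of the analytical side, once a topic has been materialised as an Iceberg table it can be read with standard tooling such as PyIceberg; the catalog settings and table name below are assumptions for illustration only.

```python
# Sketch: querying a Kafka-derived Iceberg table with PyIceberg.
# Catalog configuration and table name are hypothetical.
from pyiceberg.catalog import load_catalog

catalog = load_catalog(
    "analytics",
    **{
        "uri": "https://iceberg-catalog.example.internal",  # hypothetical REST catalog
        "token": "REDACTED",
    },
)

orders = catalog.load_table("streaming.orders_v1")

# Pull a filtered slice of the table into pandas for analysis.
df = orders.scan(row_filter="currency = 'AUD'").to_pandas()
print(df.head())
```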

Adoption of the data product mindset is still at an early stage, Berglund conceded, but “it’s real, it’s happening, and the necessary technological components are there”.

Several major Australian organisations also shared their data streaming journeys at the event.

ASX: Sumit Pandey, ASX’s senior manager for data and integration, explained that the stock exchange is undergoing a major technology change to support growth in trade volumes and new data-based products. The goal is stable, reliable connectivity (zero data loss, 99.95% uptime, two-hour recovery time objective) across on-premises and cloud systems using Kafka.
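In Kafka terms, a zero-data-loss target usually translates into durability-focused settings such as acknowledgement from all in-sync replicas and idempotent producers. The sketch below shows such settings with the confluent-kafka Python client; the broker addresses, topic and values are illustrative and do not describe ASX’s actual configuration.

```python
# Sketch of durability-oriented producer settings implied by a
# zero-data-loss target. All values are illustrative.
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "broker-1:9092,broker-2:9092,broker-3:9092",
    "acks": "all",                 # wait for all in-sync replicas to acknowledge
    "enable.idempotence": True,    # avoid duplicates on producer retries
    "retries": 2147483647,         # keep retrying rather than dropping records
    "delivery.timeout.ms": 120000, # bound how long a record may stay unacknowledged
})

def on_delivery(err, msg):
    # Surface failed deliveries so they can be retried or alerted on.
    if err is not None:
        print(f"Delivery failed for {msg.topic()}: {err}")

producer.produce("trades.v1", value=b'{"trade_id": "T-1"}', on_delivery=on_delivery)
producer.flush()
```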

ASX adopted Kafka in 2019 and selected Confluent in 2024 for its comprehensive offerings, noting that Confluent’s cluster linking capability generated a 20-30% saving in its first two years. The project, currently in development and testing, aims for its first go-live in the fourth quarter of 2025, testing applications for 20 million trades per day.

ANZ Bank: Louisa Leung, ANZ’s domain architect for integration, detailed the bank’s need for low-latency analytics, such as sub-second fraud detection. ANZ has adopted an event-driven architecture with an event mesh across cloud and on-premises datacentres. This eliminates point-to-point configurations and accelerates delivery. However, Leung warned, “it’s not simple”, emphasising the need for agreed data standards and sufficient test data to avoid masking latency issues.
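A sub-second fraud check typically boils down to a tight consume-evaluate-alert loop on the event stream. The sketch below shows the shape of such a loop in Python; the topics, threshold and rule are invented for illustration and do not describe ANZ’s pipeline.

```python
# Minimal consume-evaluate-alert loop for low-latency checks.
# Topics, rule and threshold are hypothetical.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "fraud-check",
    "auto.offset.reset": "latest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["card.transactions"])

try:
    while True:
        msg = consumer.poll(timeout=0.1)   # short poll keeps end-to-end latency low
        if msg is None or msg.error():
            continue
        txn = json.loads(msg.value())
        # Toy rule: flag unusually large transactions for review.
        if txn.get("amount", 0) > 10_000:
            producer.produce("fraud.alerts", value=json.dumps(txn).encode())
            producer.poll(0)               # serve delivery callbacks without blocking
finally:
    consumer.close()
```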

Bendigo Bank: Dom Reilly, Bendigo’s service owner of databases and middleware, discussed using Confluent to optimise costs through internal chargeback. A Splunk-based dashboard tracks usage of resources such as schemas, storage and managed connectors, applying a cost function to encourage good behaviour. “We’re sending a cost signal and getting good behaviour as a result,” Reilly said, noting that users are now deleting unused resources and forecasting costs more accurately. Formal chargeback begins in July.
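A chargeback model of this kind can be as simple as a cost function applied to per-team usage metrics. The sketch below shows the idea; the unit rates and resource categories are invented for illustration, as Bendigo’s actual rates were not disclosed.

```python
# Sketch of a chargeback cost function over per-team usage metrics.
# Unit rates and categories are hypothetical.
UNIT_RATES = {
    "schemas": 0.50,      # dollars per registered schema per month
    "storage_gb": 0.10,   # dollars per GB retained per month
    "connectors": 25.00,  # dollars per managed connector per month
}

def monthly_charge(usage: dict) -> float:
    """Return the monthly cost signal for one team's usage figures."""
    return sum(UNIT_RATES[resource] * amount for resource, amount in usage.items())

team_usage = {"schemas": 40, "storage_gb": 1200, "connectors": 3}
print(f"Monthly charge: ${monthly_charge(team_usage):,.2f}")  # Monthly charge: $215.00
```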

Virgin Australia: Nick Mouton, the airline’s integration platform lead, described real-time data as fundamental for faster decisions and improved customer experience, including fleet visibility, baggage tracking, and automated passenger rebooking. Kafka provided scalability, while Confluent added enterprise-grade features, boosting developer productivity. Mouton advised starting small with a good-fit project, though Berglund, in a fireside chat with Mouton, stressed delivering value early rather than perfecting foundational work first.

Livestock Improvement Corporation (LIC): Vik Mohan, LIC’s principal technologist, explained how the New Zealand company developed a unified stream processing system for operational and analytical use, combining data from previously disconnected systems. This enables data-driven advice on breeding, health, and productivity from milk analyses, genetic data and cattle collars. Confluent provides a single, governed view of this data. Mohan highlighted the mindset change required, noting the challenge of aligning software and data engineers’ perspectives. The benefits include enhanced data accessibility, improved collaboration and faster time-to-market.
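One way to combine previously disconnected feeds is to re-key events from each source onto a single governed topic. The sketch below shows that pattern in Python; the topic names, fields and key are assumptions for illustration, not LIC’s design.

```python
# Sketch: unifying several source feeds onto one governed, keyed topic.
# Topic names, fields and key are hypothetical.
import json
from confluent_kafka import Consumer, Producer

SOURCES = ["milk.analyses", "genetics.results", "collar.telemetry"]

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "herd-unifier",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(SOURCES)

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        unified = {"source": msg.topic(), **event}
        # Key by animal ID so all events for one animal land in the same partition.
        producer.produce("herd.events.v1",
                         key=str(event.get("animal_id", "unknown")).encode(),
                         value=json.dumps(unified).encode())
        producer.poll(0)
finally:
    consumer.close()
```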
