CWDN series: Dev-eXperience – Confluent: Building a cohesive frame of reference 

This is a guest post for the Computer Weekly Developer Network written by Kai Waehner in his capacity as global field CTO, Confluent.

Confluent is described as a full-scale data streaming platform that enables users to access, store and manage data as continuous, real-time streams. Built by the original creators of Apache Kafka, Confluent expands the benefits of Kafka with enterprise-grade features, while removing the burden of Kafka management and monitoring.

Waehner writes in full as follows…

So then, the question is… should Developer eXperience (DX) focus on software application development tools – or should it equally examine which APIs, SDKs, libraries and dependencies are at play?

Just as the volume of enterprise data is skyrocketing, so are the options developers and engineers have for engaging with it.

With developers now able to take advantage of technologies like microservices architectures and data mesh, there’s more scope to introduce solutions for a particular need at a set point in the DX journey. From an environmental perspective (coding environment, not ecological planetary environment – although okay, that too) to specific tools, modern DX is the embodiment of the experience that those solutions create.

A cohesive frame of reference 

It’s not just about application development tools, but rather about identifying and deploying the right combination of specific technologies, APIs and SDKs for a problem. From there, you can combine them into a cohesive frame of reference for your developers.

This is a unique challenge for every company, as the DX environment must reflect the skills, intentions and size of the development team – as well as the project in question.

Systematic system standardisation

According to Confluent’s Data Streaming Report 2023, ‘system standardisation’ is a high priority for 45% of UK IT decision-makers – and the building blocks for that consistency are laid by the decision-makers, engineers and developers who are responsible for DX.

That means there’s a balance to be struck between flexibility and simplicity: providing specialist experts with the specialised tools they need without sacrificing accessibility for less experienced team members.

For example, low-code/no-code tools are great for some, but others want to (or have to) write low-level code. You don’t want to bar low-level coders from their craft, but nor do you want the obligation to write low-level code to act as a barrier to progress or innovation. The suite of solutions at a developer’s disposal, creating that DX journey, should be able to offer both.

Confluent’s Waehner: The building blocks for standardisation & consistency are laid by the decision-makers, engineers & developers who are responsible for DX.

It’s also important to remember that some will go off-script and carve their own DX journey if certain avenues aren’t available to them.

Driven to shadow IT 

For example, while a DevOps engineer building enterprise software often prefers Java or C++, a data scientist will almost always use Python. If their language of choice is forbidden, the scientist may be driven to shadow IT measures – systems that exist outside the confines of the business’s tech stack.

With Gartner research suggesting that 41% of employees acquired, modified or created technology outside of IT’s visibility in 2022, this is a serious issue – and DX offers a solution. The versatility of a cohesive DX should drive down shadow IT by allowing these systems to exist within the confines of the company.

Hence, DX should enable the right consumption of technologies via a best-of-breed approach. 

It makes simple business sense to offer a broad spectrum of interfaces for connecting to other systems or writing custom applications. This includes a visual tool such as Stream Designer, client APIs for Java, C++, Go and Python, REST APIs and connectors to many databases and applications.
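The idea underneath all of those interfaces is the same: an append-only stream of events that many independent clients can read at their own pace. As a minimal sketch (a toy, stdlib-only Python model for illustration – not Confluent’s actual client API; the `Topic` and `Consumer` names here are invented), it might look like this:

```python
from dataclasses import dataclass, field
from typing import Any, List


@dataclass
class Topic:
    """A toy append-only event log, standing in for a Kafka-style topic."""
    events: List[Any] = field(default_factory=list)

    def produce(self, event: Any) -> int:
        """Append an event and return its offset in the log."""
        self.events.append(event)
        return len(self.events) - 1


@dataclass
class Consumer:
    """A reader with its own offset, so each client consumes independently."""
    topic: Topic
    offset: int = 0

    def poll(self) -> List[Any]:
        """Return all events published since the last poll, then advance."""
        new = self.topic.events[self.offset:]
        self.offset = len(self.topic.events)
        return new


orders = Topic()
dashboard = Consumer(orders)   # in practice, perhaps a Java service
notebook = Consumer(orders)    # in practice, a data scientist's Python script

orders.produce({"order_id": 1, "amount": 42.0})
orders.produce({"order_id": 2, "amount": 13.5})

print(len(dashboard.poll()))   # both consumers see both events
print(len(notebook.poll()))
```

Because each consumer tracks its own offset, a Java service, a Go worker and a Python notebook can all read the same stream without coordinating with one another – which is precisely why offering many client interfaces over one platform avoids fragmenting the data itself.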
