Streaming specialist Confluent aims to drive rise of ‘contextual AI’
Confluent, Inc. used its annual data practitioner convention this month to detail Confluent Intelligence, a service designed to build context-rich real-time artificial intelligence (AI).
Built on Confluent Cloud, this technology is a fully managed stack that continuously streams and processes historical and real-time data, delivering this context directly into AI applications.
According to MIT’s The State of AI in Business 2025 report, enterprises are pouring $30–$40 billion into generative AI, yet MIT estimates that as many as 95% of initiatives deliver zero return.
What is AI in context?
Confluent suggests that the biggest hurdle for these failing projects is context, i.e. a complete understanding of the events, relationships and meaning that off-the-shelf models need to reason effectively.
Getting this right demands infrastructure that can stream and process data continuously to evaluate what’s happened in the past, adapt to what’s happening in the moment and serve that information to AI applications without lag. This always-on cycle of evaluating, adapting and serving is needed to move beyond a simple chatbot and into production-level AI agents.
“We started Confluent to take on one of the hardest problems in data: helping information move freely across a business so companies can act in real time,” said Jay Kreps, co-founder and CEO at Confluent. “That same foundation positions Confluent to close the AI context gap. Off-the-shelf models are powerful, but without the continuous flow of data, they can’t deliver decisions that are timely and uniquely valuable to a business. That’s where data streaming becomes essential.”
Designed to enable real-time context-rich AI systems using Apache Kafka and Apache Flink, Confluent Intelligence provides what Kreps says is a “complete foundation” to launch and scale AI agents and applications. With built-in governance, low-latency performance and replayability, the technology streams structured context to any AI agent or application, whether built on Kafka and Flink or integrated externally via the Model Context Protocol (MCP).
Hands-off the backend
With the Real-Time Context Engine, teams can speed up AI initiatives by accessing real-time context and reliable data in one place, without ever touching Kafka or managing backend infrastructure.
With Streaming Agents, users can build, deploy and orchestrate event-driven agents natively on Flink – the service unifies data processing and AI reasoning for agents that observe, decide and act in real time without the need for manual inputs. By bringing agentic AI directly into stream processing pipelines, Streaming Agents helps teams deliver intelligent, context-aware automation across an enterprise.
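The event-driven agent pattern described here can be illustrated with a minimal sketch. This is not Confluent's API: the in-memory queue below is a hypothetical stand-in for a Kafka topic or Flink stream, and `agent_decide` is a toy rule standing in for the LLM reasoning step a real Streaming Agent would perform.

```python
from collections import deque

# Hypothetical stand-in for a Kafka topic; in production this would be
# a Flink source consuming a continuous event stream.
events = deque([
    {"user": "a", "amount": 40.0},
    {"user": "a", "amount": 45.0},
    {"user": "a", "amount": 900.0},  # anomalous spike
])

def rolling_context(history, window=2):
    """Return the most recent events an agent would use as context."""
    return history[-window:]

def agent_decide(event, context):
    """Toy 'reasoning' step: flag amounts far above the recent average.
    In a real streaming agent this decision would come from an LLM call."""
    if not context:
        return "ok"
    avg = sum(e["amount"] for e in context) / len(context)
    return "flag" if event["amount"] > 5 * avg else "ok"

history, decisions = [], []
while events:
    event = events.popleft()
    # Observe the event, decide using streamed context, then act
    # (here, acting is just recording the decision).
    decisions.append(agent_decide(event, rolling_context(history)))
    history.append(event)

print(decisions)  # ['ok', 'ok', 'flag']
```

The point of the pattern is that context (the rolling window) is maintained continuously inside the pipeline, so each decision is made against fresh state rather than a stale batch snapshot.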
Confluent also notes that it is making Claude by Anthropic the default natively integrated large language model (LLM) on its Streaming Agents technology. Anthropic’s reasoning models and Confluent’s real-time data foundation are positioned here as a means for enterprises to build adaptive, context-rich AI systems, such as advanced anomaly detection.
Partner views & more
As a Confluent partner, MongoDB has much to say on the subject of real-time data and the conduits through which we seek to manage it.
“What we see as a top priority right now is bringing the power of real-time data directly to where developers already work. Batch processing remains a critical part of a broader ecosystem – but AI-driven applications demand streaming context to act instantly and intelligently. The future is about seamlessly combining historical insight with continuous streams, so organisations can build systems that learn, respond, and improve in the moment,” said Kenny Gorman, head of streaming products, MongoDB.
Vast Data has plenty to say on the intersection of real-time data and its progression towards systems that we might define as AI-centric with inherently engineered contextual awareness. The company’s director of sales for UKI, Stuart Abbott, suggests that real-time systems only matter if the decisions they drive are reliable.
“What we’re seeing now is a convergence between data engineering and decision engineering; the infrastructure itself becoming a nervous system. As that happens, the winners won’t just be those who move data quickly, but those who can guarantee that the data is complete, consistent and ready to fuel intelligence of any kind,” said Abbott, speaking to the Computer Weekly Developer Network in line with Confluent Current 2025.
Further, he suggests that real-time has become “less about milliseconds” and “more about meaningfulness” so that the next wave of innovation won’t just deliver faster streams; it’ll give systems the context to reason and act responsibly in the moment.
“AI is one expression of that shift, but the deeper story is how enterprises design data flows that stay trusted, explainable, and adaptive over time,” said Abbott.
There’s more to discuss on real-time data and the debate here is (currently, unsurprisingly) centred on how and why the rise of AI has created an even more pressing need for it.
