Key Takeaways
- Real-time data is no longer just about speed; it is about reacting to events and acting on data with confidence.
- Integrating AI with real-time systems is a key focus, but practitioners still need clarity on its practical application, cost, and architecture.
- Practitioners are prioritizing the practical aspects of real-time systems: pipeline maintenance, schema management, and cost control.
- The focus is shifting from moving data to creating reusable, trustworthy data products, with an emphasis on governance and engineering discipline.
- The streaming industry is maturing, taking a more cautious and reflective approach centered on responsible, efficient, and intelligent use of real-time data.
Last month, I attended StreamNative’s 2025 Data Streaming Summit, where one trend became unmistakably clear: the center of gravity in AI is shifting from offline model training to agentic AI running directly on real-time streaming systems. Instead of treating AI as a downstream consumer of data, the industry is embedding intelligence inside streaming pipelines so that agents can make decisions as events occur.
Here are my key takeaways from the event.
1. Agentic AI has moved from concept to infrastructure
A recurring theme across sessions was the rise of continuously adapting AI agents that subscribe to event streams, maintain context, and act without waiting for batch cycles. StreamNative’s Agent Engine preview made clear that agentic AI is now being designed as a first-class workload within streaming environments.
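To make the pattern concrete, here is a minimal sketch of an agent that subscribes to an event stream, maintains a bounded context window, and decides on every event rather than waiting for a batch cycle. The `StreamingAgent` class, its escalation rule, and the in-memory event list are all illustrative, not part of StreamNative's Agent Engine API.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class StreamingAgent:
    """Toy agent: keeps rolling context and decides per event, not per batch."""
    # bounded context window -- the agent "remembers" only recent events
    context: deque = field(default_factory=lambda: deque(maxlen=5))

    def on_event(self, event: dict) -> str:
        self.context.append(event)
        # decide immediately from current context instead of a batch cycle
        recent_errors = sum(1 for e in self.context if e["level"] == "error")
        return "escalate" if recent_errors >= 3 else "observe"

agent = StreamingAgent()
stream = [{"level": "info"}, {"level": "error"},
          {"level": "error"}, {"level": "error"}]
decisions = [agent.on_event(e) for e in stream]
print(decisions)  # the final event tips the context past the threshold
```

In a real deployment the `stream` list would be a subscription to a broker topic, but the shape of the loop is the same: event in, context updated, decision out.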
2. Streaming platforms are becoming AI execution layers
Vendors are repositioning streaming infrastructure from “data transport” to real-time AI compute environments.
Major announcements reinforced this shift:
- Ursa architecture claims up to 95% cost savings for real-time AI workloads
- Apache Flink 2.0 introduced stronger AI-native capabilities
- Databricks emphasized lakehouse-native support for live AI loops
The consistent message: streaming systems are evolving into the runtime layer for agentic AI.
3. Batch and streaming are converging into a unified data plane
The “streaming lakehouse” narrative appeared across sessions. Rather than separating historical and live data, platforms are collapsing that boundary so AI agents can reason across both real-time events and deeper historical context without architectural friction.
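A toy sketch of what "collapsing the boundary" means in practice: one query path that spans a historical table and live events, so the caller never has to know which plane a row came from. The row layout and `total_spend` helper are invented for illustration; real streaming lakehouses expose this through their query engines.

```python
# Historical rows (the "batch" table) and fresh events (the "live" stream).
historical = [{"user": "a", "spend": 10}, {"user": "b", "spend": 5}]
live = [{"user": "a", "spend": 3}]

def total_spend(user: str) -> int:
    """One query over both planes -- no separate batch and streaming code paths."""
    return sum(row["spend"] for row in historical + live if row["user"] == user)

print(total_spend("a"))  # combines the historical 10 with the live 3
```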
4. Deterministic real-time decisioning will be a differentiator
As agentic AI moves into live systems, guarantees around latency and correctness are becoming non-negotiable. General-purpose stream processors tend to lose determinism once AI hooks are introduced.
This is driving demand for runtimes that ensure:
- Sub-10ms decision loops
- Predictable behavior under concurrent load
- State isolation and backpressure control
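The requirements above can be sketched as a decision loop that tracks a fixed latency budget and applies backpressure instead of buffering without bound. This is a generic illustration, not Volt's implementation; the 10ms budget, `decide` stand-in, and queue-based backpressure are all assumptions for the sketch.

```python
import time
from queue import Queue, Full

BUDGET_S = 0.010  # illustrative sub-10ms decision budget

def decide(event: int) -> int:
    return event * 2  # stand-in for the real decision logic

def decision_loop(events, out: Queue) -> int:
    """Run each decision against the budget; stop consuming when downstream is full."""
    missed = 0
    for event in events:
        start = time.perf_counter()
        result = decide(event)
        if time.perf_counter() - start > BUDGET_S:
            missed += 1  # budget violations are counted, not silently absorbed
        try:
            out.put_nowait(result)
        except Full:
            break  # backpressure: refuse new work rather than buffer unboundedly
    return missed

q = Queue(maxsize=2)
missed = decision_loop([1, 2, 3, 4], q)
print(q.qsize(), missed)  # the bounded queue caps in-flight results at 2
```

The design choice worth noting is the `break` on a full queue: predictable behavior under load comes from refusing work at a known boundary, not from hidden buffering.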
These are exactly the requirements Volt was built to address for real-time decisioning in agentic workflows.
5. Ecosystem alignment is accelerating
Across vendors, there is a coordinated push toward embedding intelligence inside the stream fabric rather than adjacent to it. Rather than displacing that streaming infrastructure, complementary decisioning engines that execute mission-critical actions in-stream are becoming strategically important.
This alignment is already materializing in the market. During the summit, StreamNative and Volt announced a strategic partnership to unify high-throughput event streaming with deterministic millisecond decisioning, thereby formalizing the ingest-to-action architecture the event had been pointing toward.
Final Thoughts
DSS 2025 signaled that a structural shift is happening. The next phase of AI will not be defined by how well a model predicts, but by how safely and quickly AI agents can act inside real-time streaming environments. Real-time reasoning is becoming a runtime problem, not just a modeling problem.
To explore how deterministic real-time decisions can run directly inside streaming workflows, try the Volt for Streaming Decisions Trial.


