Analyze-then-Store: The Journey to Continuous Intelligence – Part 5

A Technical Article Series for Data Architects

This multi-part article series is intended for data architects and anyone else interested in learning how to design modern real-time data analytics solutions.

It explores key principles and implications of event streaming and streaming analytics, and concludes that the biggest opportunity to derive meaningful value from data – and gain continuous intelligence about the state of things – lies in the ability to analyze, learn and predict from real-time events in concert with contextual, static and dynamic data.

This article series places continuous intelligence in an architectural context, with reference to established technologies and use cases in place today.

Part 5: Swim Enables Continuous Intelligence at Scale

Swim Continuum is the first open core enterprise platform to build and run continuous intelligence applications at scale.

It manages and automates the compute, persistence, security, scalability and operation of distributed real-time applications at the infrastructure layer.

At its core is SwimOS, a lightweight, vertically integrated, Apache 2.0-licensed distributed runtime for executing stateful, data-driven applications without requiring a database, message broker or other streaming framework (although it can easily interface with those).

Swim adopts a simple application architecture in which each data source is represented by a stateful, concurrent live object called a Web Agent (see figure 5).

Think of a Web Agent as similar to a digital twin, but one that can process its own data, compute state changes, execute complex business logic, evaluate predicates and even make predictions in real time, avoiding database round-trip delays.

 It can also react on behalf of its real-world data source, delivering responses in real-time.
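To make the idea concrete, here is a minimal conceptual model of a stateful agent in Python. This is an illustration only, not the SwimOS API: the class name, `on_event` method and the overheating predicate are all hypothetical.

```python
class WebAgent:
    """Conceptual model: holds its data source's state in memory
    and computes on each event (names are hypothetical, not SwimOS)."""

    def __init__(self, agent_id):
        self.agent_id = agent_id
        self.state = {}          # in-memory state: no database round-trip

    def on_event(self, event):
        """Process an incoming event, update local state, react at once."""
        self.state.update(event)
        return self.react()

    def react(self):
        """Business logic runs against local state, e.g. a simple predicate."""
        if self.state.get("temperature", 0) > 100:
            return "alert: overheating"
        return "ok"

sensor = WebAgent("sensor-42")
print(sensor.on_event({"temperature": 105}))  # alert: overheating
```

The point of the sketch is that state and logic live together in one concurrent object, so a response can be computed the moment an event arrives.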

Web Agents dynamically link to each other based on real-world relationships between the sources they represent, like containment or proximity, or even a computed relationship such as correlation.

As Web Agents link to each other they form a dynamic graph of context-rich associations between data sources at the most granular level possible.

Figure 5: Swim Web Agents (Source: Swim)

A link is like a subscription in event streaming: it allows concurrent, stateful Web Agents to share their in-memory states.

Web Agents dynamically make and break links based on changes in the real world as they process events.

The magic of linking is that it enables a Web Agent to compute concurrently and in real time, using both its own state and the states of the agents to which it is linked. This enables granular contextual analysis, learning and prediction, and an active response.

So the knock-on effects of changes to an entity in the real world are immediately visible, as state changes in its Web Agent, to all related (linked) Web Agents in the digital application domain.
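The make-and-break linking behavior can be sketched as follows. Again this is a toy Python model under assumed names (`Agent`, `link`, `set_state`), not the Swim runtime: linked peers observe each state change, and breaking a link stops the propagation.

```python
class Agent:
    """Toy model of linking: peers see each other's state changes."""

    def __init__(self, name):
        self.name = name
        self.state = {}
        self.links = set()
        self.seen = []   # state changes observed from linked agents

    def link(self, other):
        self.links.add(other)
        other.links.add(self)

    def unlink(self, other):
        self.links.discard(other)
        other.links.discard(self)

    def set_state(self, key, value):
        self.state[key] = value
        for peer in self.links:  # knock-on effect: linked peers see it
            peer.seen.append((self.name, key, value))

a, b = Agent("intersection-A"), Agent("intersection-B")
a.link(b)
a.set_state("phase", "red")
# b.seen now contains ("intersection-A", "phase", "red")
```

The key property mirrored here is that no polling or database query is involved: a change in one agent's in-memory state is pushed directly to the agents linked to it.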

Each Web Agent continuously computes on its own state and the states of all other Web Agents to which it is linked.

Computation can occur when an event arrives, at a per-agent scheduled time, according to triggers or windows, or when the state of a linked Web Agent changes.

Furthermore, Swim attaches sophisticated relational logic to links: they can perform complex database-style computations, concurrently and in real time, including maps, joins, reductions, aggregations and more.
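A reduction over links might look like the following hedged sketch: one agent computes an aggregate directly from the in-memory states of the agents it links to. The `SensorAgent`/`AggregateAgent` names are invented for illustration and do not come from Swim.

```python
from statistics import mean

class SensorAgent:
    """Leaf agent: holds one reading in memory."""
    def __init__(self, reading):
        self.state = {"reading": reading}

class AggregateAgent:
    """Aggregating agent: reduces over the states of its linked agents."""
    def __init__(self, linked):
        self.linked = linked

    def average(self):
        # a reduction (like AVG in SQL) computed over live in-memory state
        return mean(a.state["reading"] for a in self.linked)

sensors = [SensorAgent(r) for r in (10, 20, 30)]
agg = AggregateAgent(sensors)
print(agg.average())  # 20
```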

Swim builds the application-layer graph of Web Agents from the data itself, letting each agent analyze, learn and predict in real time on data that is granular and contextual (see figure 6).

  Each Web Agent streams the resulting state changes from its analysis to any application that subscribes to its streaming API.

  In the event streaming context, this lets Web Agents concurrently stream enriched and processed insights back to a broker, to a real-time UI, to end-user applications or storage.

Figure 6: Data Builds a Live Graph of Web Agents (Source: Swim)

The resulting graph is a bit like a “LinkedIn for Things”: Web Agents, which are like ‘intelligent digital twins’ of data sources, dynamically inter-link to form a graph based on real-world relationships.

These relationships can be specified by the developer (e.g., an intersection contains lights, loops and pedestrian buttons, all of which have the same lat/long), or derived from the data (e.g., intersection A is within 1000m of intersection B).
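The proximity case can be sketched in plain Python: derive links between agents whose sources lie within a radius of each other, using the standard haversine great-circle distance. The function names and the sample coordinates are illustrative assumptions, not part of Swim.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/long points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def derive_links(agents, radius_m=1000.0):
    """Link every pair of agents whose sources are within radius_m."""
    links = []
    for i, (ida, lata, lona) in enumerate(agents):
        for idb, latb, lonb in agents[i + 1:]:
            if haversine_m(lata, lona, latb, lonb) <= radius_m:
                links.append((ida, idb))
    return links

agents = [("A", 37.7749, -122.4194),
          ("B", 37.7760, -122.4190),
          ("C", 37.8044, -122.2712)]
print(derive_links(agents))  # A and B are ~130 m apart; C is far away
```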

Web Agents use the event stream to build the graph of links: as data sources report their status in a stream of events, the Web Agents that represent them are linked into the graph.

Linked Agents see each other’s state changes in real time.

Each Web Agent continuously analyzes, learns and predicts from its own state and the states of the other Web Agents it is linked to, and streams granular, contextual results to users and to other applications via the broker.

But there is a second use for Web Agents that is vital in the delivery of continuous intelligence: their role as concurrent, application-tier analytical objects that serve as the repository of both business logic and data processing. In our discussion of streaming analytics above, we mentioned the need to deliver continuously coherent, real-time materialized views of the system that reflect degrees of ‘zoom’ for a top-down management view.

Web Agents that hold materialized views of the state of the system may link to millions of other Web Agents, and continually compute over their states to derive KPIs or aggregate views that are of value to various top-down users.
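A hedged sketch of such a rollup, in the same toy Python style as above: a top-level view agent links to intersection agents, each of which derives a KPI from its own state, and the view continuously reduces over them. The `Intersection`/`CityView` names and the wait-time KPI are invented for illustration.

```python
class Intersection:
    """Mid-tier agent: derives a local KPI from its own state."""
    def __init__(self, waits):
        self.waits = waits            # per-vehicle wait times (seconds)

    def kpi(self):
        return sum(self.waits) / len(self.waits)

class CityView:
    """Top-down materialized view: links to many intersection agents
    and computes an aggregate KPI over their live states."""
    def __init__(self, intersections):
        self.links = intersections

    def kpi(self):
        return sum(i.kpi() for i in self.links) / len(self.links)

city = CityView([Intersection([30, 50]), Intersection([10, 30])])
print(city.kpi())  # 30.0
```

Each tier in the hierarchy is just another agent computing over its links, which is why adding a further ‘zoom’ level amounts to adding one more tier of linked agents rather than re-architecting the pipeline.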

Using the power of links, relationships, analysis and learning in an application services tier of Web Agents allows developers to easily add additional tiers of services to an already active continuous intelligence deployment.

To read parts 1 to 4 of this guest article series, please visit the Swim blog.

About the Author

Simon Crosby is CTO at Swim.

Swim offers the first open core, enterprise-grade platform for continuous intelligence at scale, providing businesses with complete situational awareness and operational decision support at every moment.

 Simon co-founded Bromium (now HP SureClick) in 2010 and currently serves as a strategic advisor.

Previously, he was the CTO of the Data Center and Cloud Division at Citrix Systems; founder, CTO, and vice president of strategy and corporate development at XenSource; and a principal engineer at Intel, as well as a faculty member at Cambridge University, where he led the research on network performance and control and multimedia operating systems.

Simon is an equity partner at DCVC, serves on the board of Cambridge in America, and is an investor in and advisor to numerous startups.

He is the author of 35 research papers and patents on a number of data center and networking topics, including security, network and server virtualization, and resource optimization and performance.

He holds a PhD in computer science from the University of Cambridge, an MSc from the University of Stellenbosch, South Africa, and a BSc (with honors) in computer science and mathematics from the University of Cape Town, South Africa.

