The Advantages of AI-Powered Telemetry Data Pipelines over Traditional Pipelines
Introduction – The Steel Driving Man
History is filled with stories of human triumph. One of the most famous such stories is that of John Henry, “The Steel Driving Man.” As the traditional American folk story goes, John Henry and his fellow workers were faced with the arrival of the steam engine, which threatened to replace their manual labor. To prove that human strength and skill could outperform the new technology, John Henry challenged the machine to a contest. With his mighty hammer, he drove steel spikes into the rock at an astonishing pace, racing against the steam engine.
In the end, John Henry emerged victorious, having driven more spikes than the machine. However, his triumph came at a great cost. Exhausted from the superhuman effort, John Henry collapsed and died with his hammer in his hand, a symbol of man’s triumph over machines that has endured for over a hundred years.
This reminds me of the debate some Security and DevOps teams are having about using a telemetry pipeline to optimize the data they need to do their jobs. None of these groups would dismiss technology outright; it is, after all, the realm to which they’ve dedicated their careers. Almost all of them rely on AI in the analytics platforms they use to gain insights about the environments they pledge to protect. But many are still using static, rules-based tools to optimize, shape, and deliver data to those platforms.
Humans Can’t Do What AI Does
Unlike the machine in the story of John Henry, machine learning really is far more capable of analyzing patterns and optimizing data than humans are. Even the most experienced Security and DevOps pros can’t compare thousands of events at the same time across hundreds of different vectors. So if your telemetry pipeline is a long list of human-defined rules and filters, it is undoubtedly missing opportunities to further refine and optimize data before it is routed to an analytics platform.
Like John Henry’s heroic effort, human-designed data optimization is grueling and time-consuming. Building, tuning, and maintaining pipelines by hand takes time away from other critical tasks. AI-powered pipelines like Observo AI’s deploy in minutes and tune themselves as they constantly learn from your data.
Onboarding New Data Types
Observability and Security analysis require looking at a wide range of data types to be effective. Each new data type can tell you more about the state of your environment. Unfortunately, many teams are forced to make choices about which data they can afford to analyze (more on this later).
Some data types are deemed too difficult to transform into a shape suitable for analysis by a SIEM or logging tool. Many security and observability data pipeline solutions solve this by hand-creating patterns that convert one data type into the schema of a particular tool. This works for major data-type and destination combinations as long as the format doesn’t change on either side. In reality, these schemas change from time to time, forcing workarounds and constant maintenance to keep each data-type and destination pair up to date.
Observo AI uses machine learning models to dynamically transform data schemas, and because these transformation models are always learning, they can automatically adjust to schema changes on either side.
This is very useful for established data-type and destination pairs, but using AI before data is ever ingested into an analytics tool can also help onboard new or unknown pairings. We can typically transform an unknown data type for any destination within a few days of model training. Without AI, the same work requires deep expertise in both the source and destination schemas and takes considerably more time: every pair needs its own long, unique set of rules and filters to transform data from one schema to another. This is another example of machine learning being vastly superior to rules hand-crafted by humans.
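To make the contrast concrete, here is a minimal sketch of what a single hand-crafted mapping looks like for one data-type and destination pair. All field names here are hypothetical; a real rules-based pipeline would need a much longer mapping like this for every pair, and would need it rewritten whenever either schema changed.

```python
# A toy, hand-crafted schema mapping of the kind a rules-based pipeline
# needs for every (source, destination) pair. Field names are hypothetical.
FIREWALL_TO_SIEM = {
    "src":    "source.ip",
    "dst":    "destination.ip",
    "action": "event.action",
    "ts":     "@timestamp",
}

def transform(event: dict, mapping: dict) -> dict:
    """Rename fields according to a fixed mapping; anything unmapped is dropped."""
    out = {}
    for src_field, dest_field in mapping.items():
        if src_field in event:
            out[dest_field] = event[src_field]
    return out

raw = {"src": "10.0.0.5", "dst": "8.8.8.8", "action": "ALLOW", "ts": "2024-05-01T12:00:00Z"}
print(transform(raw, FIREWALL_TO_SIEM))
# {'source.ip': '10.0.0.5', 'destination.ip': '8.8.8.8',
#  'event.action': 'ALLOW', '@timestamp': '2024-05-01T12:00:00Z'}
```

Every new pair means another static mapping like this, and every schema change on either side silently breaks it, which is precisely the maintenance burden a learning model avoids.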
Data Optimization of 80% or More
The data used by Security and DevOps teams for analysis is growing as much as 25% to 35% a year. Teams are forced to make decisions about which data they can afford to fit into their budget, often driven by daily ingest limits. As we discussed earlier, these teams perform better when they analyze the broadest set of relevant data to understand how their environments are operating.
Observability pipelines can be an effective tool to reduce data volume before it’s ingested by an analytics tool. Static, rules-based pipelines, when properly tuned and constantly maintained by someone with deep domain expertise, can typically reduce data volume by 30-40%. Observo AI uses deep learning models to achieve 80% or more data reduction, depending on the data type (some types are much more verbose than others – think firewall data and VPC Flow Logs).
Machine learning is both highly effective and accurate at extracting patterns from streaming data, and it constantly learns from evolving patterns to improve. By performing anomaly detection in the telemetry stream, Observo AI can separate routine, normal data from data that is much more interesting from an analytical perspective.

One of the primary advantages of using AI to reduce data volumes is a feature called Smart Summarization, which groups normal data together and summarizes it for dramatic data reduction. An easy-to-understand example is firewall data: if 50 events in a row are marked as “ALLOW”, Smart Summarization can combine them into a single event. While a run of firewall events marked “ALLOW” is a pattern humans can comprehend, Smart Summarization can also combine normal data based on patterns that most humans cannot easily recognize. Machine learning describes individual events in a large data set across hundreds of different factors; plotting these in a multi-dimensional space allows events to be grouped by similarity. Anomalies are furthest from the crowd of normal data – a sophisticated game of “Which one of these things is not like the others?” Normal data is important to track but may not have much analytical value. Summarization doesn’t just sample it out; it records how many events the summary covers, over what time frame, and how much data is included.
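As a rough illustration of the firewall example above, here is a minimal sketch of run-based summarization. It is a toy stand-in for Observo AI’s model-driven Smart Summarization, and the event fields (action, ts, bytes) are assumptions made for the example.

```python
def summarize_allows(events):
    """Collapse runs of consecutive ALLOW events into single summary events.
    Denied or otherwise interesting events pass through individually."""
    out, run = [], []
    for ev in events:
        if ev["action"] == "ALLOW":
            run.append(ev)
            continue
        if run:                      # a run of normal events just ended
            out.append(make_summary(run))
            run = []
        out.append(ev)               # anomalous event passes through untouched
    if run:
        out.append(make_summary(run))
    return out

def make_summary(run):
    """One event standing in for a whole run: count, time window, byte total."""
    return {
        "action": "ALLOW_SUMMARY",
        "count": len(run),
        "first_seen": run[0]["ts"],
        "last_seen": run[-1]["ts"],
        "total_bytes": sum(ev["bytes"] for ev in run),
    }

# 50 consecutive ALLOW events plus one DENY collapse to just 2 events.
events = [{"action": "ALLOW", "ts": t, "bytes": 120} for t in range(50)]
events.append({"action": "DENY", "ts": 50, "bytes": 0})
print(len(summarize_allows(events)))  # 2
```

The real feature groups events by learned similarity across many factors rather than a single field, but the shape of the output is the same: one summary event that preserves the count, window, and volume of the normal data it replaces.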
By reducing each event to only the essential data and summarizing normal data, Observo AI can achieve data reduction that is as much as double what you might expect from a human-designed list of rules and filters. Doing this analysis before data is ingested into an analytics index can have a huge impact on reducing infrastructure (storage, egress, compute) costs today and help you control the growth of your next license. It also helps you add a wide variety of data types to your existing license so you have a more complete picture of your environment.
Enrich Data with Sentiment Analysis in the Stream
Anomaly detection is not only good at finding and summarizing normal data; it is also an excellent way to prioritize alerts before they enter an analytics tool. Observo AI uses a process called Sentiment Analysis to label events as positive or negative. Security teams field hundreds or thousands of alerts a day, many of which just aren’t that interesting. A hundred different users mistyping a password at a hundred different IP addresses is something to keep track of, but it probably shouldn’t consume much of your security team’s time. If a single IP address has a hundred failed password attempts, however, that may indicate someone attempting an attack.
Humans can tell the difference between these two scenarios, but doing so requires a lot of time spent on the individual password failures that aren’t that interesting. Sentiment Analysis can label the individual password attempts as Positive sentiment and the potential attack as Negative sentiment. Again, this is a simplified example; Sentiment Analysis can be applied to much more complex patterns of behavior that humans simply cannot recognize.
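Here is a minimal sketch of that password-failure example, using a simple per-IP count with a hypothetical threshold in place of the model-driven scoring Observo AI actually performs. The field names and the threshold value are assumptions for illustration only.

```python
from collections import Counter

# Hypothetical threshold: many failures from one source looks like an attack.
FAILURE_THRESHOLD = 20

def label_sentiment(failed_logins):
    """Tag each failed-login event based on how many failures its source IP
    produced. A toy, threshold-based stand-in for model-driven scoring."""
    per_ip = Counter(ev["src_ip"] for ev in failed_logins)
    for ev in failed_logins:
        suspicious = per_ip[ev["src_ip"]] >= FAILURE_THRESHOLD
        ev["sentiment"] = "negative" if suspicious else "positive"
    return failed_logins

# 100 users mistyping once each from distinct IPs: all positive.
scattered = [{"src_ip": f"10.0.0.{i}", "user": f"user{i}"} for i in range(100)]
# 100 failures from a single IP: all negative.
focused = [{"src_ip": "203.0.113.7", "user": f"user{i}"} for i in range(100)]

labeled = label_sentiment(scattered + focused)
print(sum(1 for ev in labeled if ev["sentiment"] == "negative"))  # 100
```

The production feature learns which behavioral patterns deserve a negative label rather than relying on a fixed count, but the payoff is the same: only a small, prioritized subset of events carries the negative tag downstream.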
By labeling a much smaller subset of events with negative sentiment, security teams narrow their focus to potentially critical events and deal with less likely culprits later. By prioritizing their work in this way, we have customers who have seen a 40% or better improvement in the time to identify and resolve critical incidents. This is a huge advantage for DevOps and Security teams and is simply not possible without using AI in the telemetry stream.
The Myth of Human Superiority and the Need for Human-Machine Collaboration
I realize, as I type this, that this notion might not be popular. Movies like “WarGames,” “The Matrix,” and “Terminator” warn of the dangers of unchecked machines and celebrate the ultimate triumph of the human spirit. But even in “Terminator 2,” humans work with machines to defeat the big bad malevolent force. The same is true for AI-powered technology like Observo AI.

We know you wouldn’t just take our word for it and subject your critical data to a black box that we assure you is doing what you want it to do. That’s why our tools offer deep transparency into what’s going on behind the scenes. Observo AI shows before, after, and delta values for any transformation you choose to employ – and you can use any mix of transformations for your data. We conduct carefully planned proof-of-concept projects to assure you that we aren’t removing any of the good stuff before routing to a destination.

Our tools are just that – tools. They work with your teams to improve their productivity and offer them insights that just aren’t possible for humans to uncover by themselves. Importantly, all of this happens before you pay to ingest data into an analytics index, saving time and money, helping you build a more complete picture by adding diverse data types, and giving you confidence that the models are constantly learning and improving. They also allow you to manually add filters if you have something specific in mind or just want finer-grained control.
These are just a few of the ways we use AI to supercharge your observability and security efforts. We also automatically detect and mask sensitive data even when it ends up in unexpected places. Many rules-based tools can only find PII in fields labeled with sensitive categories; Observo AI can find this data in open-text fields (think voice-to-text records where PII is requested to verify identity). Observo AI also allows for Natural Language Search of data lakes, so you don’t have to master another query language or be a data scientist to uncover insights.
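To make the open-text masking idea concrete, here is a minimal, regex-based sketch. A pattern baseline like this only catches rigidly formatted identifiers, which is exactly the limitation model-driven detection overcomes; the patterns and the sample text are assumptions for illustration.

```python
import re

# Pattern-based masking for PII that appears in free-text fields.
# This regex baseline only catches rigidly formatted identifiers;
# it is shown purely to illustrate the problem, not Observo AI's approach.
PII_PATTERNS = {
    "ssn":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def mask_pii(text: str) -> str:
    """Replace each recognized identifier with a labeled redaction marker."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label.upper()}]", text)
    return text

transcript = "Caller verified identity with SSN 123-45-6789 and email jane@example.com."
print(mask_pii(transcript))
# Caller verified identity with SSN [REDACTED-SSN] and email [REDACTED-EMAIL].
```

A fixed pattern list can never anticipate PII spoken aloud in a call transcript or buried in an unusual field, which is where learned detection earns its keep.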
Wrapping Up
The story of John Henry is inspiring: man beats machine (although he ends up dead, and the railroads go on to expand with steam engines). It is also just that – a tall tale, an American folk story designed to inspire. Humans can’t do what machines can do, and machines can’t do everything people can do either. The best way forward is a combination of human expertise and ingenuity with artificial intelligence tools that can see far beyond what people can. If you are not using AI to tame security and observability data in the telemetry stream, you simply cannot achieve the same results. If you think otherwise, it’s just a tall tale – not reality.
To see how an AI-powered observability pipeline can take your security and DevOps efforts to the next level, schedule your demo today. You can also read our white paper, “Elevating Observability: Intelligent AI-Powered Pipelines.”