One of Cribl Stream's key selling points is reducing ingested log volume, which helps our customers control costs and improve system performance. This can be accomplished in two ways: by eliminating duplicate or unnecessary fields and null values within events, or by controlling which events actually get sent to destinations through strategic filtering. In today's blog post, I am going to show you how to do the latter in pipelines, using four of Cribl Stream's event-reducing functions.
Personally, I learn best when I have real-world examples to reference while building out my own solutions. So, to show the power of reduction, we'll use Apache web events and walk through how Cribl Stream can cut unneeded or license-consuming events with the following four functions.
In this scenario, Apache events are consuming too much license, and running searches to build statistics is degrading search performance. In the following pipeline, we will first use the Parser function to parse out the status field we need, then apply a reduction function based on the status code in each event:

- 2xx: Drop
- 3xx: Suppress
- 4xx: Sampling
- 5xx: Aggregate
Sample logs:
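To make the walkthrough concrete, here is a minimal sketch of the parsing step, using a hypothetical Apache combined-format log line (the line and the regex are illustrative; in Cribl Stream itself the Parser function's Apache ruleset handles this for you):

```javascript
// A hypothetical Apache combined-format access log line.
const sample = '127.0.0.1 - frank [10/Oct/2023:13:55:36 -0700] ' +
  '"GET /index.html HTTP/1.0" 200 2326 "-" "Mozilla/5.0"';

// Extract the status code: the first standalone 3-digit number
// right after the quoted request string.
function parseStatus(line) {
  const m = line.match(/" (\d{3}) /);
  return m ? Number(m[1]) : null;
}

const status = parseStatus(sample);
console.log(status);                   // 200
console.log(Math.floor(status / 100)); // 2 → this event is in the 2xx class
```

The `Math.floor(status / 100)` trick is the same one the filter expressions below use to bucket events by status-code class.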
The first function we are going to explore is the Drop function. This function discards 100% of the events that match a specific filter; in this case, we are using the expression Math.floor(status/100)==2 to match any event with a 200–299 status code. Because 2xx status codes indicate success rather than error, we will drop these events, as they are of no concern to the server admin team.
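Drop's behavior can be sketched in a few lines of JavaScript, assuming events carry the numeric `status` field parsed earlier in the pipeline (the event objects here are illustrative):

```javascript
// The same filter expression used in the Drop function config.
const isSuccess = (event) => Math.floor(event.status / 100) === 2;

const events = [
  { status: 200 }, { status: 301 }, { status: 204 }, { status: 500 },
];

// Drop discards every event matching the filter; everything else
// continues down the pipeline untouched.
const kept = events.filter((e) => !isSuccess(e));
console.log(kept); // [ { status: 301 }, { status: 500 } ]
```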
The Suppress function allows you to suppress events over a time period while still letting through a defined number of copies. Since the 3xx status codes typically indicate client redirection, which isn’t a huge concern to the team, in the example below we are only going to allow one event with a 3xx status code to pass through every 30 seconds. This allows for reduction while also allowing a few events through for reporting or troubleshooting purposes.
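The suppression semantics can be sketched like this, assuming a window of 30 seconds and one allowed event per window as in the example above (the implementation itself is illustrative, not Cribl's):

```javascript
// Allow `maxAllowed` events per suppression window; drop the rest
// until the window expires.
function makeSuppressor(maxAllowed, windowMs) {
  let windowStart = 0;
  let seen = 0;
  return (nowMs) => {
    if (nowMs - windowStart >= windowMs) {
      windowStart = nowMs; // start a fresh window
      seen = 0;
    }
    seen += 1;
    return seen <= maxAllowed; // true → pass, false → suppress
  };
}

const allow = makeSuppressor(1, 30000);
console.log(allow(0));     // true  (first 3xx event passes)
console.log(allow(5000));  // false (suppressed, same 30s window)
console.log(allow(31000)); // true  (new 30s window)
```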
Cribl Stream comes equipped with two Sampling functions: Sampling and Dynamic Sampling. The Sampling function lets you configure a static sample rate, such as allowing 1 out of every 5 events to pass through while dropping the others, while Dynamic Sampling adjusts the sampling rate based on incoming data volume per sample group. You may notice that this function is similar to the Suppress function we used previously; the difference is that sampling keeps a fixed proportion of the total events coming through, whereas Suppress limits events over a time period.
Because 4xx status codes arise in cases where there is a problem with the user’s request, and not with the server, our server team wants us to bring in enough samples so that they can do any necessary troubleshooting without consuming too much storage.
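A static 1:5 sample rate can be sketched as a simple counter (an illustrative implementation, assuming only 4xx events reach this function after filtering):

```javascript
// Keep 1 out of every `rate` matching events, drop the other rate-1.
function makeSampler(rate) {
  let count = 0;
  return () => (count++ % rate) === 0; // keeps the 1st, 6th, 11th, ...
}

const keep = makeSampler(5);
const clientErrors = Array.from({ length: 10 }, (_, i) => ({ status: 404, id: i }));
const sampled = clientErrors.filter(keep);
console.log(sampled.map((e) => e.id)); // [ 0, 5 ]
```

With a rate of 5, ten 404s become two representative events, enough for the server team to troubleshoot without storing every occurrence.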
Lastly, we are going to use the Aggregate function not only to compute aggregate statistics on the 5xx status code events and send them to Splunk as metrics, but also to let those events pass through for inspection by our server team. Because this function is a bit more involved (and more powerful) than the previous functions, I'll break down below what we configured and why.
We applied the Math.floor(status/100)==5 filter expression so that the function is only applied to 5xx status code events.
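The shape of what this step produces can be sketched as follows: roll matching 5xx events up into count metrics grouped by status code, while the raw events still pass through. The metric field names here are illustrative, not Cribl's exact output schema:

```javascript
// Aggregate 5xx events into per-status-code count metrics,
// passing the original events through unchanged.
function aggregate(events) {
  const counts = {};
  for (const e of events) {
    if (Math.floor(e.status / 100) !== 5) continue; // filter: 5xx only
    counts[e.status] = (counts[e.status] || 0) + 1;
  }
  // Emit one metric event per group, alongside the originals.
  const metrics = Object.entries(counts).map(([status, count]) => ({
    metric: 'server_errors', status: Number(status), count,
  }));
  return { metrics, passthrough: events };
}

const { metrics } = aggregate([
  { status: 500 }, { status: 503 }, { status: 500 }, { status: 200 },
]);
console.log(metrics);
// [ { metric: 'server_errors', status: 500, count: 2 },
//   { metric: 'server_errors', status: 503, count: 1 } ]
```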
Using the four functions above (plus the initial Parser function), we built a quick-and-dirty five-function pipeline that reduced our Apache sample log volume by 60%, without losing any of the events our server admin team truly cares about. Apache events are not the only logs that can benefit from this pipeline; this form of reduction can help any type of log events you have flowing into your SIEM solutions. And you can do it all worry-free, knowing that Cribl Stream's replay capability lets you keep a full-fidelity copy in a low-cost destination and replay the full logs whenever needed.
The fastest way to get started with Cribl Stream and Cribl Edge is to try the Free Cloud Sandboxes.
Experience a full version of Cribl Stream and Cribl Edge in the cloud with pre-made sources and destinations.