Goal:
Confidently migrate existing applications and tooling to the cloud (or to multiple clouds) on time and under budget.
Challenge:
Reconfiguring architectures and data flows to ensure parity and visibility in the cloud (or multiple clouds), while keeping a handle on ingress and egress charges.
Example:
You are migrating a widely deployed application from an on-premises deployment to a cloud deployment, with the primary migration goals being optimized performance, reduced management overhead, and streamlined costs. This is your opportunity to address some ongoing challenges of your on-premises deployment, and to ensure parity between your old deployment and your new cloud deployment before fully switching over.
How Can Cribl Help?
By routing your data from existing sources to multiple destinations, you can verify data parity in your new cloud destinations before turning off your on-premises (or legacy) analytics, monitoring, storage, or database products and tooling. Further, Cribl can significantly reduce your costs: deploying Worker Nodes inside your cloud environment lets you compress and move data cheaply and effectively, reducing egress charges.
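The fan-out-plus-compression idea above can be sketched generically. This is not Cribl's implementation; the in-memory "destinations", event fields, and batch size are all hypothetical, purely to show why duplicating events preserves parity and why compressing batches before they leave the network cuts egress volume.

```python
import gzip
import json

# Hypothetical in-memory "destinations" standing in for a legacy tool
# and a new cloud tool; these names are illustrative, not Cribl APIs.
legacy_store = []
cloud_batches = []

def route(events):
    """Fan the same events out to both destinations so the two systems
    can be compared for parity before the legacy one is retired."""
    legacy_store.extend(events)
    # Compressing batches before transfer is what reduces egress charges;
    # log data is repetitive, so it compresses well.
    payload = "\n".join(json.dumps(e) for e in events).encode()
    cloud_batches.append(gzip.compress(payload))
    return len(payload), len(cloud_batches[-1])

batch = [{"host": f"web-{i % 4}", "level": "info", "msg": "login ok"}
         for i in range(200)]
raw_bytes, sent_bytes = route(batch)
print(f"raw={raw_bytes}B compressed={sent_bytes}B "
      f"({100 * sent_bytes // raw_bytes}% of original)")
```

The same events reach both stores, so you can diff them for parity; only the compressed copy crosses the (billed) network boundary.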
To do this, you will test and deploy several Cribl Stream technical use cases:
Before You Begin:
What You’ll Achieve:
From your existing collectors and agents, set up destinations and pipelines for your new cloud destinations. (If you need new collectors or agents, you can learn more about Cribl Edge, a vendor-neutral, small-footprint agent that lets you configure which data you want to send from the edge to your destination. Edge also provides a clean UI to ease fleet management.)
Identify the data being sent to each destination. For each type of data you will accomplish the following:
For data that requires shaping or normalization, create a pipeline or use the out-of-the-box Packs.
For data that requires reduction, create a pipeline or use the out-of-the-box Packs. (Most Packs help to reduce data volumes by up to 30%.)
Spec out each source:
For your QuickStart, we recommend no more than 2 Sources.
Where does your data need to go?
For your QuickStart, we recommend no more than 2 Destinations.
As part of the exercise to prove your use case, we recommend you limit your evaluation to 1-2 sources and 1-2 destinations (or fewer).
Note: As an alternative to setting up Sources and Destinations, you can use Cribl Packs and sample data for your evaluation. See step 9 to use Packs and the included sample data.
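The shaping and reduction steps listed above can be illustrated with a rough sketch. This is not the logic inside any Cribl Pack; the field names (`src_ip`, `raw_payload`, and so on) are hypothetical, chosen only to show what "shaping" (normalizing field names) and "reduction" (dropping low-value fields) mean for event volume.

```python
import json

# Hypothetical fields that add volume but little analytic value.
DROP_FIELDS = {"debug_id", "raw_payload", "trace_blob"}

def shape(event: dict) -> dict:
    """Drop noisy fields (reduction) and normalize a key name (shaping)."""
    out = {k: v for k, v in event.items() if k not in DROP_FIELDS}
    # Shaping: rename a vendor-specific key to a common one.
    if "src_ip" in out:
        out["source_ip"] = out.pop("src_ip")
    return out

event = {"src_ip": "10.0.0.5", "msg": "denied", "raw_payload": "x" * 500}
slim = shape(event)
before, after = len(json.dumps(event)), len(json.dumps(slim))
print(f"{before}B -> {after}B")
```

Capturing a sample data set (as described below) is what lets you see exactly this kind of before-and-after comparison against your own data.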
Another way you can get started quickly with Cribl is with QuickConnect or Routes.
Cribl QuickConnect lets you visually connect Cribl Stream Sources to Destinations using a simple drag-and-drop interface. If all you need are independent connections that link parallel Source/Destination pairs, Cribl Stream’s QuickConnect rapid visual configuration tool is a useful alternative to configuring Routes.
For maximum control, you can use Routes to filter, clone, and cascade incoming data across a related set of Pipelines and Destinations. If you simply need to get data flowing fast, use QuickConnect.
Capture a sample data set for each Sourcetype.
Capturing a sample data set allows Cribl Pipelines and Packs to validate their logic against your sample data and show a before-and-after view, proving that your Reduction and Enrichment use cases are working.
As an alternative to capturing sample data at the Source, use QuickConnect to capture a sample dataset.
In the QuickConnect UI, hover over the Destination and click Capture to capture a sample of the data flowing through the Source.
As an alternative to capturing sample data at the Source, use Routes to capture a sample dataset:
For your use cases you will:
Streamline the number of fields or volume of data you send to your analysis tool:
Modify Logs to Metrics:
Enrich data with third-party sources:
Transform data to prepare it with Common Information Model fields (for Splunk) or Elastic Common Schema (ECS for Elastic):
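Two of the use cases above, logs-to-metrics and enrichment, can be sketched generically. These are not Cribl's built-in Functions; the event fields and the `ASSET_OWNERS` lookup table are hypothetical, standing in for a real third-party enrichment source.

```python
from collections import Counter

# Hypothetical enrichment lookup, standing in for a third-party source
# such as an asset inventory or CMDB export.
ASSET_OWNERS = {"web-01": "platform-team", "db-02": "data-team"}

logs = [
    {"host": "web-01", "status": 200},
    {"host": "web-01", "status": 500},
    {"host": "db-02", "status": 200},
]

# Logs-to-metrics: collapse verbose events into a compact counter metric,
# which is far cheaper to store and query than the raw logs.
status_counts = Counter(e["status"] for e in logs)

# Enrichment: attach owner info from the lookup to each event.
enriched = [{**e, "owner": ASSET_OWNERS.get(e["host"], "unknown")}
            for e in logs]

print(dict(status_counts))
print(enriched[0]["owner"])
```

The transform-to-CIM/ECS case is the same pattern as enrichment: map and rename fields event by event until they match the target schema.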
Packs enable Cribl Stream administrators to pack up and share Pipelines and Functions across organizations, and include sample data for testing. The following Packs might be helpful:
As an alternative to Packs and the out-of-the-box Pipelines they include, you can create your own Pipeline. Pipelines are Cribl's main way to manipulate events. See Cribl Tips and Tricks for additional examples and best practices, and look for the sections labeled Try This at Home for Pipeline examples: https://docs.cribl.io/stream/usecase-lookups-regex/#try-this-at-home
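Conceptually, a Pipeline is an ordered list of Functions applied to each event in turn, where a Function can modify an event or drop it entirely. A toy sketch of that idea (not Cribl's actual Function API; the masking and filtering stages are hypothetical):

```python
def mask(event):
    # Redact a sensitive field if present, like a masking Function.
    if "password" in event:
        event["password"] = "***"
    return event

def drop_debug(event):
    # Returning None drops the event, like a filter/drop Function.
    return None if event.get("level") == "debug" else event

# A "pipeline" is just the ordered list of stages.
PIPELINE = [mask, drop_debug]

def run(events):
    out = []
    for e in events:
        for fn in PIPELINE:
            e = fn(e)
            if e is None:  # event was dropped by a stage
                break
        else:
            out.append(e)
    return out

result = run([{"level": "debug", "msg": "noise"},
              {"level": "info", "password": "hunter2"}])
print(result)
```

Reordering the list reorders the stages, which is the same mental model you apply when arranging Functions inside a Cribl Pipeline.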
Please note: If you are working with an existing data source being sent to your downstream systems, then even if you do nothing to the output from Cribl Stream, it may break existing dependencies on the original format of the data. Be sure to consult the Best Practices blog, or the users and owners of your downstream systems, before committing any data source to a Destination from within Cribl Stream.
Technical Use Cases Tested:
Also select Monitoring > Data > Pipelines and examine slicendice.
For additional examples, see:
When you’re convinced that Stream is right for you, reach out to your Cribl team and we can work with you on advanced topics like architecture, sizing, pricing, and anything else you need to get started!