Process Logs Out of the Box with Integration Pipelines
Structure and enrich ingested logs from common sources using out-of-the-box and modified Integration Pipelines.
Logs contain key details about your infrastructure and applications. Before you can start querying and analyzing data from logs, you have to extract and enrich the log details if they are not in JSON format. To do so, Datadog offers Log Pipelines and Processors. You can filter a subset of logs into a Pipeline with a unique set of Processors that perform data-structuring actions, such as parsing or remapping attributes. To help you get started, Datadog offers over 270 Integration Pipelines for common log sources like programming languages, web servers, and cloud services. In this course, you'll learn about Pipelines and Processors and how to use an Integration Pipeline from the Pipeline Library. You'll also clone and modify an Integration Pipeline.
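To illustrate the kind of data-structuring work a Processor performs, here is a minimal Python sketch. It is not Datadog's implementation; it only mimics, conceptually, what a grok-style parsing rule and an attribute remapper do to a raw web server access log. All names (`parse_access_log`, `remap_attribute`, the regex pattern) are illustrative assumptions.

```python
import re

# Illustrative only: a regex standing in for a grok parsing rule
# that extracts attributes from a common access-log format.
ACCESS_LOG_PATTERN = re.compile(
    r'(?P<client_ip>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status_code>\d+) (?P<bytes>\d+)'
)

def parse_access_log(raw: str) -> dict:
    """Extract structured attributes from a raw access-log line."""
    match = ACCESS_LOG_PATTERN.match(raw)
    return match.groupdict() if match else {}

def remap_attribute(event: dict, source: str, target: str) -> dict:
    """Rename an attribute, e.g. status_code -> http.status_code."""
    if source in event:
        event[target] = event.pop(source)
    return event

raw_log = '172.0.0.1 - - [10/Oct/2024:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'

# "Pipeline": run the log through a parser, then a remapper.
event = parse_access_log(raw_log)
event = remap_attribute(event, "status_code", "http.status_code")
print(event)
```

After these two steps, the unstructured log line has become a dictionary of named attributes (`client_ip`, `method`, `http.status_code`, and so on) that can be queried and analyzed, which is exactly the outcome Integration Pipelines provide out of the box.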
By the end of this course, you'll be able to do the following:
Developers and Datadog users who will be configuring log pipelines and settings for Datadog Log Management
To complete the course, you will need:
At the bottom of each lesson, click the MARK LESSON COMPLETE AND CONTINUE button so that each lesson is marked complete and you can receive the certificate at the end of the course.
Please note that your enrollment in this course ends after 30 days. You can re-enroll at any time and pick up where you left off.
Centralized Log Processing in Datadog
Integration Pipelines
Lab: Process Logs Out of the Box with Integration Pipelines
Processors
Working with Pipelines and Processors
Lab: Clone and Modify an Integration Pipeline
Summary
Feedback Survey