Courses
-
Investigate with LLM Observability
NEW! Investigate LLM application issues using metrics and traces. Move from high-level metrics to individual traces to identify the root cause of latency problems, silent pipeline failures, and quality issues hidden behind healthy-looking operational metrics.
Enroll for free
-
Tracing LLM Applications
Trace key operations for end-to-end visibility in multi-step pipelines. Visualize execution flows in complex LLM chains. Debug failures and understand model behavior using detailed traces, contextual annotations, and performance metrics.
Enroll for free
-
Getting Started with LLM Observability
Build observability into an LLM application. Monitor LLM performance and costs. Explore trace data with prompt inputs and response outputs. Analyze token usage and latency metrics. Identify errors and discover root causes.
Enroll for free
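The tracing idea behind these courses, recording each pipeline step as a span with latency and contextual annotations, can be sketched in plain Python. This is a minimal illustrative sketch, not the SDK used in the courses: the `span` context manager and in-memory `TRACE` list are hypothetical stand-ins for a real tracing library that exports spans to an observability backend.

```python
import time
from contextlib import contextmanager

# Hypothetical in-memory trace store; a real observability SDK would
# export these spans to a backend for visualization and analysis.
TRACE = []

@contextmanager
def span(name, **annotations):
    """Record one pipeline step as a span with latency and annotations."""
    record = {"name": name, "annotations": dict(annotations)}
    start = time.perf_counter()
    try:
        yield record
    finally:
        record["latency_s"] = time.perf_counter() - start
        TRACE.append(record)

def run_pipeline(prompt):
    """Sketch of a two-step LLM chain: retrieval, then generation."""
    with span("retrieve", prompt=prompt) as s:
        docs = ["doc-1", "doc-2"]                  # stand-in for a retriever call
        s["annotations"]["n_docs"] = len(docs)
    with span("generate", prompt=prompt) as s:
        output = f"answer grounded in {len(docs)} docs"  # stand-in for an LLM call
        s["annotations"]["output"] = output
    return output

run_pipeline("What is LLM observability?")
for rec in TRACE:
    print(rec["name"], f'{rec["latency_s"]:.6f}s')
```

Inspecting the collected spans shows the execution flow of the chain and where latency accumulates, which is the same move the courses teach: drill from an aggregate metric down to the individual step that caused the problem.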