Courses
-
Tracing LLM Applications
NEW! Trace key operations for end-to-end visibility in multi-step pipelines. Visualize execution flows in complex LLM chains. Debug failures and understand model behavior using detailed traces, contextual annotations, and performance metrics (a minimal tracing sketch follows below).
Enroll for free
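As an illustration only, here is a minimal sketch of what tracing a key operation in a multi-step pipeline can look like. It uses the open-source OpenTelemetry API (requires the `opentelemetry-api` package) rather than the course's own tooling, and the `call_model` helper, span names, and attribute keys are hypothetical placeholders; exporter and provider setup is omitted.

```python
# Minimal sketch: tracing one step of an LLM pipeline with OpenTelemetry.
# NOTE: `call_model` is a hypothetical stand-in; exporter/provider setup is omitted,
# so spans are no-ops until an SDK TracerProvider is configured.
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def call_model(prompt: str) -> str:
    # Hypothetical placeholder for a real LLM call.
    return f"echo: {prompt}"

def summarize(document: str) -> str:
    # Each key operation becomes a span; nesting the LLM call inside the
    # pipeline span gives end-to-end visibility into the execution flow.
    with tracer.start_as_current_span("pipeline.summarize") as span:
        span.set_attribute("input.length", len(document))  # contextual annotation
        prompt = f"Summarize:\n{document}"
        with tracer.start_as_current_span("llm.generate") as llm_span:
            llm_span.set_attribute("llm.prompt", prompt)
            output = call_model(prompt)
            llm_span.set_attribute("llm.response", output)
        span.set_attribute("output.length", len(output))
        return output

if __name__ == "__main__":
    print(summarize("Observability lets you see inside multi-step LLM chains."))
```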
-
Getting Started with LLM Observability
Build observability into an LLM application. Monitor LLM performance and costs. Explore trace data with prompt inputs and response outputs. Analyze token usage and latency metrics. Identify errors and discover root causes (see the instrumentation sketch below).
Enroll for free
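As a companion sketch, again assuming generic OpenTelemetry-style instrumentation rather than any particular platform SDK, recording latency, token usage, and errors on a span might look like the following. The `chat` helper and its token counts are hypothetical placeholders.

```python
# Minimal sketch: capturing latency, token usage, and errors for one LLM call.
# NOTE: `chat` and its usage fields are hypothetical; exporter/provider setup is omitted.
import time
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

def chat(prompt: str) -> dict:
    # Hypothetical stand-in returning text plus usage counts, as many LLM APIs do.
    return {"text": "ok", "prompt_tokens": len(prompt.split()), "completion_tokens": 1}

def traced_chat(prompt: str) -> str:
    with tracer.start_as_current_span("llm.chat") as span:
        span.set_attribute("llm.prompt", prompt)
        start = time.perf_counter()
        try:
            result = chat(prompt)
        except Exception as exc:
            # Record failures on the span so root causes show up in the trace.
            span.record_exception(exc)
            span.set_status(trace.Status(trace.StatusCode.ERROR))
            raise
        # Latency and token usage become span attributes for later analysis.
        span.set_attribute("llm.latency_ms", (time.perf_counter() - start) * 1000)
        span.set_attribute("llm.prompt_tokens", result["prompt_tokens"])
        span.set_attribute("llm.completion_tokens", result["completion_tokens"])
        span.set_attribute("llm.response", result["text"])
        return result["text"]

if __name__ == "__main__":
    print(traced_chat("How do I monitor token usage and latency?"))
```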