LangGraph (Agentic Tracing)
Introduction
This document provides a step-by-step guide to integrating RagaAI Catalyst with LangGraph to enhance observability, tracing, and instrumentation in your LangGraph applications. By leveraging RagaAI Catalyst, you can gain comprehensive insights into the execution of LangGraph components, including LangChain tools, custom methods, and more. This guide walks you through setup, tracing, and evaluation using a sample LangGraph application.
RagaAI introduces seamless tracing for key LangGraph components, including:
LangChain Tools: Automatically track the execution of LangChain tools within your LangGraph workflows.
Methods Annotated as LangGraph Tools (@tool): Gain visibility into methods specifically designed as LangGraph tools and marked with the @tool annotation.
Custom Methods with @trace_tool: Utilize the new @trace_tool annotation to instrument and monitor any method within your LangGraph application, offering flexible and granular tracing.
Sample Code
Refer to the provided Google Colab notebook for the complete implementation.
Prerequisites
API Keys: Access keys for RagaAI Catalyst, OpenAI, Anthropic, and Tavily (or other tool providers your application depends on).
Dependencies: Install the required libraries using the following command:
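A typical installation command might look like the following. The exact package list is an assumption based on the prerequisites above; adjust it to match the providers your application actually uses.

```shell
# Core SDK plus the LangGraph/LangChain stack and provider clients used in this guide
pip install ragaai-catalyst langgraph langchain langchain-openai langchain-anthropic tavily-python
```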
Step 1: Setting Up RagaAI Catalyst
1.1 Initialize RagaAI Catalyst
To begin, initialize the RagaAI Catalyst client with your access and secret keys:
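A minimal initialization sketch is shown below. The environment variable names and the `base_url` value are placeholders; substitute the credentials and endpoint for your own Catalyst deployment.

```python
import os

from ragaai_catalyst import RagaAICatalyst

# Keys are read from the environment; the variable names here are examples.
catalyst = RagaAICatalyst(
    access_key=os.getenv("RAGAAI_ACCESS_KEY"),
    secret_key=os.getenv("RAGAAI_SECRET_KEY"),
    base_url="https://catalyst.raga.ai/api",  # replace with your Catalyst endpoint
)
```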
1.2 Initialize the Tracer
Create a Tracer object to define the project and dataset for storing traces:
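A sketch of the tracer setup is below. The project and dataset names are hypothetical; they should match a project that already exists in your Catalyst workspace.

```python
from ragaai_catalyst import Tracer

tracer = Tracer(
    project_name="langgraph-demo",      # hypothetical project name
    dataset_name="agentic-traces",      # hypothetical dataset name
    tracer_type="agentic/langgraph",    # selects the LangGraph agentic tracer
)
```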
1.3 Initialize Tracing
Enable tracing by initializing it with the catalyst and tracer objects:
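Assuming the `catalyst` and `tracer` objects from the previous steps, the wiring is a single call:

```python
from ragaai_catalyst import init_tracing

# Connects the tracer to the Catalyst backend so traces are captured and uploaded
init_tracing(catalyst=catalyst, tracer=tracer)
```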
Step 2: Instrumenting LangGraph Components
RagaAI Catalyst provides decorators to trace specific components of your LangGraph application. Below are the key decorators:
@trace_tool: Traces custom methods or LangGraph tools.
@trace_llm: Traces LLM calls.
@trace_agent: Traces agent executions.
@trace_custom: Traces custom components.
2.1 Tracing LangGraph Tools
To trace a LangGraph tool, use the @trace_tool decorator. For example:
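A sketch of the decorator in use is shown below. The tool body and the decorator's name argument are illustrative assumptions; consult the Colab notebook for the exact signature used in the sample application.

```python
from ragaai_catalyst import trace_tool

@trace_tool("get_weather")  # the trace will appear under this name in Catalyst
def get_weather(city: str) -> str:
    """Hypothetical tool -- replace with your real implementation."""
    return f"Weather for {city}: sunny, 22 degrees"
```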
For the complete code, refer to the Colab notebook.
Step 3: Running the LangGraph Application
3.1 Define the LangGraph Workflow
Create a LangGraph workflow by defining nodes and edges. For example:
Step 4: Viewing and Analyzing Traces
4.1 View Traces in RagaAI Catalyst
Once the traces are uploaded, navigate to your dataset in the RagaAI Catalyst dashboard to view individual traces.
4.2 Run Evaluations and Metrics
Navigate to your dataset and click on the Evaluate button.
Choose from pre-configured metrics such as Hallucination, Cosine Similarity, Honesty, or Toxicity.
Configure the metric by selecting the evaluation type (e.g., LLM, Agent, or Tool) and defining the schema.
Run the metric and analyze the results for insights into your application's performance.
4.3 Compare Traces
Within the dataset, click on the Compare button.
Select up to 3 datapoints (traces) to compare.
View the diff view, which highlights differences in code and attributes between traces.
4.4 Compare Experiments
In the Dataset view, select Compare Datasets.
Choose up to 3 experiments for comparison.
Analyze the comparison graphs to evaluate performance across experiments.
Conclusion
By integrating RagaAI Catalyst with LangGraph, you can achieve enhanced observability and traceability in your LangGraph applications. This documentation provides a comprehensive guide to setting up, instrumenting, and analyzing your application using RagaAI Catalyst.
For any queries or support, contact the RagaAI Catalyst team at support@ragaai.com.