# Langgraph (Agentic Tracing)

### Introduction

This document provides a step-by-step guide on how to integrate **RagaAI Catalyst** with **LangGraph** to enhance observability, tracing, and instrumentation in your LangGraph applications. By leveraging RagaAI Catalyst, you can gain comprehensive insights into the execution of LangGraph components, including Langchain tools, custom methods, and more. This guide will walk you through the setup, tracing, and evaluation process using a sample LangGraph application.

RagaAI introduces seamless tracing for key LangGraph components, including:

* **Langchain Tools**: Automatically track the execution of Langchain tools within your LangGraph workflows.
* **Methods Annotated as LangGraph Tools (`@tool`)**: Gain visibility into methods specifically designed as LangGraph tools and marked with the `@tool` annotation.
* **Custom Methods with `@trace_tool`**: Use the `@trace_tool` annotation to instrument and monitor any method within your LangGraph application, offering flexible and granular tracing.

***

### Sample Code

Refer to the provided [Google Colab notebook](https://colab.research.google.com/drive/1JJoUk8BJPZVQXDWWXrb4xXjYsBoQXhtC?usp=sharing#scrollTo=XtBWneLADyuk) for the complete implementation.

***

### Prerequisites

1. **API Keys**: Access keys for RagaAI Catalyst, OpenAI, Anthropic, and Tavily (or whichever providers your application depends on).
2. **Dependencies**: Install the required libraries using the following command:

```bash
pip install -U ragaai-catalyst
```

***

### Step 1: Setting Up RagaAI Catalyst

#### 1.1 Initialize RagaAI Catalyst

To begin, initialize the RagaAI Catalyst client with your access and secret keys:

```python
from ragaai_catalyst import RagaAICatalyst

catalyst = RagaAICatalyst(
    access_key="access_key",
    secret_key="secret_key"
)
```
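To avoid hard-coding credentials, the keys can be read from environment variables instead. The variable names below (`CATALYST_ACCESS_KEY`, `CATALYST_SECRET_KEY`) are illustrative; use whatever names fit your setup:

```python
import os

# Illustrative environment variable names -- set them to your actual keys.
access_key = os.getenv("CATALYST_ACCESS_KEY", "access_key")
secret_key = os.getenv("CATALYST_SECRET_KEY", "secret_key")

# Then pass them to the client as shown above:
# catalyst = RagaAICatalyst(access_key=access_key, secret_key=secret_key)
```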

#### 1.2 Initialize the Tracer

Create a `Tracer` object to define the project and dataset for storing traces:

```python
from ragaai_catalyst import init_tracing
from ragaai_catalyst.tracers import Tracer

tracer = Tracer(
    project_name="Langgraph_testing",
    dataset_name="customer_support1",
    tracer_type="agentic/langgraph",
)
init_tracing(catalyst=catalyst, tracer=tracer)
```

***

### Step 2: Instrumenting LangGraph Components

RagaAI Catalyst provides decorators to trace specific components of your LangGraph application. Below are the key decorators:

1. **`@trace_tool`**: Traces custom methods or LangGraph tools.
2. **`@trace_llm`**: Traces LLM calls.
3. **`@trace_agent`**: Traces agent executions.
4. **`@trace_custom`**: Traces custom components.
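Conceptually, each of these decorators wraps the target callable and records a span (name, kind, timing) into the active trace every time it is called. The following is a minimal, library-free sketch of that pattern for illustration only; it is not RagaAI Catalyst's actual implementation:

```python
import functools
import time

RECORDED_SPANS = []  # stand-in for the tracer's span store

def trace_component(kind: str, name: str):
    """Illustrative decorator: record a span for each call of the wrapped function."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = func(*args, **kwargs)
            RECORDED_SPANS.append({
                "kind": kind,  # e.g. "tool", "llm", "agent", "custom"
                "name": name,
                "duration_s": time.perf_counter() - start,
            })
            return result
        return wrapper
    return decorator

@trace_component("tool", "add")
def add(a: int, b: int) -> int:
    return a + b

add(2, 3)  # the call itself behaves normally; a span is recorded as a side effect
```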

#### 2.1 Tracing LangGraph Tools

To trace a LangGraph tool, apply the `@trace_tool` decorator. For the complete code, refer to the [Colab notebook](https://colab.research.google.com/drive/1JJoUk8BJPZVQXDWWXrb4xXjYsBoQXhtC?usp=sharing#scrollTo=xVBCbSbf8ejK). For example:

```python
from langchain_core.tools import tool  # Langchain/LangGraph tool decorator
from ragaai_catalyst import trace_tool

@tool
@trace_tool("lookup_policy")
def lookup_policy(query: str) -> str:
    """Consult the company policies to check whether certain options are permitted."""
    # `retriever` is a document retriever initialized earlier in the notebook.
    docs = retriever.query(query, k=2)
    return "\n\n".join([doc["page_content"] for doc in docs])
```

***

### Step 3: Running the LangGraph Application

#### 3.1 Define the LangGraph Workflow

Create a LangGraph workflow by defining nodes and edges. For example:

```python
from langgraph.graph import StateGraph, START, END
from langgraph.prebuilt import tools_condition

builder = StateGraph(State)
builder.add_node("assistant", Assistant(part_1_assistant_runnable))
builder.add_node("tools", create_tool_node_with_fallback(part_1_tools))
builder.add_edge(START, "assistant")
builder.add_conditional_edges("assistant", tools_condition)
builder.add_edge("tools", "assistant")
graph = builder.compile()
```
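The workflow above alternates between the assistant node and the tools node until the conditional edge routes to the end. As a rough, library-free sketch of that control flow (illustrative only; the node logic is made up, and LangGraph performs this routing internally):

```python
# Minimal illustration of the assistant <-> tools loop defined above.

def assistant(state: dict) -> dict:
    # Pretend the assistant requests a tool once, then finishes.
    state["steps"].append("assistant")
    state["needs_tool"] = not state.get("tool_done", False)
    return state

def tools(state: dict) -> dict:
    state["steps"].append("tools")
    state["tool_done"] = True
    return state

def tools_condition(state: dict) -> str:
    # Conditional edge: route to "tools" while a tool call is pending, else end.
    return "tools" if state["needs_tool"] else "END"

def run(state: dict) -> dict:
    while True:                      # edge: START -> "assistant"
        state = assistant(state)
        if tools_condition(state) == "END":
            return state
        state = tools(state)         # edge: "tools" -> "assistant"

result = run({"steps": []})  # visits assistant, tools, then assistant again
```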

***

### Step 4: Viewing and Analyzing Traces

#### 4.1 View Traces in RagaAI Catalyst

Once the traces are uploaded, navigate to your dataset in the RagaAI Catalyst dashboard to view individual traces.

<figure><img src="https://1811327582-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FYbIiNdp1QbG4avl7VShw%2Fuploads%2FRfHZLf1Wt2hDxm33B3rw%2FUntitled%20design.gif?alt=media&#x26;token=21442112-c209-4f69-993c-227b74f80a95" alt=""><figcaption></figcaption></figure>

#### 4.2 Run Evaluations and Metrics

1. Navigate to your dataset and click on the **Evaluate** button.
2. Choose from pre-configured metrics such as **Hallucination**, **Cosine Similarity**, **Honesty**, or **Toxicity**.
3. Configure the metric by selecting the evaluation type (e.g., LLM, Agent, or Tool) and defining the schema.
4. Run the metric and analyze the results for insights into your application's performance.

#### 4.3 Compare Traces

1. Within the dataset, click on the **Compare** button.
2. Select up to 3 datapoints (traces) to compare.
3. Use the diff view, which highlights differences in code and attributes between the selected traces.

#### 4.4 Compare Experiments

1. In the Dataset view, select **Compare Datasets**.
2. Choose up to 3 experiments for comparison.
3. Analyze your graphs:

<figure><img src="https://1811327582-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FYbIiNdp1QbG4avl7VShw%2Fuploads%2FauILBxqQMMn0YyrlWyzP%2FUntitled%20design%20(1).gif?alt=media&#x26;token=8d2572d6-2686-431b-ad36-419133f95560" alt=""><figcaption></figcaption></figure>

***

### Conclusion

By integrating RagaAI Catalyst with LangGraph, you can achieve enhanced observability and traceability in your LangGraph applications. This documentation provides a comprehensive guide to setting up, instrumenting, and analyzing your application using RagaAI Catalyst.

***

For any queries or support, contact the RagaAI Catalyst team at <support@ragaai.com>.
