LangChain Integration

TraceAgent provides a native callback handler for LangChain that automatically captures chain steps, LLM calls, tool usage, and retriever activity.

Installation

pip install trace-agent-sdk trace-agent-langchain

Basic Usage

from trace_agent_sdk import TraceAgentClient
from trace_agent_langchain import TraceAgentLangChainCallback

client = TraceAgentClient("http://localhost:8000")
run = client.start_run("langchain-agent", "Answer user question")

callback = TraceAgentLangChainCallback(run)
# `agent` is any LangChain runnable (e.g. an AgentExecutor) built elsewhere
agent.invoke({"input": "What's the weather?"}, callbacks=[callback])

run.finish({"status": "success"})
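If the agent raises, the `run.finish(...)` call above is skipped and the run stays open. One way to guarantee the run is always closed is to wrap the start/finish pair in a context manager. This is a sketch, not part of the SDK: `traced_run` is a hypothetical helper built on the `start_run`/`finish` calls shown above.

```python
from contextlib import contextmanager

@contextmanager
def traced_run(client, agent_name, goal):
    # Hypothetical helper: ensures run.finish() is always called,
    # recording an error status if the agent raises mid-run.
    run = client.start_run(agent_name, goal)
    try:
        yield run
        run.finish({"status": "success"})
    except Exception:
        run.finish({"status": "error"})
        raise
```

With this helper, the basic example becomes a `with traced_run(client, "langchain-agent", "Answer user question") as run:` block around the `agent.invoke(...)` call.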

Captured Events

| Event             | Captured Data                        |
| ----------------- | ------------------------------------ |
| Chain start/end   | Chain type, input, output, duration  |
| LLM call/response | Prompt, response, token counts       |
| Tool use/result   | Tool name, parameters, output        |
| Retriever query   | Query, retrieved documents           |
| Error             | Error type, message, traceback       |
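As a rough illustration only (the field names below are assumptions, not TraceAgent's actual wire format), a captured LLM call/response event might carry data like:

```python
# Illustrative event payload -- field names are assumptions, not the real schema.
llm_event = {
    "type": "llm_call",
    "prompt": "What's the weather?",
    "response": "It's sunny in San Francisco.",
    "token_counts": {"prompt": 12, "completion": 9},
    "duration_ms": 850,
}
```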

With Local Models (LM Studio)

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="http://localhost:1234/v1",
    api_key="not-needed",
    model="local-model",
)

callback = TraceAgentLangChainCallback(run)
agent.invoke({"input": "..."}, callbacks=[callback])
Tip: Set the OPENAI_BASE_URL environment variable to avoid hardcoding the endpoint.
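For example, the endpoint and key can be set in the environment instead of in code. The values below assume the local LM Studio server from the example above:

```python
import os

# Configure the OpenAI-compatible endpoint via environment variables
# instead of hardcoding it in the ChatOpenAI constructor.
os.environ["OPENAI_BASE_URL"] = "http://localhost:1234/v1"
os.environ["OPENAI_API_KEY"] = "not-needed"

# from langchain_openai import ChatOpenAI
# llm = ChatOpenAI(model="local-model")  # picks up the endpoint from the environment
```

In practice you would export these in your shell or a `.env` file rather than setting them in Python.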

Environment Variables

| Variable             | Description                                  |
| -------------------- | -------------------------------------------- |
| TRACE_AGENT_BASE_URL | Server URL for the callback handler          |
| OPENAI_BASE_URL      | OpenAI-compatible endpoint for local models  |
| OPENAI_API_KEY       | API key (use any value for local models)     |
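For instance, the server URL can be read from TRACE_AGENT_BASE_URL with a local fallback. This is a sketch: `resolve_base_url` is an illustrative helper, not part of the SDK.

```python
import os

def resolve_base_url(env=None, default="http://localhost:8000"):
    # Read TRACE_AGENT_BASE_URL, falling back to the local default
    # used in the Basic Usage example.
    env = os.environ if env is None else env
    return env.get("TRACE_AGENT_BASE_URL", default)

# client = TraceAgentClient(resolve_base_url())  # pass the resolved URL
```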
