LangChain Integration

The AgentLensCallbackHandler plugs into LangChain's callback system to automatically trace chains, agents, LLM calls, and tool invocations without changing your existing code.

Installation

terminal
pip install vectry-agentlens langchain langchain-openai

Quick setup

main.py
import agentlens
from agentlens.integrations.langchain import AgentLensCallbackHandler

agentlens.init(
    api_key="your-api-key",
    endpoint="https://agentlens.vectry.tech",
)

handler = AgentLensCallbackHandler()

Using with chains

Pass the handler in the callbacks config:

chain_example.py
from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

llm = ChatOpenAI(model="gpt-4o")
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{input}"),
])

chain = prompt | llm | StrOutputParser()

result = chain.invoke(
    {"input": "Explain recursion"},
    config={"callbacks": [handler]},
)

Using with agents

agent_example.py
from langchain_openai import ChatOpenAI
from langchain.agents import AgentExecutor, create_tool_calling_agent
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool

@tool
def calculator(expression: str) -> str:
    """Evaluate a math expression."""
    return str(eval(expression))  # NOTE: eval is unsafe outside demos

llm = ChatOpenAI(model="gpt-4o")
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful math assistant."),
    ("user", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = create_tool_calling_agent(llm, [calculator], prompt)
executor = AgentExecutor(agent=agent, tools=[calculator])

result = executor.invoke(
    {"input": "What is 42 * 17 + 3?"},
    config={"callbacks": [handler]},
)
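The `calculator` tool above uses `eval` for brevity, which executes arbitrary Python. A safer sketch of the same tool body, using only the standard library's `ast` module to allow nothing but numeric literals and basic arithmetic (the function name `safe_eval` is our own, not part of AgentLens or LangChain):

```python
import ast
import operator

# Whitelist of allowed arithmetic operators.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.USub: operator.neg,
}

def safe_eval(expression: str) -> float:
    """Evaluate a math expression without running arbitrary code."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError(f"Unsupported expression: {expression!r}")
    return _eval(ast.parse(expression, mode="eval"))

result = safe_eval("42 * 17 + 3")  # → 717
```

Dropping this body into the `@tool` function gives the agent the same capability without the `eval` risk.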

What gets captured

The callback handler maps LangChain events to AgentLens concepts:

| LangChain Event | AgentLens Type | Captured Data |
| --- | --- | --- |
| Chain start/end | CHAIN span | Input/output, duration |
| LLM start/end | LLM_CALL span | Model, messages, tokens, cost, duration |
| Tool start/end | TOOL_CALL span | Tool name, input args, output, duration |
| Agent action | TOOL_SELECTION decision | Selected tool, reasoning |
| Retry | RETRY event | Error message, attempt count |
| Error | ERROR event | Exception type, message, traceback |
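Conceptually, this mapping amounts to pairing each `*_start` event with its matching `*_end` event into a timed span. A minimal stdlib sketch of that pairing (the handler's real internals are more involved; all names here are illustrative):

```python
import time

class SpanRecorder:
    """Illustrative sketch: pair start/end events into timed spans."""
    EVENT_TO_TYPE = {
        "chain": "CHAIN",
        "llm": "LLM_CALL",
        "tool": "TOOL_CALL",
    }

    def __init__(self):
        self.spans = []
        self._open = {}  # run_id -> in-flight span

    def on_start(self, kind, run_id, payload):
        self._open[run_id] = {
            "type": self.EVENT_TO_TYPE[kind],
            "input": payload,
            "started_at": time.monotonic(),
        }

    def on_end(self, run_id, payload):
        span = self._open.pop(run_id)
        span["output"] = payload
        span["duration_s"] = time.monotonic() - span["started_at"]
        self.spans.append(span)

recorder = SpanRecorder()
recorder.on_start("llm", "run-1", {"messages": ["Explain recursion"]})
recorder.on_end("run-1", {"text": "Recursion is..."})
```

LangChain hands each callback a `run_id`, which is what lets the handler match an end event to the span it opened.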

Global callbacks

To trace all LangChain operations without passing callbacks individually, set the handler globally:

global.py
from langchain.callbacks.manager import set_handler

set_handler(handler)

# Now all chains and agents are traced automatically
result = chain.invoke({"input": "Hello"})
# No need to pass config={"callbacks": [handler]}

Handler options

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| trace_name | str \| None | None | Override the default trace name |
| tags | list[str] | [] | Tags to attach to all traces |
| capture_io | bool | True | Capture input/output payloads |

options.py
handler = AgentLensCallbackHandler(
    trace_name="my-langchain-app",
    tags=["production", "langchain"],
    capture_io=True,
)