Log, Trace, and Monitor
When building apps or agents with LangChain, a single user request often triggers multiple API calls. By default, these calls are not linked to one another, which makes it hard to analyze them together. With Portkey, all the embeddings, completions, and other requests stemming from a single user request are logged and traced under a common ID, giving you full visibility into user interactions.
This notebook is a step-by-step guide to logging, tracing, and monitoring LangChain LLM calls with Portkey.
First, let's import Portkey, OpenAI, and the agent tools:
import os
from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain_openai import ChatOpenAI
from portkey_ai import PORTKEY_GATEWAY_URL, createHeaders
Paste your OpenAI API key below. (You can find it in your OpenAI dashboard.)
os.environ["OPENAI_API_KEY"] = "..."
Get Portkey API Key
- Sign up for Portkey here
- On your dashboard, click on the profile icon on the bottom left, then click on "Copy API Key"
- Paste it below
PORTKEY_API_KEY = "..." # Paste your Portkey API Key here
Set Trace ID
- Set the Trace ID for your request below
- The same Trace ID can be shared by all API calls originating from a single user request
TRACE_ID = "uuid-trace-id" # Set trace id here