Interface

In an effort to make it as easy as possible to create custom chains, we've implemented a "Runnable" protocol that most components implement. This is a standard interface with a few different methods, which makes it easy to define custom chains and to invoke them in a standard way. The standard interface includes:

  • stream: stream back chunks of the response
  • invoke: call the chain on an input
  • batch: call the chain on a list of inputs
  • streamLog: stream back intermediate steps as they happen, in addition to the final response
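
Conceptually, the shared surface looks roughly like the following simplified TypeScript sketch; the real Runnable class in @langchain/core exposes more methods, options, and overloads than shown here:

// Simplified sketch only, not the actual class definition
interface RunnableSketch<RunInput, RunOutput> {
  invoke(input: RunInput): Promise<RunOutput>;
  stream(input: RunInput): Promise<AsyncIterable<RunOutput>>;
  batch(inputs: RunInput[]): Promise<RunOutput[]>;
  // Yields JSONPatch operations describing intermediate steps
  streamLog(input: RunInput): AsyncGenerator<unknown>;
}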

The input type varies by component:

Component         Input Type
Prompt            Object
Retriever         Single string
LLM, ChatModel    Single string, list of chat messages or PromptValue
Tool              Single string, or object, depending on the tool
OutputParser      The output of an LLM or ChatModel

The output type also varies by component:

Component         Output Type
LLM               String
ChatModel         ChatMessage
Prompt            PromptValue
Retriever         List of documents
Tool              Depends on the tool
OutputParser      Depends on the parser
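
As a quick illustration of these input and output types, each component can also be invoked on its own. Here's a minimal sketch, assuming an OpenAI chat model and the built-in StringOutputParser; other components follow the same pattern:

import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { StringOutputParser } from "@langchain/core/output_parsers";

const prompt = PromptTemplate.fromTemplate("Tell me a joke about {topic}");
const model = new ChatOpenAI({});
const parser = new StringOutputParser();

// Prompt: takes an object, produces a PromptValue
const promptValue = await prompt.invoke({ topic: "bears" });

// ChatModel: takes a PromptValue (or string, or messages), produces a chat message
const message = await model.invoke(promptValue);

// OutputParser: takes the model output, produces the parsed result (a string here)
const text = await parser.invoke(message);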

You can combine runnables (and runnable-like objects such as functions and objects whose values are all functions) into sequences in two ways:

  • Call the .pipe instance method, which takes another runnable-like as an argument
  • Use the RunnableSequence.from([]) static method with an array of runnable-likes, which will run in sequence when invoked

See below for examples of how this looks.
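
Plain functions (and objects of functions) are wrapped as runnables automatically when composed this way. Here's a minimal sketch of that coercion, where the uppercasing step is purely illustrative:

import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { RunnableSequence } from "@langchain/core/runnables";

const model = new ChatOpenAI({});
const prompt = PromptTemplate.fromTemplate("Tell me a joke about {topic}");

// The plain function below is coerced into a runnable when the sequence is built
const chain = RunnableSequence.from([
  (input: { topic: string }) => ({ topic: input.topic.toUpperCase() }),
  prompt,
  model,
]);

const result = await chain.invoke({ topic: "bears" });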

Stream

npm install @langchain/openai

import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";

const model = new ChatOpenAI({});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);

const chain = promptTemplate.pipe(model);

const stream = await chain.stream({ topic: "bears" });

// Each chunk has the same interface as a chat message
for await (const chunk of stream) {
  console.log(chunk?.content);
}

/*
Why don't bears wear shoes?

Because they have bear feet!
*/
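
Each streamed chunk here is a chat message chunk. If you also want the full message once the stream ends, one option is to merge the chunks as they arrive, since message chunks support .concat(). A minimal sketch reusing the chain above:

let finalMessage;
for await (const chunk of await chain.stream({ topic: "bears" })) {
  // Merge each incoming chunk into the running message
  finalMessage = finalMessage === undefined ? chunk : finalMessage.concat(chunk);
}
console.log(finalMessage?.content);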

Invoke

import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { RunnableSequence } from "@langchain/core/runnables";

const model = new ChatOpenAI({});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);

// You can also create a chain using an array of runnables
const chain = RunnableSequence.from([promptTemplate, model]);

const result = await chain.invoke({ topic: "bears" });

console.log(result);
/*
AIMessage {
  content: "Why don't bears wear shoes?\n\nBecause they have bear feet!",
}
*/

Batch

import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";

const model = new ChatOpenAI({});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);

const chain = promptTemplate.pipe(model);

const result = await chain.batch([{ topic: "bears" }, { topic: "cats" }]);

console.log(result);
/*
[
  AIMessage {
    content: "Why don't bears wear shoes?\n\nBecause they have bear feet!",
  },
  AIMessage {
    content: "Why don't cats play poker in the wild?\n\nToo many cheetahs!"
  }
]
*/

You can also pass a batchOptions argument to the call. There are options to set the maximum concurrency and to return exceptions instead of throwing them (useful for gracefully handling failures!):

import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";

const model = new ChatOpenAI({
  modelName: "badmodel",
});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);

const chain = promptTemplate.pipe(model);

const result = await chain.batch(
  [{ topic: "bears" }, { topic: "cats" }],
  {},
  { returnExceptions: true, maxConcurrency: 1 }
);

console.log(result);
/*
[
NotFoundError: The model `badmodel` does not exist
at Function.generate (/Users/jacoblee/langchain/langchainjs/node_modules/openai/src/error.ts:71:6)
at OpenAI.makeStatusError (/Users/jacoblee/langchain/langchainjs/node_modules/openai/src/core.ts:381:13)
at OpenAI.makeRequest (/Users/jacoblee/langchain/langchainjs/node_modules/openai/src/core.ts:442:15)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async file:///Users/jacoblee/langchain/langchainjs/langchain/dist/chat_models/openai.js:514:29
at RetryOperation._fn (/Users/jacoblee/langchain/langchainjs/node_modules/p-retry/index.js:50:12) {
status: 404,
NotFoundError: The model `badmodel` does not exist
at Function.generate (/Users/jacoblee/langchain/langchainjs/node_modules/openai/src/error.ts:71:6)
at OpenAI.makeStatusError (/Users/jacoblee/langchain/langchainjs/node_modules/openai/src/core.ts:381:13)
at OpenAI.makeRequest (/Users/jacoblee/langchain/langchainjs/node_modules/openai/src/core.ts:442:15)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
at async file:///Users/jacoblee/langchain/langchainjs/langchain/dist/chat_models/openai.js:514:29
at RetryOperation._fn (/Users/jacoblee/langchain/langchainjs/node_modules/p-retry/index.js:50:12) {
status: 404,
]
*/
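
With returnExceptions: true, failed inputs come back as Error instances in the result array alongside any successful outputs, so you can handle each entry individually rather than wrapping the whole batch call in a try/catch. A minimal sketch:

for (const entry of result) {
  if (entry instanceof Error) {
    // Entries for failed inputs are the thrown errors themselves
    console.log("Failed:", entry.message);
  } else {
    // Successful entries are normal chain outputs (AIMessages here)
    console.log(entry.content);
  }
}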

Stream log

All runnables also have a method called .streamLog(), which streams back all or part of the intermediate steps of your chain/sequence as they happen.

This is useful for showing progress to the user, using intermediate results, or debugging your chain. You can stream all steps (the default) or include/exclude steps by name, tags, or metadata.

This method yields JSONPatch ops that, when applied in the same order as they are received, build up the RunState. A sketch of applying them follows the example output below.

Here's an example with streaming intermediate documents from a retrieval chain:

import { HNSWLib } from "@langchain/community/vectorstores/hnswlib";
import { ChatOpenAI, OpenAIEmbeddings } from "@langchain/openai";
import { formatDocumentsAsString } from "langchain/util/document";
import { StringOutputParser } from "@langchain/core/output_parsers";
import {
  RunnablePassthrough,
  RunnableSequence,
} from "@langchain/core/runnables";
import {
  ChatPromptTemplate,
  HumanMessagePromptTemplate,
  SystemMessagePromptTemplate,
} from "@langchain/core/prompts";

// Initialize the LLM to use to answer the question.
const model = new ChatOpenAI({});

const vectorStore = await HNSWLib.fromTexts(
  [
    "mitochondria is the powerhouse of the cell",
    "mitochondria is made of lipids",
  ],
  [{ id: 1 }, { id: 2 }],
  new OpenAIEmbeddings()
);

// Initialize a retriever wrapper around the vector store
const vectorStoreRetriever = vectorStore.asRetriever();

// Create a system & human prompt for the chat model
const SYSTEM_TEMPLATE = `Use the following pieces of context to answer the question at the end.
If you don't know the answer, just say that you don't know, don't try to make up an answer.
----------------
{context}`;
const messages = [
  SystemMessagePromptTemplate.fromTemplate(SYSTEM_TEMPLATE),
  HumanMessagePromptTemplate.fromTemplate("{question}"),
];
const prompt = ChatPromptTemplate.fromMessages(messages);

const chain = RunnableSequence.from([
  {
    context: vectorStoreRetriever.pipe(formatDocumentsAsString),
    question: new RunnablePassthrough(),
  },
  prompt,
  model,
  new StringOutputParser(),
]);

const stream = await chain.streamLog("What is the powerhouse of the cell?");

for await (const chunk of stream) {
  console.log(JSON.stringify(chunk));
  console.log();
}

/*
{"ops":[{"op":"replace","path":"","value":{"id":"5a79d2e7-171a-4034-9faa-63af88e5a451","streamed_output":[],"logs":{}}}]}

{"ops":[{"op":"add","path":"/logs/RunnableMap","value":{"id":"5948dd9f-b827-45f8-9fa6-74e5cc972a56","name":"RunnableMap","type":"chain","tags":["seq:step:1"],"metadata":{},"start_time":"2023-12-23T00:20:46.664Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/RunnableSequence","value":{"id":"e9e9ef5e-3a04-4110-9a24-517c929b9137","name":"RunnableSequence","type":"chain","tags":["context"],"metadata":{},"start_time":"2023-12-23T00:20:46.804Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/RunnablePassthrough","value":{"id":"4c79d835-87e5-4ff8-b560-987aea83c0e4","name":"RunnablePassthrough","type":"chain","tags":["question"],"metadata":{},"start_time":"2023-12-23T00:20:46.805Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/RunnablePassthrough/final_output","value":{"output":"What is the powerhouse of the cell?"}},{"op":"add","path":"/logs/RunnablePassthrough/end_time","value":"2023-12-23T00:20:46.947Z"}]}

{"ops":[{"op":"add","path":"/logs/VectorStoreRetriever","value":{"id":"1e169f18-711e-47a3-910e-ee031f70b6e0","name":"VectorStoreRetriever","type":"retriever","tags":["seq:step:1","hnswlib"],"metadata":{},"start_time":"2023-12-23T00:20:47.082Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/VectorStoreRetriever/final_output","value":{"documents":[{"pageContent":"mitochondria is the powerhouse of the cell","metadata":{"id":1}},{"pageContent":"mitochondria is made of lipids","metadata":{"id":2}}]}},{"op":"add","path":"/logs/VectorStoreRetriever/end_time","value":"2023-12-23T00:20:47.398Z"}]}

{"ops":[{"op":"add","path":"/logs/RunnableLambda","value":{"id":"a0d61a88-8282-42be-8949-fb0e8f8f67cd","name":"RunnableLambda","type":"chain","tags":["seq:step:2"],"metadata":{},"start_time":"2023-12-23T00:20:47.495Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/RunnableLambda/final_output","value":{"output":"mitochondria is the powerhouse of the cell\n\nmitochondria is made of lipids"}},{"op":"add","path":"/logs/RunnableLambda/end_time","value":"2023-12-23T00:20:47.604Z"}]}

{"ops":[{"op":"add","path":"/logs/RunnableSequence/final_output","value":{"output":"mitochondria is the powerhouse of the cell\n\nmitochondria is made of lipids"}},{"op":"add","path":"/logs/RunnableSequence/end_time","value":"2023-12-23T00:20:47.690Z"}]}

{"ops":[{"op":"add","path":"/logs/RunnableMap/final_output","value":{"question":"What is the powerhouse of the cell?","context":"mitochondria is the powerhouse of the cell\n\nmitochondria is made of lipids"}},{"op":"add","path":"/logs/RunnableMap/end_time","value":"2023-12-23T00:20:47.780Z"}]}

{"ops":[{"op":"add","path":"/logs/ChatPromptTemplate","value":{"id":"5b6cff77-0c52-4218-9bde-d92c33ad12f3","name":"ChatPromptTemplate","type":"prompt","tags":["seq:step:2"],"metadata":{},"start_time":"2023-12-23T00:20:47.864Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/ChatPromptTemplate/final_output","value":{"lc":1,"type":"constructor","id":["langchain_core","prompt_values","ChatPromptValue"],"kwargs":{"messages":[{"lc":1,"type":"constructor","id":["langchain_core","messages","SystemMessage"],"kwargs":{"content":"Use the following pieces of context to answer the question at the end.\nIf you don't know the answer, just say that you don't know, don't try to make up an answer.\n----------------\nmitochondria is the powerhouse of the cell\n\nmitochondria is made of lipids","additional_kwargs":{}}},{"lc":1,"type":"constructor","id":["langchain_core","messages","HumanMessage"],"kwargs":{"content":"What is the powerhouse of the cell?","additional_kwargs":{}}}]}}},{"op":"add","path":"/logs/ChatPromptTemplate/end_time","value":"2023-12-23T00:20:47.956Z"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI","value":{"id":"0cc3b220-ca7f-4fd3-88d5-bea1f7417c3d","name":"ChatOpenAI","type":"llm","tags":["seq:step:3"],"metadata":{},"start_time":"2023-12-23T00:20:48.126Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/StrOutputParser","value":{"id":"47d9bd52-c14a-420d-8d52-1106d751581c","name":"StrOutputParser","type":"parser","tags":["seq:step:4"],"metadata":{},"start_time":"2023-12-23T00:20:48.666Z","streamed_output_str":[]}}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":""}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":""}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":"The"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":"The"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":" mitochond"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":" mitochond"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":"ria"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":"ria"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":" is"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":" is"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":" the"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":" the"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":" powerhouse"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":" powerhouse"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":" of"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":" of"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":" the"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":" the"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":" cell"}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":" cell"}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":"."}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":"."}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/streamed_output_str/-","value":""}]}

{"ops":[{"op":"add","path":"/streamed_output/-","value":""}]}

{"ops":[{"op":"add","path":"/logs/ChatOpenAI/final_output","value":{"generations":[[{"text":"The mitochondria is the powerhouse of the cell.","generationInfo":{"prompt":0,"completion":0},"message":{"lc":1,"type":"constructor","id":["langchain_core","messages","AIMessageChunk"],"kwargs":{"content":"The mitochondria is the powerhouse of the cell.","additional_kwargs":{}}}}]]}},{"op":"add","path":"/logs/ChatOpenAI/end_time","value":"2023-12-23T00:20:48.841Z"}]}

{"ops":[{"op":"add","path":"/logs/StrOutputParser/final_output","value":{"output":"The mitochondria is the powerhouse of the cell."}},{"op":"add","path":"/logs/StrOutputParser/end_time","value":"2023-12-23T00:20:48.945Z"}]}

{"ops":[{"op":"replace","path":"/final_output","value":{"output":"The mitochondria is the powerhouse of the cell."}}]}
*/
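
To reconstruct the state from these patches, apply each chunk's ops in order. Here's a minimal sketch using a generic JSON Patch (RFC 6902) library; fast-json-patch is an illustrative choice here, not a dependency of the chain:

import { applyPatch } from "fast-json-patch";

const logStream = await chain.streamLog("What is the powerhouse of the cell?");

// Fold each RunLogPatch's ops into an accumulating state object.
let state: any = {};
for await (const chunk of logStream) {
  // Cast because the op types come from a different package in this sketch
  state = applyPatch(state, chunk.ops as any).newDocument;
}

console.log(state.final_output);
// { output: "The mitochondria is the powerhouse of the cell." }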
