LangChain integration

Installation

Requires Python 3.10+

pip install notdiamond[create]
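The examples below need a Not Diamond API key, plus API keys for any providers you route to. One way to supply them is through environment variables before constructing the runnable; a minimal sketch (the variable names below are assumptions based on the SDK's and providers' conventions, so adjust them to your setup):

import os

# Assumed environment variable names; adjust to match your setup.
os.environ["NOTDIAMOND_API_KEY"] = "sk-..."      # Not Diamond API key
os.environ["OPENAI_API_KEY"] = "sk-..."          # needed when routing to OpenAI models
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."   # needed when routing to Anthropic models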

Integration

If you already have a LangChain project up and running, you can integrate Not Diamond into your code with NotDiamondRoutedRunnable. It queries Not Diamond's API to select a model for each prompt, then routes the prompt to that model automatically.

This class works like any other LangChain Runnable: chain it with other components into an invokable object:

PromptTemplate use case

from langchain_core.prompts import PromptTemplate
- from langchain_openai import ChatOpenAI
+ from notdiamond.toolkit.langchain import NotDiamondRoutedRunnable

user_input = "Write merge sort in Python."

prompt_template = PromptTemplate.from_template(
    "You are a world class software developer. {user_input}"
)
- model = ChatOpenAI(model_name='gpt-4o')
+ nd_routed_runnable = NotDiamondRoutedRunnable(
+     nd_api_key="sk-...", nd_llm_configs=["openai/gpt-4o", "openai/gpt-4o-mini"]
+ )

chain = prompt_template | nd_routed_runnable
result = chain.invoke({"user_input": user_input})
print(result.content)
For comparison, here is the equivalent chain using LangChain alone, without routing:

from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

user_input = "Write merge sort in Python."

prompt_template = PromptTemplate.from_template(
    "You are a world class software developer. {user_input}"
)

model = ChatOpenAI(model_name='gpt-3.5-turbo')
chain = prompt_template | model
result = chain.invoke({"user_input": user_input})

print(result.content)
Alternatively, you can route without the LangChain runnable by calling the Not Diamond SDK's NDLLM client directly:

from langchain_core.prompts import PromptTemplate
from notdiamond.llms.llm import NDLLM

user_input = "Write merge sort in Python."

prompt_template = PromptTemplate.from_template(
    "You are a world class software developer. {user_input}"
)

nd_llm = NDLLM(llm_providers=['openai/gpt-3.5-turbo', 'openai/gpt-4', 'anthropic/claude-2.1', 'google/gemini-pro'])
result, session_id, provider = nd_llm.invoke(prompt_template=prompt_template,
                                                input={"user_input": user_input})

print(provider.model)
print(result.content)

Streaming

NotDiamondRoutedRunnable can also be used to stream content:

from langchain_core.prompts import ChatPromptTemplate, PromptTemplate
- from langchain_openai import ChatOpenAI
+ from notdiamond.toolkit.langchain import NotDiamondRoutedRunnable


prompt_template = PromptTemplate.from_template(
    "You are a world class software developer. {user_input}"
)

- chat = ChatOpenAI(model_name="gpt-3.5-turbo")
+ chat = NotDiamondRoutedRunnable(nd_llm_configs=['openai/gpt-3.5-turbo', 'anthropic/claude-3-opus-20240229'])

for chunk in chat.stream(prompt_template.format(user_input="Write a merge sort in Python.")):
    print(chunk.content, end="", flush=True)
For comparison, the equivalent streaming code using LangChain alone:

from langchain_core.prompts import ChatPromptTemplate, PromptTemplate
from langchain_openai import ChatOpenAI


prompt_template = PromptTemplate.from_template(
    "You are a world class software developer. {user_input}"
)

chat = ChatOpenAI(model_name="gpt-3.5-turbo")

for chunk in chat.stream(prompt_template.format(user_input="Write merge sort in Python.")):
    print(chunk.content, end="", flush=True)
And the same streaming workflow through the Not Diamond SDK's NDLLM client:

from langchain_core.prompts import ChatPromptTemplate, PromptTemplate
from notdiamond.llms.llm import NDLLM


prompt_template = PromptTemplate.from_template(
    "You are a world class software developer. {user_input}"
)

chat = NDLLM(llm_providers=['openai/gpt-3.5-turbo', 'openai/gpt-4', 'anthropic/claude-2.1', 'google/gemini-pro'])

for chunk in chat.stream(prompt_template.format(user_input="Write merge sort in Python.")):
    print(chunk.content, end="", flush=True)
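Because NotDiamondRoutedRunnable is a standard LangChain Runnable, you can also chain it with a prompt template and stream the chain itself rather than formatting the prompt by hand. A minimal sketch, reusing the configuration from the examples above:

from langchain_core.prompts import PromptTemplate
from notdiamond.toolkit.langchain import NotDiamondRoutedRunnable

prompt_template = PromptTemplate.from_template(
    "You are a world class software developer. {user_input}"
)
chat = NotDiamondRoutedRunnable(
    nd_llm_configs=["openai/gpt-4o", "openai/gpt-4o-mini"]
)

# Streaming through the chain formats the prompt before Not Diamond routes it.
chain = prompt_template | chat
for chunk in chain.stream({"user_input": "Write a merge sort in Python."}):
    print(chunk.content, end="", flush=True)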

🚧

Model limitations of NotDiamondRoutedRunnable

At this time, this LangChain integration cannot be used to route to Perplexity. We instead recommend using notdiamond[create], as described in model_select vs. create.
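For workflows that need Perplexity, a rough sketch of the SDK-side create approach (the Perplexity model string below is illustrative, not confirmed; check Not Diamond's supported-models list and the model_select vs. create guide for exact identifiers):

from notdiamond import NotDiamond

# Reads NOTDIAMOND_API_KEY (and provider keys) from the environment.
client = NotDiamond()

# create() asks Not Diamond to pick a model, then invokes that provider in one call.
result, session_id, provider = client.chat.completions.create(
    messages=[{"role": "user", "content": "Write merge sort in Python."}],
    model=[
        "perplexity/llama-3.1-sonar-large-128k-online",  # illustrative model string
        "openai/gpt-4o",
    ],
)

print(provider.model)
print(result.content)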