OpenRouter integration

You can use Not Diamond together with OpenRouter if you want to route your LLM calls through a single proxy. We provide a simple way to map Not Diamond's LLM provider names to their OpenRouter equivalents:

import requests
from notdiamond import NotDiamond

OPENROUTER_API_KEY = "YOUR_OPENROUTER_API_KEY"

client = NotDiamond()

llm_providers = ['openai/gpt-3.5-turbo', 'openai/gpt-4-1106-preview', 'openai/gpt-4-turbo-preview', 
                 'anthropic/claude-3-haiku-20240307', 'anthropic/claude-3-opus-20240229']

messages = [
    {"role": "system", "content": "You are a world class software developer."},
    {"role": "user", "content": "Write a merge sort in Python. Be concise"}
]

session_id, provider = client.chat.completions.model_select(
    messages=messages,
    model=llm_providers
)

# Map the recommended model from Not Diamond to OpenRouter's naming conventions
response = requests.post(
    url="https://openrouter.ai/api/v1/chat/completions",
    headers={
        "Authorization": f"Bearer {OPENROUTER_API_KEY}"
    },
    json={
        "model": provider.openrouter_model,
        "messages": messages
    }
)

print(session_id)
print(provider.openrouter_model)
print(response.json()['choices'][0]['message']['content'])

Note that not all models supported by Not Diamond have a direct mapping to OpenRouter's models. When no mapping is found, provider.openrouter_model returns None and a warning is printed to the logs.
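
If you want your call to keep working when the selected model has no OpenRouter equivalent, you can check for None and fall back to a model you know OpenRouter serves. The sketch below shows one way to do this with the same messages and API key as above; the fallback model name is an arbitrary choice for illustration, not a recommendation:

# Minimal sketch of a fallback when no OpenRouter mapping exists.
openrouter_model = provider.openrouter_model

if openrouter_model is None:
    # Assumption: fall back to a model you already know OpenRouter serves.
    openrouter_model = "openai/gpt-4-turbo-preview"

response = requests.post(
    url="https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {OPENROUTER_API_KEY}"},
    json={"model": openrouter_model, "messages": messages}
)
response.raise_for_status()  # surface HTTP errors instead of failing on .json()
print(response.json()["choices"][0]["message"]["content"])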