
Using Python/Langchain

Here we'll explore a few more examples of using the Chat Completions API with Python modules. First we'll use the openai module, and then we'll use LangChain.

OpenAI module

Synchronous

This example uses the openai module to make a synchronous request to chat.dartmouth.edu and then prints the response generated by the LLM.

from openai import OpenAI

# init the client and point it at the Dartmouth Chat endpoint
client = OpenAI(
    base_url="https://chat.dartmouth.edu/api",
    api_key="PLACE_KEY_HERE"
)

chat_completion = client.chat.completions.create(
    model="anthropic.claude-3-5-haiku-20241022",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is deep learning?"}
    ],
    stream=False
)

print(chat_completion.choices[0].message.content)
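Rather than hardcoding your key in the script, you can read it from an environment variable. This is a minimal sketch; the variable name DARTMOUTH_CHAT_API_KEY is an arbitrary choice for illustration, not something the API requires:

```python
import os

# Read the API key from an environment variable, falling back to a
# placeholder if it isn't set. The variable name is arbitrary; use
# whatever you export in your shell.
api_key = os.environ.get("DARTMOUTH_CHAT_API_KEY", "PLACE_KEY_HERE")
print(api_key)
```

Pass `api_key=api_key` to the `OpenAI(...)` constructor in place of the literal string.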

Streaming

This example uses the openai module to make a streamed request. Each portion of the response is displayed to the user as soon as it arrives, rather than waiting for the full completion.

from openai import OpenAI

# init the client and point it at the Dartmouth Chat endpoint
client = OpenAI(
    base_url="https://chat.dartmouth.edu/api",
    api_key="PLACE_KEY_HERE"
)

chat_completion = client.chat.completions.create(
    model="anthropic.claude-3-5-haiku-20241022",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is deep learning?"}
    ],
    stream=True
)

# iterate over the stream, printing each chunk as it arrives
for chunk in chat_completion:
    if chunk.choices and chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
print()
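If you want to keep the complete response in addition to displaying it incrementally, accumulate the deltas as they arrive. Here is a minimal offline sketch of that pattern, using plain strings in place of real stream chunks:

```python
# Mock deltas standing in for the chunk.choices[0].delta.content values
# that a real streamed response would yield
deltas = ["Deep ", "learning ", "is ", "a ", "subset ", "of ", "machine learning."]

full_response = ""
for delta in deltas:
    print(delta, end="")    # display each piece immediately
    full_response += delta  # and accumulate it for later use
print()
```

After the loop, `full_response` holds the entire generated text.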

LangChain

NOTE: These and other examples can be found at https://python.langchain.com/docs/integrations/chat/openai/.

Simple message

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="anthropic.claude-3-5-haiku-20241022",
    api_key="PLACE_KEY_HERE",
    base_url="https://chat.dartmouth.edu/api",
)

input_text = "The meaning of life is "
print(llm.invoke(input_text).content)

List of messages

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="anthropic.claude-3-5-haiku-20241022",
    api_key="PLACE_KEY_HERE",
    base_url="https://chat.dartmouth.edu/api",
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("user", "I love programming."),
]

print(llm.invoke(messages).content)

As a Chain

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate

llm = ChatOpenAI(
    model="anthropic.claude-3-5-haiku-20241022",
    api_key="PLACE_KEY_HERE",
    base_url="https://chat.dartmouth.edu/api",
)

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that translates {input_language} to {output_language}.",
        ),
        ("user", "{input}"),
    ]
)

chain = prompt | llm
response = chain.invoke(
    {
        "input_language": "English",
        "output_language": "German",
        "input": "I love programming.",
    }
)

print(response.content)

LangChain Dartmouth

For convenience, Dartmouth has created a Python package, langchain-dartmouth, to simplify access to the various LLMs. Details about this package can be found at https://dartmouth.github.io/langchain-dartmouth-cookbook/.