Anthropic
Replace https://api.anthropic.com/ with https://llmfoundry.straive.com/anthropic/.
All Anthropic models and APIs are supported, including:
claude-3-haiku-20240307
claude-3-5-haiku-20241022
claude-3-5-sonnet-20241022
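Python (Anthropic SDK)
Because only the base URL changes, the official anthropic Python SDK can also be pointed at the proxy. This is a minimal sketch, not part of the original examples; it assumes the proxy accepts the SDK's x-api-key authentication (the JavaScript note further down suggests it does).
import os
from anthropic import Anthropic  # pip install anthropic

# Assumption: overriding base_url routes the SDK's requests through the LLM Foundry proxy
client = Anthropic(
    api_key=f"{os.environ['LLMFOUNDRY_TOKEN']}:my-test-project",
    base_url="https://llmfoundry.straive.com/anthropic",
)
message = client.messages.create(
    model="claude-3-haiku-20240307",
    max_tokens=10,
    messages=[{"role": "user", "content": "What is 2 + 2"}],
)
print(message.content[0].text)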
Curl
curl -X POST https://llmfoundry.straive.com/anthropic/v1/messages \
-H "Authorization: Bearer $LLMFOUNDRY_TOKEN:my-test-project" \
-H "Content-Type: application/json" \
-d '{"model": "claude-3-haiku-20240307", "max_tokens": 10, "messages": [{"role": "user", "content": "What is 2 + 2"}]}'
Python requests
import os
import requests # Or replace requests with httpx
response = requests.post(
"https://llmfoundry.straive.com/anthropic/v1/messages",
headers={"Authorization": f"Bearer {os.environ['LLMFOUNDRY_TOKEN']}:my-test-project"},
json={"model": "claude-3-haiku-20240307", "max_tokens": 10, "messages": [{"role": "user", "content": "What is 2 + 2"}]}
)
print(response.json())
JavaScript
const token = process.env.LLMFOUNDRY_TOKEN;
const response = await fetch("https://llmfoundry.straive.com/anthropic/v1/messages", {
method: "POST",
// You can also use x-api-key: `${token}:my-test-project` instead of the Authorization header
headers: { "Content-Type": "application/json", Authorization: `Bearer ${token}:my-test-project` },
// If the user is already logged into LLM Foundry, use `credentials: "include"` to send **THEIR** API token instead of the `Authorization` header.
// credentials: "include",
body: JSON.stringify({
model: "claude-3-haiku-20240307",
max_tokens: 10,
messages: [{ role: "user", content: "What is 2 + 2" }],
}),
});
console.log(await response.json());
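As the comment in the JavaScript example notes, the token can also be sent in Anthropic's native x-api-key header instead of the Authorization header. A minimal Python requests sketch of that variant, mirroring the Python example above:
import os
import requests

# Same request as the Python requests example, authenticating via x-api-key instead of Authorization
response = requests.post(
    "https://llmfoundry.straive.com/anthropic/v1/messages",
    headers={"x-api-key": f"{os.environ['LLMFOUNDRY_TOKEN']}:my-test-project"},
    json={"model": "claude-3-haiku-20240307", "max_tokens": 10, "messages": [{"role": "user", "content": "What is 2 + 2"}]},
)
print(response.json())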
LangChain
import os
from langchain_anthropic import ChatAnthropic
chat_model = ChatAnthropic(
    anthropic_api_key=f'{os.environ["LLMFOUNDRY_TOKEN"]}:my-test-project',
    anthropic_api_url="https://llmfoundry.straive.com/anthropic/",
    model_name="claude-3-haiku-20240307",
)
print(chat_model.invoke("What is 2 + 2?").content)
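The proxy-configured model behaves like any other LangChain chat model, so it composes into chains as usual. A short sketch reusing the chat_model from the block above; the prompt itself is illustrative.
from langchain_core.prompts import ChatPromptTemplate

# Assumes chat_model from the block above; pipe it into a simple prompt -> model chain
prompt = ChatPromptTemplate.from_messages([("human", "What is {a} + {b}?")])
chain = prompt | chat_model
print(chain.invoke({"a": 2, "b": 2}).content)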