Bedrock

Replace the Amazon Bedrock runtime endpoint https://bedrock-runtime.us-east-1.amazonaws.com/ with https://llmfoundry.straive.com/bedrock/.

All Amazon Bedrock models and APIs are supported, including:

  • amazon.titan-text-lite-v1
  • amazon.titan-text-express-v1
  • amazon.titan-text-premier-v1:0
  • anthropic.claude-3-haiku-20240307-v1:0:200k
  • anthropic.claude-3-5-haiku-20241022-v1:0
  • anthropic.claude-3-5-sonnet-20241022-v2:0
  • meta.llama3-2-11b-instruct-v1:0
  • meta.llama3-2-90b-instruct-v1:0
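
Each model is addressed by its ID in the request URL. Below is a minimal Python sketch of a helper that calls any of the IDs above, assuming the same .../{model_id}/converse path and payload shape used in the examples that follow (the converse helper name is illustrative):

import os
import requests

def converse(model_id, prompt, project="my-test-project"):
    # Build the LLM Foundry proxy URL for the given Bedrock model ID.
    url = f"https://llmfoundry.straive.com/bedrock/{model_id}/converse"
    response = requests.post(
        url,
        headers={"Authorization": f"Bearer {os.environ['LLMFOUNDRY_TOKEN']}:{project}"},
        json={
            "messages": [{"role": "user", "content": [{"text": prompt}]}],
            "inferenceConfig": {"temperature": 0.1},
        },
    )
    return response.json()

print(converse("amazon.titan-text-lite-v1", "What is 2 + 2"))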

Curl

curl -X POST https://llmfoundry.straive.com/bedrock/amazon.titan-text-lite-v1/converse \
  -H "Authorization: Bearer $LLMFOUNDRY_TOKEN:my-test-project" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": [{"text": "What is 2 + 2"}]}], "inferenceConfig": {"temperature": 0.1}}'

Python requests

import os
import requests  # Or replace requests with httpx

response = requests.post(
    "https://llmfoundry.straive.com/bedrock/amazon.titan-text-lite-v1/converse",
    headers={"Authorization": f"Bearer {os.environ['LLMFOUNDRY_TOKEN']}:my-test-project"},
    json={
        "messages": [{"role": "user", "content": [{"text": "What is 2 + 2"}]}],
        "inferenceConfig": {"temperature": 0.1}
    }
)
print(response.json())
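
Continuing from the snippet above: the Converse API typically nests the assistant reply under output.message.content, so a sketch for pulling out the text (assuming the standard Bedrock Converse response shape) looks like:

data = response.json()
# The first content block of the assistant message holds the reply text.
print(data["output"]["message"]["content"][0]["text"])
# Token counts are reported under "usage" if you want to log them.
print(data.get("usage"))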

JavaScript

const token = process.env.LLMFOUNDRY_TOKEN;
const response = await fetch("https://llmfoundry.straive.com/bedrock/amazon.titan-text-lite-v1/converse", {
  method: "POST",
  headers: { "Content-Type": "application/json", Authorization: `Bearer ${token}:my-test-project` },
  // If the user is already logged into LLM Foundry, drop the Authorization header above and
  // use `credentials: "include"` so the request is sent with their own API token instead.
  credentials: "include",
  body: JSON.stringify({
    messages: [{ role: "user", content: [{ text: "What is 2 + 2" }] }],
    inferenceConfig: { temperature: 0.1 },
  }),
});
console.log(await response.json());