Cloudflare

Replace https://api.cloudflare.com/client/v4/accounts/{account}/ai/run/... with https://llmfoundry.straive.com/cloudflare/....
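
For example, a direct Cloudflare Workers AI call and its LLM Foundry equivalent map like this (the account ID segment simply drops out; the model path is unchanged):

```
https://api.cloudflare.com/client/v4/accounts/{account}/ai/run/@cf/meta/llama-3-8b-instruct
https://llmfoundry.straive.com/cloudflare/@cf/meta/llama-3-8b-instruct
```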

All Cloudflare Workers AI models and APIs are supported, including:

  • @cf/meta/llama-3-8b-instruct
  • @cf/defog/sqlcoder-7b-2
  • @cf/mistral/mistral-7b-instruct-v0.1

Curl

curl -X POST https://llmfoundry.straive.com/cloudflare/@cf/mistral/mistral-7b-instruct-v0.1 \
  -H "Authorization: Bearer $LLMFOUNDRY_TOKEN:my-test-project" \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "What is 2 + 2"}]}'

Python requests

import os
import requests  # Or replace requests with httpx

response = requests.post(
    "https://llmfoundry.straive.com/cloudflare/@cf/mistral/mistral-7b-instruct-v0.1",
    headers={"Authorization": f"Bearer {os.environ['LLMFOUNDRY_TOKEN']}:my-test-project"},
    json={"messages": [{"role": "user", "content": "What is 2 + 2"}]}
)
print(response.json())
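
Cloudflare Workers AI wraps the model output in a response envelope, with the text under `result.response`. A minimal sketch of pulling it out (the `payload` below is a hypothetical example, and it assumes LLM Foundry passes Cloudflare's envelope through unchanged):

```python
# Hypothetical payload in the shape Cloudflare Workers AI returns
# (assumption: the proxy forwards the upstream response as-is).
payload = {
    "result": {"response": "2 + 2 = 4"},
    "success": True,
    "errors": [],
    "messages": [],
}

def extract_text(payload: dict) -> str:
    """Return the model's text, raising if the call reported failure."""
    if not payload.get("success"):
        raise RuntimeError(f"Workers AI call failed: {payload.get('errors')}")
    return payload["result"]["response"]

print(extract_text(payload))  # prints: 2 + 2 = 4
```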

JavaScript

const token = process.env.LLMFOUNDRY_TOKEN;
const response = await fetch("https://llmfoundry.straive.com/cloudflare/@cf/mistral/mistral-7b-instruct-v0.1", {
  method: "POST",
  headers: { "Content-Type": "application/json", Authorization: `Bearer ${token}:my-test-project` },
  // If the user is already logged into LLM Foundry, `credentials: "include"` sends
  // **THEIR** API token, so the `Authorization` header above can be dropped.
  credentials: "include",
  body: JSON.stringify({ messages: [{ role: "user", content: "What is 2 + 2?" }] }),
});
console.log(await response.json());