Refuel LLM V2
togethercomputer/refuel-llm-v2
Refuel AI
Chat
Description
Refuel LLM V2 is a chat-capable model from Refuel AI, served under the model ID togethercomputer/refuel-llm-v2.
Specifications
- Context Length: 16,384 tokens
- Variant: V2
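With a 16,384-token context window, it can help to sanity-check that a prompt will fit before sending it. A minimal sketch, assuming a coarse ~4-characters-per-token heuristic (not the model's actual tokenizer; use a real tokenizer for precise counts), with `fitsInContext` and `reservedForOutput` as hypothetical names for illustration:

```javascript
// Rough pre-flight check against the 16,384-token context window.
// Assumption: ~4 characters per token is a coarse heuristic, not the
// model's real tokenizer.
const CONTEXT_LENGTH = 16384;

function roughTokenCount(text) {
  return Math.ceil(text.length / 4);
}

// Leave headroom for the completion itself (reservedForOutput tokens).
function fitsInContext(prompt, reservedForOutput = 1024) {
  return roughTokenCount(prompt) + reservedForOutput <= CONTEXT_LENGTH;
}

console.log(fitsInContext("Hello!"));            // a short prompt fits
console.log(fitsInContext("x".repeat(100000)));  // ~25K tokens does not
```

The reserved-output headroom matters because the context window is shared between the prompt and the generated completion.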
Pricing
- Input Cost: $0.12 per 100K tokens
- Output Cost: $0.12 per 100K tokens
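Since input and output are billed at the same rate, estimating the cost of a request is a single multiplication. A minimal sketch, assuming the listed per-100K-token rates; `estimateCostUSD` is a hypothetical helper name:

```javascript
// Cost estimate at the listed rates: $0.12 per 100K tokens,
// identical for input and output.
const RATE_PER_100K_USD = 0.12;

function estimateCostUSD(inputTokens, outputTokens) {
  return ((inputTokens + outputTokens) / 100_000) * RATE_PER_100K_USD;
}

console.log(estimateCostUSD(50_000, 50_000)); // → 0.12
```

At these rates, 1M combined tokens costs $1.20.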
Usage Example
```typescript
import OpenAI from "openai";

// Point the OpenAI SDK at your proxy endpoint.
const client = new OpenAI({
  baseURL: "https://api.yourproxy.com/v1",
  apiKey: process.env.PROXY_KEY,
});

const completion = await client.chat.completions.create({
  model: "togethercomputer/refuel-llm-v2",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(completion.choices[0].message.content);
```