Meta Llama 3.3 70B Instruct Turbo
meta-llama/llama-3.3-70b-instruct-turbo
Meta
Chat
Description
Meta Llama 3.3 70B Instruct Turbo is a 70B-parameter, instruction-tuned chat model from Meta. Main uses: chat, medium general-purpose tasks, and function calling. Variant: Instruct.
Specifications
- Context Length: 131,072 tokens
- Variant: Instruct
- Parameters: 70B
- Input Modalities: Text
- Output Modalities: Text
- Main Use: Chat, Medium General Purpose, Function Calling
Pricing
- Input Cost: 1.76 Credits / 1M tokens
- Output Cost: 1.76 Credits / 1M tokens
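Since the input and output rates are identical, the credit cost of a request is simply total tokens divided by one million, times the rate. A minimal sketch (the `estimateCost` helper is illustrative, not part of the SDK):

```javascript
// Per-million-token rate for this model (same for input and output).
const CREDITS_PER_MILLION_TOKENS = 1.76;

// Illustrative helper: estimate the credit cost of one request.
function estimateCost(inputTokens, outputTokens) {
  return ((inputTokens + outputTokens) / 1_000_000) * CREDITS_PER_MILLION_TOKENS;
}

// e.g. exactly one million tokens in total:
console.log(estimateCost(1_000_000, 0)); // 1.76
```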
Usage Example
```javascript
const { Client } = require('ai-proxy-sdk');

// Credentials and the base URL are read from environment variables.
const client = new Client({
  consumerKey: process.env.BITMESH_CONSUMER_KEY,
  consumerSecret: process.env.BITMESH_CONSUMER_SECRET,
  baseUrl: process.env.BITMESH_API_BASE_URL || 'https://api.bitmesh.ai'
});

// `await` is only valid inside an async function in CommonJS modules,
// so the request is wrapped in an async entry point.
async function main() {
  const response = await client.chat.completions.create({
    model: 'meta-llama/llama-3.3-70b-instruct-turbo',
    messages: [
      { role: 'system', content: 'You are a helpful assistant.' },
      { role: 'user', content: 'What are some fun things to do with AI?' }
    ],
    max_tokens: 1000
  });
  console.log(response.choices[0].message.content);
}

main().catch(console.error);
```
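The card lists function calling as a main use. Assuming the SDK mirrors the OpenAI-style chat-completions interface (the tool schema and the response shape below are illustrative assumptions, not taken from the SDK's documentation), a request would pass a `tools` array, and the reply can be unpacked like this:

```javascript
// Illustrative tool definition in OpenAI-style function-calling format.
const tools = [
  {
    type: 'function',
    function: {
      name: 'get_weather',
      description: 'Get the current weather for a city.',
      parameters: {
        type: 'object',
        properties: {
          location: { type: 'string', description: 'City name' }
        },
        required: ['location']
      }
    }
  }
];

// The tools array would be passed alongside the messages:
//   client.chat.completions.create({ model, messages, tools });

// Helper to pull the first tool call out of an assistant message
// (the message shape is an assumption based on the OpenAI format).
function extractToolCall(message) {
  const call = (message.tool_calls || [])[0];
  if (!call) return null;
  return { name: call.function.name, args: JSON.parse(call.function.arguments) };
}

// Example assistant message of the assumed shape:
const assistantMessage = {
  role: 'assistant',
  tool_calls: [
    {
      id: 'call_1',
      type: 'function',
      function: { name: 'get_weather', arguments: '{"location":"Paris"}' }
    }
  ]
};

console.log(extractToolCall(assistantMessage));
// → { name: 'get_weather', args: { location: 'Paris' } }
```

Your application would then run the named function with the parsed arguments and send the result back in a follow-up `tool` message.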