Llama 3.3 70b
Meta
meta/llama-3.3-70b
128K context
Context Window
128K
128,000 tokens
Max Output
8K
8,192 tokens
About this model
Strong open-source model for general tasks
This model supports up to 128K tokens of context. It provides strong code generation and debugging capabilities.
Access it through Chuizi.AI with a single ck- API key; no separate Meta account needed.
Highlights
128K context window
8K max output
Strong code generation
Best For
Code generation, Refactoring, Debugging, Documentation
Released: 2024-12-06
Capabilities
Chat, Code, Tools
Aliases
llama-3.3-70b
Pricing (per 1M tokens)
| Type | Price / 1M tokens |
|---|---|
| Input | $0.76 |
| Output | $0.76 |
Final prices shown
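To estimate what a request costs at these rates, multiply token counts by the per-million price. A minimal sketch (the `estimate_cost` helper is illustrative, not part of any SDK):

```python
# Rates from the pricing table above: $0.76 per 1M tokens, input and output alike.
INPUT_RATE = 0.76 / 1_000_000
OUTPUT_RATE = 0.76 / 1_000_000

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Rough USD cost of one request at the listed rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# e.g. a 10K-token prompt with a 1K-token reply:
print(f"${estimate_cost(10_000, 1_000):.6f}")  # → $0.008360
```

Actual billed counts come from the `usage` field of the API response, so treat this as a pre-flight estimate only.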
Quick Start
main.py
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.chuizi.ai/v1",
    api_key="ck-your-key-here",
)

response = client.chat.completions.create(
    model="meta/llama-3.3-70b",
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.choices[0].message.content)
```