
Zephyr 141B-A35B

huggingfaceh4/zephyr-orpo-141b-a35b
66K Context Window
Online

Zephyr 141B-A35B is a Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters, fine-tuned on a mix of publicly available, synthetic datasets. It is an instruct fine-tune of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). #moe

Capabilities

Text Generation · Code Generation · Analysis & Reasoning

Technical Specs

Input Modality
Text
Output Modality
Text
Architecture
Mixture of Experts (MoE)
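
Before sending long inputs, it can help to check that a prompt fits the 66K context window. A minimal sketch, assuming the public tokenizer on the Hugging Face Hub (repo id assumed to be HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1) counts tokens the same way the gateway does; the fits() helper is hypothetical:

from transformers import AutoTokenizer

CONTEXT_WINDOW = 66_000  # 66K, per the listing above

# Assumed Hub repo id; verify it matches the model served by the gateway.
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-orpo-141b-A35b-v0.1")

def fits(prompt: str, reserve_for_output: int = 1_024) -> bool:
    """Return True if the prompt leaves room for a reply within the window."""
    n_input = len(tokenizer.encode(prompt))
    return n_input + reserve_for_output <= CONTEXT_WINDOW

print(fits("Hello!"))  # True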

Pricing

Pay per use, no monthly fees
Input Tokens: < ¥0.001 / 1K tokens
Output Tokens: < ¥0.001 / 1K tokens
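
At that ceiling, per-request cost is easy to bound. A rough sketch, assuming the advertised < ¥0.001/1K rate applies to both input and output tokens (the max_request_cost() helper is hypothetical):

RATE_PER_1K_TOKENS_CNY = 0.001  # upper bound from the table above

def max_request_cost(input_tokens: int, output_tokens: int) -> float:
    """Worst-case cost in CNY for one request at the ceiling rate."""
    return (input_tokens + output_tokens) / 1_000 * RATE_PER_1K_TOKENS_CNY

# Even a request that fills the full 66K window and generates 2K tokens
# stays under ¥0.07.
print(f"< ¥{max_request_cost(66_000, 2_000):.3f}")  # < ¥0.068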

Quick Start

from openai import OpenAI

# Point the standard OpenAI client at the UnionToken gateway.
client = OpenAI(
    base_url="https://api.uniontoken.ai/v1",
    api_key="YOUR_UNIONTOKEN_API_KEY",  # replace with your own key
)

response = client.chat.completions.create(
    model="huggingfaceh4/zephyr-orpo-141b-a35b",
    messages=[
        {"role": "user", "content": "Hello!"}
    ],
)

print(response.choices[0].message.content)
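
Because the gateway exposes an OpenAI-compatible interface, streaming should work the standard way. A minimal sketch, assuming the endpoint honors the usual stream=True flag:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.uniontoken.ai/v1",
    api_key="YOUR_UNIONTOKEN_API_KEY",
)

# stream=True yields incremental chunks instead of one final response.
stream = client.chat.completions.create(
    model="huggingfaceh4/zephyr-orpo-141b-a35b",
    messages=[
        {"role": "user", "content": "Explain Mixture of Experts in two sentences."}
    ],
    stream=True,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()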

Ready to get started?

Get 1M free tokens on registration, no monthly fees or minimum spend

Register Now →