
Databricks: DBRX 132B Instruct

databricks/dbrx-instruct
Context Window: 33K
Online

DBRX is an open-source large language model developed by Databricks. With 132B total parameters, it outperforms existing open-source LLMs such as Llama 2 70B and [Mixtral-8x7b](/models/mistralai/mixtral-8x7b) on standard industry benchmarks for language understanding, programming, math, and logic. DBRX uses a fine-grained mixture-of-experts (MoE) architecture in which only 36B parameters are active on any given input, and it was pre-trained on 12T tokens of text and code data. Compared to other open MoE models like Mixtral-8x7B and Grok-1, DBRX is fine-grained, meaning it uses a larger number of smaller experts. See the launch announcement and benchmark results [here](https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm). #moe
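
The "fine-grained" distinction is easiest to see in code. Below is a minimal, illustrative sketch of top-k expert routing in Python; the 16-experts/top-4 split matches the figures in Databricks' launch post, but the router, weights, and dimensions here are toy values, not DBRX's actual implementation.

import numpy as np

NUM_EXPERTS = 16  # DBRX: many small experts (Mixtral-8x7B has 8 larger ones)
TOP_K = 4         # experts activated per token (Mixtral activates 2)
D_MODEL = 8       # toy hidden size; the real model is vastly larger

rng = np.random.default_rng(0)
router_w = rng.normal(size=(D_MODEL, NUM_EXPERTS))  # learned router (toy values)
experts = [rng.normal(size=(D_MODEL, D_MODEL)) for _ in range(NUM_EXPERTS)]

def moe_layer(x):
    """Route one token vector to its top-k experts and mix their outputs."""
    logits = x @ router_w
    top = np.argsort(logits)[-TOP_K:]  # pick the k highest-scoring experts
    w = np.exp(logits[top])
    w /= w.sum()                       # softmax over the chosen experts only
    # Only the selected experts run, which is why just 36B of the
    # 132B total parameters are active for any given token.
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

print(moe_layer(rng.normal(size=D_MODEL)).shape)  # (8,)

Using more, smaller experts lets the router specialize more finely while keeping the per-token compute (the active 36B) well below the total parameter count.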

Capabilities

Text Generation · Code Generation · Analysis & Reasoning

Technical Specs

Input Modality
Text
Output Modality
Text
Architecture
Mixture-of-Experts (MoE)

Pricing

Pay per use, no monthly fees
Input Tokens: < ¥0.001 / 1K tokens
Output Tokens: < ¥0.001 / 1K tokens
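
Because billing is purely per token, you can put an upper bound on a request's cost with simple arithmetic. The sketch below treats the listed cap as a flat rate for both directions; actual rates may be lower, and the function name is ours for illustration, not part of any SDK.

PRICE_CAP_PER_1K = 0.001  # ¥ per 1K tokens, the "< ¥0.001" cap listed above

def estimate_max_cost(prompt_tokens, completion_tokens):
    """Worst-case cost in ¥, billing both directions at the listed cap."""
    return (prompt_tokens + completion_tokens) / 1000 * PRICE_CAP_PER_1K

print(estimate_max_cost(500, 200))  # 0.0007 -> at most ¥0.0007 for 700 tokens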

Quick Start

from openai import OpenAI

# Point the standard OpenAI client at the OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://api.uniontoken.ai/v1",
    api_key="YOUR_UNIONTOKEN_API_KEY",  # replace with your own key
)

# Send a single-turn chat request to DBRX Instruct.
response = client.chat.completions.create(
    model="databricks/dbrx-instruct",
    messages=[
        {"role": "user", "content": "Hello!"}
    ],
)

print(response.choices[0].message.content)
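
To receive tokens as they are generated, the same client should also work with the standard OpenAI streaming protocol, since the endpoint is OpenAI-compatible; the page itself does not confirm streaming support, so treat this as a sketch. It reuses the client from the snippet above.

stream = client.chat.completions.create(
    model="databricks/dbrx-instruct",
    messages=[{"role": "user", "content": "Explain mixture-of-experts in one paragraph."}],
    stream=True,  # ask for incremental chunks instead of one final message
)
for chunk in stream:
    delta = chunk.choices[0].delta.content  # None for some control chunks
    if delta:
        print(delta, end="", flush=True)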


Ready to get started?

Get 1M free tokens on registration, with no monthly fees or minimum spend.

Register Now →