
RWKV v5: Eagle 7B

recursal/eagle-7b
Context Window: 10K
Status: Online

Eagle 7B is trained on 1.1 trillion tokens across 100+ world languages (70% English, 15% multilingual, 15% code).

- Built on the [RWKV-v5](/models?q=rwkv) architecture (a linear transformer with 10-100x+ lower inference cost)
- Ranks as the world's greenest 7B model (per token)
- Outperforms all 7B-class models in multilingual benchmarks
- Approaches Falcon (1.5T), LLaMA2 (2T), and Mistral (>2T?) levels of performance in English evals
- Trades blows with MPT-7B (1T) in English evals
- All while being an ["Attention-Free Transformer"](https://www.isattentionallyouneed.com/)

Eagle 7B models are provided for free by [Recursal.AI](https://recursal.ai) for the beta period, until the end of March 2024.

Find out more [here](https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers).
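"Attention-free" here means softmax attention is replaced by a recurrent state update, so per-token compute and memory stay constant instead of growing with context length. The following is a hypothetical, heavily simplified single-head sketch of that idea in NumPy, an illustration only; the actual RWKV-v5 uses multi-head matrix-valued states plus token-shift and channel mixing:

import numpy as np

def rwkv_like_step(state, r, k, v, w):
    """One token step: O(d^2) work against a fixed-size state, no growing KV cache.

    state: (d, d) running outer-product memory
    r, k, v: (d,) receptance, key, value vectors for the current token
    w: (d,) per-channel decay factors in (0, 1)
    """
    state = w[:, None] * state + np.outer(k, v)  # decay old memory, write new
    return state, state.T @ r                    # read out with receptance

d = 8
state = np.zeros((d, d))
w = np.full(d, 0.9)
for _ in range(5):  # per-token cost is constant, unlike attention's O(T)
    r, k, v = (np.random.randn(d) for _ in range(3))
    state, out = rwkv_like_step(state, r, k, v, w)

This fixed-size state is what underlies the "10-100x+ lower inference cost" claim for long sequences.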

Capabilities

Text Generation · Code Generation · Analysis & Reasoning

Technical Specs

Input Modality: Text
Output Modality: Text
Architecture: RWKV-v5

Pricing

Pay per use, no monthly fees
Input Tokens: < ¥0.001 / 1K tokens
Output Tokens: < ¥0.001 / 1K tokens
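As a back-of-the-envelope check (treating the listed < ¥0.001/1K figure as an upper bound, not an exact rate):

# Upper-bound cost estimate from the listed ceiling of ¥0.001 per 1K tokens.
RATE_PER_1K_YEN = 0.001

def max_cost_yen(input_tokens: int, output_tokens: int) -> float:
    return (input_tokens + output_tokens) / 1000 * RATE_PER_1K_YEN

print(max_cost_yen(1_000_000, 1_000_000))  # at most ¥2.0 for 2M total tokens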

Quick Start

from openai import OpenAI

# Point the OpenAI-compatible client at the UnionToken API endpoint.
client = OpenAI(
    base_url="https://api.uniontoken.ai/v1",
    api_key="YOUR_UNIONTOKEN_API_KEY",  # replace with your own key
)

# Send a single-turn chat completion request to Eagle 7B.
response = client.chat.completions.create(
    model="recursal/eagle-7b",
    messages=[
        {"role": "user", "content": "Hello!"}
    ],
)

print(response.choices[0].message.content)
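If the endpoint follows the standard OpenAI-compatible streaming convention (an assumption; confirm in the integration docs), the same client can stream tokens as they are generated:

stream = client.chat.completions.create(
    model="recursal/eagle-7b",
    messages=[
        {"role": "user", "content": "Hello!"}
    ],
    stream=True,  # assumes OpenAI-style server-sent-event streaming
)

# Each chunk carries an incremental delta; content may be None on some chunks.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)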


Ready to get started?

Get 1M free tokens on registration, with no monthly fees or minimum spend.
