
Zephyr 141B-A35B

huggingfaceh4/zephyr-orpo-141b-a35b

Created Apr 12, 2024 · 65,536 context

Zephyr 141B-A35B is a Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters, fine-tuned on a mix of publicly available and synthetic datasets.

It is an instruct fine-tune of Mixtral 8x22B.

#moe
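
Since the model is served through OpenRouter, it can be called with the OpenAI-compatible chat completions endpoint at https://openrouter.ai/api/v1. The following is a minimal sketch, assuming an API key stored in an OPENROUTER_API_KEY environment variable; the prompt text is illustrative only.

import os
import requests

# Send a single chat turn to Zephyr 141B-A35B via OpenRouter's
# OpenAI-compatible chat completions endpoint.
resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "huggingfaceh4/zephyr-orpo-141b-a35b",
        "messages": [
            {"role": "user", "content": "Explain mixture-of-experts routing in two sentences."}
        ],
    },
    timeout=60,
)
resp.raise_for_status()

# Print the assistant's reply from the first choice.
print(resp.json()["choices"][0]["message"]["content"])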

