152.2K Downloads · Updated 2 years ago
ollama run notux:8x7b-v1-q3_K_L
dba2de3a5168 · 20GB
This model is a fine-tuned version of Mixtral trained on a high-quality, curated dataset. As of December 26, 2023, it was the top-ranked MoE (Mixture of Experts) model on the Hugging Face Open LLM Leaderboard.
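Beyond the `ollama run` command above, the model can also be queried programmatically through Ollama's local REST API (`/api/generate`). The sketch below assumes the Ollama server is running on its default port (11434) and that the model tag shown above has already been pulled; the prompt text is purely illustrative.

```python
import json
import urllib.request

# Minimal sketch: send a single non-streaming generate request to a
# locally running Ollama server. Assumes `ollama run notux:8x7b-v1-q3_K_L`
# (or `ollama pull`) has already downloaded the model.
payload = {
    "model": "notux:8x7b-v1-q3_K_L",
    "prompt": "Explain what a Mixture of Experts model is in one paragraph.",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# The generated text is returned in the "response" field.
print(result["response"])
```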