Posted 4/10/2024, 5:05:00 PM
Mistral Unveils Open-Source 281GB AI Model Rivaling Big Tech
- Mistral, a French AI startup, released Mixtral 8x22B, a 281GB AI language model intended to rival models from OpenAI, Meta, and Google
- Mixtral 8x22B has a 65,000-token context window and a reported 176B parameters, outperforming Mistral's prior models
- The model is open source and was distributed via torrent, allowing anyone to download, use, and build upon it
- The release coincides with other major model launches, including OpenAI's GPT-4 Turbo and Google's Gemini 1.5 Pro
- Mistral's open-source approach raises concerns about misuse: unlike commercial models, which providers can take offline or patch, openly distributed weights cannot be recalled once released