NEW Mixtral 8x22b Tested - Mistral's New Flagship MoE Open-Source Model
Matthew Berman

Published on Apr 13, 2024

Mistral AI just launched Mixtral 8x22B, a massive MoE open-source model that is topping benchmarks. Let's test it!

Join My Newsletter for Regular AI Updates šŸ‘‡šŸ¼
https://www.matthewberman.com

Need AI Consulting? āœ…
https://forwardfuture.ai/

My Links šŸ”—
šŸ‘‰šŸ» Subscribe: /@matthew_berman
šŸ‘‰šŸ» Twitter: /matthewberman
šŸ‘‰šŸ» Discord: /discord
šŸ‘‰šŸ» Patreon: /matthewberman

Media/Sponsorship Inquiries šŸ“ˆ
https://bit.ly/44TC45V

Links:
LLM Leaderboard - https://bit.ly/3qHV0X7
Mixtral Model - https://huggingface.co/lightblue/Kara...
