Open-source model startup Mistral AI released a new LLM last week with nothing but a torrent link. It has now published details about the model, called Mixtral. From a report: Mistral AI continues its mission to deliver the best open models to the developer community. Moving forward in AI requires taking new technological turns beyond reusing well-known architectures and training paradigms. Most importantly, it requires making the community benefit from original models to foster new inventions and usages.

Today, the team is proud to release Mixtral 8x7B, a high-quality sparse mixture-of-experts model (SMoE) with open weights, licensed under Apache 2.0. Mixtral outperforms Llama 2 70B on most benchmarks with 6x faster inference. It is the strongest open-weight model with a permissive license and the best model overall in terms of cost/performance trade-offs. In particular, it matches or outperforms GPT-3.5 on most standard benchmarks.

Mixtral has the following capabilities:
1. It gracefully handles a context of 32k tokens.
2. It handles English, French, Italian, German and Spanish.
3. It shows strong performance in code generation.
4. It can be fine-tuned into an instruction-following model that achieves a score of 8.3 on MT-Bench.
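
Some context for the "sparse mixture of experts" claim above: an SMoE layer keeps several expert feed-forward blocks but routes each token through only a few of them, so per-token compute tracks the active parameters rather than the total parameter count. Mixtral routes each token to 2 of its 8 experts. Below is a minimal, illustrative NumPy sketch of top-2 routing; the dimensions, weights, and function names are invented for the example and do not reflect Mixtral's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions for illustration only -- Mixtral's real config uses
# 8 experts with top-2 routing over a much larger hidden size.
HIDDEN = 16
N_EXPERTS = 8
TOP_K = 2

# Each "expert" is a feed-forward block; here reduced to one weight matrix.
experts = [rng.standard_normal((HIDDEN, HIDDEN)) * 0.1 for _ in range(N_EXPERTS)]
router = rng.standard_normal((HIDDEN, N_EXPERTS)) * 0.1  # gating network

def smoe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token vector through only the top-k experts."""
    logits = x @ router                        # (N_EXPERTS,) router scores
    top = np.argsort(logits)[-TOP_K:]          # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # softmax over the selected experts only
    # Only TOP_K of N_EXPERTS experts actually run, so compute scales with k, not N.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(HIDDEN)
print(smoe_layer(token).shape)  # (16,)
```

Because only two experts run per token, the per-token cost is roughly that of a much smaller dense model, which is the basis for the announcement's inference-speed claim relative to the dense Llama 2 70B.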

Link to original post https://news.slashdot.org/story/23/12/11/1030244/mistral-says-mixtral-its-new-open-source-llm-matches-or-outperforms-llama-2-70b-and-gpt35-on-most-benchmarks from Teknoids News