An anonymous reader quotes a report: Microsoft Research, the blue-sky division of the software giant, […] announced the release of its Phi-2 small language model (SLM), a text-to-text AI program that is “small enough to run on a laptop or mobile device,” according to a post on X. At the same time, Phi-2, with its 2.7 billion parameters (the connections between artificial neurons), boasts performance comparable to much larger models, including Meta’s Llama 2-7B and Mistral-7B, both 7-billion-parameter models.
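For readers who want to try the “runs on a laptop” claim themselves, here is a minimal sketch (not from the article) of loading the model with the Hugging Face transformers library, assuming the checkpoint is published under the id microsoft/phi-2; in half precision the 2.7 billion weights fit in roughly 6 GB of memory.

```python
# Illustrative sketch only: load Phi-2 locally and generate a short completion.
# Assumes the checkpoint id "microsoft/phi-2" and that torch, transformers,
# and accelerate are installed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # assumed Hugging Face model id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the 2.7B weights around ~5.5 GB
    device_map="auto",          # places weights on GPU if present, otherwise CPU
)

prompt = "Explain why smaller language models can be cheaper to deploy."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```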

Microsoft researchers also noted in their blog post on the Phi-2 release that it outperforms Google’s brand-new Gemini Nano 2 model even though the Google model has roughly half a billion more parameters, and that it delivers less “toxicity” and bias in its responses than Llama 2. Microsoft also couldn’t resist taking a little dig at Google’s now much-criticized, staged demo video for Gemini.

Link to original post: https://slashdot.org/story/23/12/16/0430207/microsoft-releases-phi-2-a-small-llm-that-outperforms-llama-2-and-mistral-7b?utm_source=rss1.0mainlinkanon&utm_medium=feed (from Teknoids News)
