Stability AI, the startup behind the generative AI art tool Stable Diffusion, today open sourced a suite of text-generating AI models intended to go head to head with systems like OpenAI’s GPT-4.
Called StableLM and available in “alpha” on GitHub and Hugging Face, a platform for hosting AI models and code, the models can generate both code and text, Stability AI says, and “demonstrate how small and efficient models can deliver high performance with appropriate training.”
“Language models will form the backbone of our digital economy, and we want everyone to have a voice in their design,” the Stability AI team wrote in a blog post on the company’s site.
The models were trained on a dataset called The Pile, a mix of internet-scraped text samples from websites including PubMed, StackExchange and Wikipedia. But Stability AI claims it created a custom training set that expands the size of the standard Pile by 3x.
