Remember Natural Language Processing? NLP took off several years ago, but it was only in 2018 that AI researchers showed it was possible to train a neural network once on a large amount of data and then reuse it again and again for different tasks. In 2019, GPT-2 from OpenAI and T5 from Google appeared, and they were startlingly good (this technology has since been incorporated into Google Duplex). Concerns were even raised about their possible misuse.
But since then, things have gone, well, pretty exponential. 

2021 saw a veritable 'Cambrian explosion' of NLP start-ups and large language models.
This year, Google released LaMDA, a large language model for chatbot applications. Then DeepMind released AlphaCode, and later Flamingo – a language model capable of visual understanding. In July of this year alone, the BigScience project released BLOOM, a massive open-source language model, and Meta announced that they'd trained a single language model …

Link to original post from Teknoids News
