Do We Really Need More Powerful Language Models?

Today, people rarely question the assumption that bigger models mean better models. Every new GPT release by OpenAI generates tremendous interest in traditional and social media alike. But do we really need more powerful language models (foundation models) to help us with daily tasks?

For this article, I talked to Ivan Smetannikov, Data Science Team Lead at Serokell, Ph.D. in Computer Science, Associate Professor, and Senior Researcher at ITMO. He explains why ChatGPT can often be a massive waste of time and resources, and discusses alternative approaches to building NLP models that could deliver the same results.
