Lessons From a Decade of Generative AI

With the recent buzz around generative AI, led by the likes of ChatGPT and Bard, businesses are increasingly seeking to understand the use cases for the technology. It's a great time to start conversations about the power of AI, but generative AI is nothing new. Generative modeling (i.e., generative AI) has been blowing up behind the scenes for more than a decade, propelled by three major factors: the development of open-source software libraries such as TensorFlow in 2015 and PyTorch in 2016; innovations in neural network architectures and training techniques; and hardware improvements such as graphics processing units (GPUs) and tensor processing units (TPUs) that facilitate training and inference on massive neural networks.

In this article, I'll explain what generative models are, how they got to where they are today, and how they should be used, as well as explore their limitations.
