Applications for GPU-Based AI and Machine Learning

GPUs Continue to Expand Their Use Across Artificial Intelligence and Machine Learning Applications

Artificial intelligence (AI) is poised to transform global productivity, working patterns, and lifestyles, and to create enormous wealth. Research firm Gartner expects the global AI economy to grow from about $1.2 trillion in 2018 to about $3.9 trillion by 2022, while McKinsey projects AI will deliver around $13 trillion in additional global economic activity by 2030. This transformation is fueled by powerful Machine Learning (ML) tools and techniques such as Deep Reinforcement Learning (DRL), Generative Adversarial Networks (GANs), Gradient-Boosted Tree models (GBMs), Natural Language Processing (NLP), and more.

Most of the success of modern AI and ML systems depends on their ability to process massive amounts of raw data in parallel using task-optimized hardware. In fact, the modern resurgence of AI began with the 2012 ImageNet competition, where deep-learning algorithms demonstrated an eye-popping improvement in image classification accuracy over their non-deep-learning counterparts. Along with clever programming and mathematical modeling, the use of specialized hardware played a significant role in that early success.
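To make the parallelism point concrete, here is a minimal sketch using PyTorch (one common choice of framework; the text above does not prescribe any particular library). It runs the same large matrix multiplication, the kind of dense, embarrassingly parallel workload that deep-learning training is built from, first on the CPU and then, if a CUDA device is available, on the GPU, where the work is spread across thousands of lightweight cores:

```python
import time

import torch

# A large dense matrix multiplication -- representative of the core
# workload inside deep-learning training and inference.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# CPU: the product is computed by a handful of general-purpose cores.
start = time.perf_counter()
c_cpu = a @ b
print(f"CPU matmul: {time.perf_counter() - start:.3f} s")

# GPU: the same operation is dispatched across thousands of cores.
# The actual speedup depends on the device, but it is typically large.
if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()  # wait for the host-to-device copies
    start = time.perf_counter()
    c_gpu = a_gpu @ b_gpu
    torch.cuda.synchronize()  # GPU kernels launch asynchronously
    print(f"GPU matmul: {time.perf_counter() - start:.3f} s")
```

The explicit `torch.cuda.synchronize()` calls matter for honest timing: GPU kernels return control to the host immediately, so without them the measured time would reflect only the kernel launch, not the computation itself.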