Applications for GPU-Based AI and Machine Learning

GPUs Continue to Expand Application Use in Artificial Intelligence and Machine Learning

Artificial intelligence (AI) is set to transform global productivity, working patterns, and lifestyles, and to create enormous wealth. Research firm Gartner expects the global AI economy to grow from about $1.2 trillion in 2018 to about $3.9 trillion by 2022, while McKinsey projects AI will deliver around $13 trillion in global economic activity by 2030. This transformation is fueled by powerful machine learning (ML) tools and techniques such as deep reinforcement learning (DRL), generative adversarial networks (GANs), gradient-boosted tree models (GBMs), natural language processing (NLP), and more.

Much of the success of modern AI and ML systems depends on their ability to process massive amounts of raw data in parallel on task-optimized hardware. In fact, the modern resurgence of AI began with the 2012 ImageNet competition, where deep-learning algorithms demonstrated an eye-popping improvement in image-classification accuracy over their non-deep-learning counterparts. Along with clever programming and mathematical modeling, the use of specialized hardware played a significant role in this early success.
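To make the parallelism point concrete, here is a minimal CPU-side sketch using NumPy. It contrasts processing samples one at a time with a single batched operation over all samples at once; the variable names (`batch`, `weights`) and sizes are illustrative assumptions, not from any specific system. GPUs push the same batched idea much further, with thousands of hardware threads executing in parallel.

```python
import time

import numpy as np

# A toy "dataset": 10,000 samples with 1,024 features each,
# and a weight matrix projecting them to 10 outputs.
rng = np.random.default_rng(0)
batch = rng.standard_normal((10_000, 1_024))
weights = rng.standard_normal((1_024, 10))

# Serial style: handle one sample per loop iteration.
start = time.perf_counter()
serial = np.stack([sample @ weights for sample in batch])
serial_time = time.perf_counter() - start

# Data-parallel style: one batched matrix multiply over all samples.
# This is the access pattern GPU hardware is optimized for.
start = time.perf_counter()
parallel = batch @ weights
parallel_time = time.perf_counter() - start

# Both approaches compute the same result; the batched form
# is typically far faster even on a CPU.
assert np.allclose(serial, parallel)
print(f"serial: {serial_time:.4f}s  batched: {parallel_time:.4f}s")
```

The same restructuring, from per-sample loops to batched tensor operations, is what lets deep-learning frameworks keep GPU hardware saturated during training.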

Are Businesses Doing Enough to Make AI a Game Changer?

There has been no shortage of hype attached to artificial intelligence in recent years, with many a breathless tome predicting that it would transform life as we know it. One could be forgiven for thinking the latest report into AI from consultancy firm Cognizant fits neatly into that box.

After all, it reveals that some 84 percent of business leaders believe AI will be vital to their business in the next few years. Despite this, however, few executives have concrete plans in place to convert this optimism into tangible change.