By bringing AI to people through search engines, Microsoft and Google are accelerating a significant shift in computing, and the hardware and datacenter infrastructure powering the applications may be one indicator of success.
Last week, Microsoft and Google unveiled next-generation AI-powered search engines that can reason, make inferences, and give users more thorough responses to their queries. Much as ChatGPT can provide in-depth answers or compose essays, these search engines will be able to generate comprehensive answers to complex questions.
Microsoft is integrating AI into Bing to answer text queries, while Google has announced plans to integrate AI into its text, image, and video search. The announcements came on consecutive days last week. Both companies recognized that integrating artificial intelligence into search engines requires a robust hardware infrastructure, though neither disclosed details about the hardware powering the AI computation.
Microsoft and Google have spent many years developing the AI hardware behind major announcements like last week's AI search engines.
The two companies' AI computing infrastructures diverge considerably, so the viability of their search engines will be tested by how quickly they respond and how accurately they deliver results.
An insider with knowledge of Google's plans has confirmed that Bard is powered by the TPU (Tensor Processing Unit) chips used in its cloud service. Microsoft said that its AI supercomputer in Azure, which most likely runs on GPUs, can produce results at search latency, on the order of milliseconds.
This sets up a highly visible AI computing competition between Google's TPUs and GPUs from Nvidia, the industry leader that dominates the market.
Teams from all over the world built and powered the equipment and data centers, meticulously coordinating and configuring a complicated group of distributed resources.