Next-Generation Search Sets Up AI Hardware Battle Between Google and Microsoft

By bringing AI to people through search engines, Microsoft and Google are accelerating a significant shift in computing, and the hardware and datacenter infrastructure powering the applications may be one indicator of success.

Last week, Microsoft and Google unveiled next-generation AI-powered search engines that can reason, make predictions, and give users more thorough responses to their queries. Much as ChatGPT can offer in-depth answers or write essays, these search engines will be able to generate comprehensive answers to complex questions.

Microsoft is integrating AI into Bing to answer text queries, while Google has announced plans to bring AI to its text, image, and video search. The announcements came on consecutive days last week. Both companies understood that integrating artificial intelligence into search engines requires a robust hardware infrastructure, but neither shared details about the hardware powering the AI computation.

Microsoft and Google have spent years developing the AI hardware behind major announcements like last week's AI search engines.

The two companies' AI computing infrastructures diverge greatly, so the viability of the search engines will be tested by how quickly they respond and how accurately they deliver results.

A source with knowledge of Google's plans has confirmed that Bard is powered by the TPU (Tensor Processing Unit) chips used in its cloud service. Microsoft said that its AI supercomputer in Azure, which most likely runs on GPUs, can produce results at search latency, on the order of milliseconds.
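As a rough illustration of what measuring "search latency" looks like in practice, here is a minimal sketch of timing per-query inference. The matrix-multiply "model," its sizes, and the query counts are stand-ins for illustration only; neither company has disclosed how its stack is benchmarked.

```python
# Minimal sketch of measuring per-query inference latency, the metric both
# companies are competing on. The "model" here is a stand-in matrix multiply;
# a real search stack would time the full forward pass of its ranking/LLM model.
import time
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((4096, 4096)).astype(np.float32)  # stand-in model

def answer_query(embedding: np.ndarray) -> np.ndarray:
    """Stand-in for one model forward pass on a query embedding."""
    return weights @ embedding

query = rng.standard_normal(4096).astype(np.float32)
answer_query(query)  # warm-up so one-time allocations don't skew the timing

latencies = []
for _ in range(100):
    start = time.perf_counter()
    answer_query(query)
    latencies.append((time.perf_counter() - start) * 1000)  # milliseconds

print(f"p50 latency: {np.percentile(latencies, 50):.2f} ms")
print(f"p99 latency: {np.percentile(latencies, 99):.2f} ms")
```

In production, tail latency (p99) matters as much as the median, since a search engine must return answers at millisecond scale for nearly every query, not just the average one.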

This sets up a highly visible AI computing competition between Google's TPUs and the GPUs of market leader Nvidia.

Teams around the world built and powered the equipment and data centres, meticulously coordinating and setting up a complicated set of distributed resources.

Will AI-based image rendering replace hardware?

Honestly, I did not foresee the rise of AI. I had thought that GHz would always be the unit of measurement for computing power, and hardware rendering has only gotten faster over time. But given that DLSS and comparable applications already use AI, I'm concerned it could eventually replace our GPUs. Excuse my ignorance, but if AI can already place "fake" frames between real frames, could it go further in the future? What if, for example, the GPU only needs to render one frame per second, possibly even less, and AI takes care of the remainder, filling it in at whatever framerate you want?
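For what it's worth, the simplest version of "filling in" frames is a plain linear blend between two rendered frames. The sketch below only illustrates that idea; DLSS 3's frame generation uses motion vectors and a trained network rather than a naive blend, and the function names and frame sizes here are made up for illustration.

```python
# Naive linear interpolation between two rendered frames: the simplest form
# of placing generated frames between real ones. Real frame generation
# (e.g. DLSS 3) uses motion vectors and a neural network, not a plain blend.
import numpy as np

def interpolate_frames(frame_a: np.ndarray, frame_b: np.ndarray,
                       n_between: int) -> list[np.ndarray]:
    """Return n_between blended frames between two real frames."""
    frames = []
    for i in range(1, n_between + 1):
        t = i / (n_between + 1)  # blend weight from frame_a toward frame_b
        blended = (1.0 - t) * frame_a + t * frame_b
        frames.append(blended.astype(frame_a.dtype))
    return frames

# Two stand-in 1080p RGB frames; a real pipeline would take these from the GPU.
a = np.zeros((1080, 1920, 3), dtype=np.float32)
b = np.ones((1080, 1920, 3), dtype=np.float32)
middle = interpolate_frames(a, b, n_between=3)
print(len(middle), middle[0].shape)  # 3 (1080, 1920, 3)
```

A plain blend like this smears moving objects, which is exactly why real frame generation needs motion information and learned models, and why it still leans on dedicated GPU hardware rather than replacing it.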

The thought that AI processing units might one day perform the same functions as GPUs is both intriguing and unsettling.