Google Unveils Advanced AI Supercomputer, Claims Superiority Over Nvidia

Google unveiled its brand new AI supercomputer on Wednesday, touting it as faster and more efficient than its competitors.

The tech giant has been designing and deploying artificial intelligence (AI) chips called Tensor Processing Units (TPUs) since 2016, though Nvidia still dominates the market for AI chips with a share of more than 90%.

The tech giant is no stranger to pioneering AI, having produced some of the field's most important breakthroughs over the past decade. Even so, it has been racing to commercialize its inventions and prove it hasn't squandered its lead.

Google's new supercomputer is built around its fourth-generation TPU chip, the TPU v4, with more than 4,000 of the chips linked together in a single system. The company trained its PaLM model, a direct competitor to OpenAI's GPT models, by splitting the work across two of these machines over roughly 50 days. According to Google, the system is 1.2x–1.7x faster and uses 1.3x–1.9x less power than a comparable system built on Nvidia's A100 chip.

Google did not compare the TPU v4 against Nvidia's newer H100 chip, saying the H100 came to market after its own chip and was built with more recent manufacturing technology. Nvidia CEO Jensen Huang, for his part, pointed to MLPerf 3.0 benchmark results showing the H100 running significantly faster than its predecessor, the A100.

The large amount of computing power that AI models require is expensive, which has pushed many in the industry to develop new chips, components, and software that reduce how much power is needed. That demand also benefits cloud providers such as Google, Microsoft, and Amazon, which rent out computer processing by the hour and offer credits and computing time to startups.

Google's TPUs are proof that the tech giant remains a major player in AI, since they let the company train models quickly and efficiently. Whatever the future of AI may hold, Google's TPUs look set to be part of it.