Google uses its own deep-learning chip for artificial intelligence


Google has announced that it is developing not only machine learning algorithms, but also the associated hardware: so-called tensor processing units, or TPUs. Google uses the TPUs for frequently used functions, from Street View to AlphaGo.

Google started developing the TPUs, essentially custom ASICs, a few years ago to see whether it could find a way to make machine learning faster. The TPUs were developed for Google's own machine learning framework TensorFlow, which the company recently made open source.

According to Google, the performance per watt of a TPU is many times greater than what can be achieved with other systems. The company compares its performance to "technology from about seven years in the future, or three generations of Moore's Law." The company has also thought about the practical deployment of the boards: they fit into the hard drive slots of its data center racks.

The reason the chips are faster than many other ASICs is that they have a higher tolerance for reduced computational precision, which means fewer transistors are needed per operation. As a result, more operations can be performed per second, allowing machine learning models to run faster.
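To illustrate what reduced-precision computation means in practice, here is a minimal sketch of quantized inference using a simple symmetric int8 scheme in NumPy. This is an illustrative assumption, not the TPU's actual (unpublished) quantization scheme; the function names and the per-tensor scaling are inventions for this example.

```python
import numpy as np

np.random.seed(0)

def quantize(x, scale):
    # Map float32 values into the signed 8-bit range [-127, 127].
    # (Illustrative symmetric quantization; not Google's actual scheme.)
    return np.clip(np.round(x / scale), -127, 127).astype(np.int8)

# Full-precision weights and an input vector.
w = np.random.randn(4, 4).astype(np.float32)
x = np.random.randn(4).astype(np.float32)

# Per-tensor scales: the largest absolute value maps to 127.
w_scale = np.abs(w).max() / 127.0
x_scale = np.abs(x).max() / 127.0

qw = quantize(w, w_scale)
qx = quantize(x, x_scale)

# Multiply in integer arithmetic (accumulating in int32, as low-precision
# hardware typically does), then rescale the result back to float.
y_int32 = qw.astype(np.int32) @ qx.astype(np.int32)
y_approx = y_int32.astype(np.float32) * (w_scale * x_scale)

# Compare against the full-precision result: the answers agree closely,
# even though the multiply used 8-bit operands.
y_exact = w @ x
print(np.max(np.abs(y_exact - y_approx)))
```

The point of the trade-off is visible here: each 8-bit multiply needs far fewer transistors than a 32-bit floating-point multiply, and for neural network workloads the small rounding error rarely changes the model's output.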

The TPUs are already part of various machine learning systems at Google, such as RankBrain for search results and Street View for improving the quality of maps and navigation. They are also part of AlphaGo, the system with which Google defeated the human world champion in the board game Go. Through Google's Cloud Machine Learning platform, developers outside of Google can also use the computing power of the TPUs.
