Custom Tensor Processing Unit chip revealed as secret to Google's AI capabilities

According to Google, this technology is seven years ahead of anything else, roughly three chip generations as measured by Moore's Law


Google has designed and deployed a custom chip to drive its machine learning technologies. The chips are purpose-built to work with TensorFlow, Google's machine learning platform. The technology was developed in secret and is already in use for Street View, Inbox Smart Reply and RankBrain, the system behind delivering more relevant search results. Each chip slides into a hard drive slot in a server rack.
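To give a sense of the workload involved, here is a minimal, hypothetical TensorFlow snippet (1.x-era API) that builds and runs a small matrix multiplication, the sort of dense tensor arithmetic such accelerators execute; the values are arbitrary and nothing here is specific to TPU hardware.

    import tensorflow as tf

    # Build a tiny graph: a dense matrix multiply, the kind of tensor
    # arithmetic TensorFlow models are composed of (values are arbitrary).
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    b = tf.constant([[5.0, 6.0], [7.0, 8.0]])
    product = tf.matmul(a, b)

    # Run the graph on whatever device is available (CPU, GPU, or, inside
    # Google's own data centres, a TPU).
    with tf.Session() as sess:
        print(sess.run(product))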

The chips are named Tensor Processing Units (TPUs). Google says they deliver an order of magnitude better performance per watt of electricity than any other product on the market, putting the technology roughly seven years ahead of anything else, or three generations by Moore's Law. A server rack fitted with TPUs was used by DeepMind to defeat a professional Go master. Go has traditionally been a challenge for machine learning because of the enormous number of possible moves and the intuitive, pattern-based play of strong human players.

The TPUs in a server rack used against Go champion Lee Sedol

Developers and customers will be able to use these industry-leading chips through software such as TensorFlow. Google recently added distributed computing support to TensorFlow. Google will need this kind of disruptive technology to deliver personalised services in natural language to all of its users. Google Home, powered by Google Assistant, is an example of a newly announced product that will use both artificial intelligence and natural language processing.
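As a rough illustration of that distributed support, the sketch below starts a single in-process TensorFlow server and pins an operation to it; the cluster layout, job name and port are placeholder assumptions, not a description of Google's own infrastructure.

    import tensorflow as tf

    # Describe a one-worker cluster and start an in-process server for it.
    # The host, port, job name and task index are placeholder assumptions.
    cluster = tf.train.ClusterSpec({"worker": ["localhost:2222"]})
    server = tf.train.Server(cluster, job_name="worker", task_index=0)

    # Pin an operation to a specific task in the cluster.
    with tf.device("/job:worker/task:0"):
        a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
        result = tf.matmul(a, a)

    # A session pointed at the server executes the pinned operation there.
    with tf.Session(server.target) as sess:
        print(sess.run(result))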
