It comes as no surprise that Google has developed its own Tensor Processing Unit (TPU) chips, designed to accelerate machine learning workloads.
The main motive behind designing TPU chips
Around four years ago, Google faced a scaling problem. For instance, if every user were to use the voice recognition feature for just three minutes a day, the company would need to double its data centers to handle the machine learning workload behind those services.
The company claims that these TPU chips are much faster than conventional CPUs and GPUs.
Google introduced the TPU back in May 2016 at its I/O developer conference, but disclosed little about the chip's performance at the time. Now the company has published detailed specifications and benchmarks for the TPU.
The speed and efficiency of TPUs:
Readers keen to learn the full design details can consult Google's official paper, published on Wednesday, which also reports the performance the company has measured against conventional CPUs and GPUs.
The TPU was built to accelerate the inference stage of deep neural networks. In Google's tests, the chips ran machine learning inference tasks 15 to 30 times faster than CPUs and GPUs; specifically, the TPU was compared against Intel's Haswell Xeon E5-2699 v3 processors and Nvidia's Tesla K80 GPUs. Measured in performance per watt, the TPU came out 25 to 80 times better than those CPUs and GPUs.
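Inference is essentially a forward pass through a trained network: layers of matrix multiplications followed by simple nonlinearities, which is exactly the arithmetic a TPU accelerates. As a rough illustration only (this is not Google's code, and the two-layer network and its dimensions are hypothetical), a forward pass can be sketched in plain NumPy:

```python
import numpy as np

def relu(x):
    # Rectified linear unit, a common activation in deep networks.
    return np.maximum(x, 0.0)

def forward(x, w1, b1, w2, b2):
    # Inference (the forward pass) is dominated by matrix multiplies,
    # the operation TPU hardware is built to speed up.
    h = relu(x @ w1 + b1)    # hidden layer
    logits = h @ w2 + b2     # output layer
    return logits

# Hypothetical toy dimensions: 4 inputs, 8 hidden units, 3 outputs.
rng = np.random.default_rng(0)
w1 = rng.standard_normal((4, 8)); b1 = np.zeros(8)
w2 = rng.standard_normal((8, 3)); b2 = np.zeros(3)
x = rng.standard_normal((1, 4))

print(forward(x, w1, b1, w2, b2).shape)  # (1, 3)
```

At data-center scale these multiplies dominate both latency and power draw, which is why a chip specialized for them pays off in performance per watt.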
This is significant because power consumption and efficiency have a major impact on data center costs.
Google has used TPU chips since 2015, and they have consistently delivered good results for applications such as language translation, voice recognition, and image recognition.
These applications are based on neural networks, and each requires only 100 to 1,500 lines of code written in TensorFlow, Google's open source machine learning framework.
Thanks to their speed and power efficiency, TPU chips can return predictions within a fraction of a second, and they play a major role even in search-related tasks.
Google is not the only company, however, looking to dedicated hardware for its machine learning algorithms.