
Forget the CPU, GPU, and FPGA: Google says its Tensor Processing Unit, or TPU, advances machine learning capability by the equivalent of three chip generations.

"This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore's Law)," the blog said.

"Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models, and apply these models more quickly, so users get more intelligent results more rapidly."

The tiny TPU can fit into a hard drive slot within the data center rack and has already been powering RankBrain and Street View, the blog said.

Analyst Patrick Moorhead of Moor Insights & Strategy, who attended the I/O developer conference, said that from what little Google has revealed about the TPU, he doesn't think the company is about to abandon traditional CPUs and GPUs just yet.

He likened the TPU's advantage to the difference between decoding an H.265 video stream on a general-purpose CPU and on an ASIC built specifically for that task.
