The result was the tensor processing unit, or TPU. It was designed to process the tensors, the mathematical objects that underpinned a neural network. The trick was that its calculations were less precise than those of typical processors.[20] The number of calculations made by a neural network was so vast that no single calculation had to be exact. It dealt in integers rather than floating-point numbers: rather than multiply 13.646 by 45.828, the TPU lopped off the decimal points and just multiplied 13 by 45. That meant it could perform trillions of extra calculations each second, exactly what Dean and his …
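
A minimal Python sketch of the trade-off the passage describes, using the passage's own numbers (13.646 and 45.828); everything else here is illustrative, and real TPU arithmetic uses scaled low-precision formats rather than literal truncation:

    a, b = 13.646, 45.828

    exact = a * b              # full-precision floating-point product
    approx = int(a) * int(b)   # "lop off the decimal points": 13 * 45

    print(f"exact  = {exact:.3f}")                               # 625.369
    print(f"approx = {approx}")                                  # 585
    print(f"relative error = {abs(exact - approx) / exact:.1%}") # ~6.5%

The integer product is off by a few percent, but across the billions of multiply-accumulates in a neural network such errors tend to wash out, which is why trading precision for throughput pays off.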
Genius Makers: The Mavericks Who Brought A.I. to Google, Facebook, and the World