
Google announces that Cloud TPU is now available in beta on its GCP

Ruslan Bragin
What's New
16th Feb, 2018

On Monday, Google officially announced that Cloud TPUs (Tensor Processing Units) are now available in beta on Google Cloud Platform (GCP). These TPUs are purpose-built to accelerate and scale machine learning workloads programmed with TensorFlow.

The TPUs help train machine learning models in hours rather than days or weeks. Google first revealed its work on TPUs a few years ago, but has only now made them available to its cloud customers.

"We designed Cloud TPUs to deliver differentiated performance per dollar for targeted TensorFlow workloads and to enable ML engineers and researchers to iterate more quickly," Google wrote in a Cloud Platform blog.

Each Cloud TPU is built from application-specific integrated circuits (ASICs) and packs up to 180 teraflops of floating-point performance and 64 GB of high-bandwidth memory onto a single board. These boards can be used on their own or connected together over a dedicated high-speed network to form multi-petaflop machine learning supercomputers that Google calls "TPU pods".
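As a back-of-the-envelope check on the "multi-petaflop" claim, the per-board figures above can be scaled up to a pod. The 64-board pod size used below is an assumption based on figures Google has described elsewhere, not something stated in this announcement.

# Rough aggregate figures for a TPU pod.
boards_per_pod = 64       # assumption, not from the announcement
tflops_per_board = 180    # from the announcement
hbm_gb_per_board = 64     # from the announcement

pod_petaflops = boards_per_pod * tflops_per_board / 1000
pod_hbm_tb = boards_per_pod * hbm_gb_per_board / 1024
print(f"~{pod_petaflops:.1f} PFLOPS and ~{pod_hbm_tb:.0f} TB of HBM per pod")
# -> ~11.5 PFLOPS and ~4 TB of HBM per pod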

Rather than sharing a cluster, data scientists get access to a network-attached Cloud TPU from a Google Compute Engine VM, which they can customize and control to match the needs of their workload. "Cloud TPUs are available in limited quantities today and usage is billed by the second at the rate of $6.50 USD / Cloud TPU / hour," Google said in a blog post.
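Per-second billing at that hourly rate works out as follows; the 90-minute training run is a made-up example rather than a benchmark from Google's post.

# Per-second billing at the announced rate of $6.50 per Cloud TPU per hour.
rate_per_hour = 6.50
rate_per_second = rate_per_hour / 3600   # roughly $0.0018 per second

run_seconds = 90 * 60                    # hypothetical 90-minute training run
cost = run_seconds * rate_per_second
print(f"${cost:.2f}")                    # -> $9.75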

In addition, Google announced that GPUs are now available in beta in the latest release of Kubernetes Engine, speeding up compute-intensive applications such as financial modeling, image processing, and machine learning.

Source: Google Cloud Platform Blog


Ruslan Bragin

Author
Ruslan is passionate about developing data and machine learning solutions. He is currently working on projects related to IoT.
