Expanding choices for PowerAI developers with TensorFlow

26 January 2017
Michael Gschwind
IBM

Watson, Siri, the Google Assistant and Amazon Alexa have inspired worldwide interest in deep learning and the promise it holds for artificial intelligence (AI) development. But today, deploying this technology can be costly and time-consuming, often requiring an in-house deep learning development group staffed with PhDs.

That’s why IBM created PowerAI, the world’s leading enterprise distribution and support for open-source machine and deep learning frameworks used to build cognitive applications. IBM PowerAI provides a curated, tested and pre-compiled binary software distribution that enables enterprises to quickly and easily deploy deep learning for their data science and analytics application development.

To make sure our clients have the widest selection of deep learning frameworks, we just published an update to PowerAI. This release adds support for the TensorFlow 0.12 deep learning framework, originally released by Google.

PowerAI with TensorFlow and a lot more

The latest version of PowerAI, including TensorFlow, has been optimized and validated on GPU-accelerated Linux on Power platforms running Ubuntu 16.04 and the NVIDIA CUDA 8 software platform. Our optimizations take particular advantage of NVIDIA Tesla P100 GPUs connected to POWER8 processors over the NVIDIA NVLink interconnect on the IBM Power Systems S822LC for HPC.
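
To confirm that a PowerAI TensorFlow installation is actually exercising the GPUs, a quick device-placement check is usually enough. The following is a minimal sketch against the TensorFlow 0.12 Python API rather than anything PowerAI-specific; the '/gpu:0' device string and the placement log are standard TensorFlow behavior.

```python
# Minimal sketch: verify that TensorFlow places work on the GPU.
# Assumes the PowerAI TensorFlow 0.12 package is installed and a
# CUDA-capable GPU (e.g. a Tesla P100) is visible to the runtime.
import tensorflow as tf

# Pin a small matrix multiplication to the first GPU.
with tf.device('/gpu:0'):
    a = tf.constant([[1.0, 2.0], [3.0, 4.0]], name='a')
    b = tf.constant([[1.0, 1.0], [0.0, 1.0]], name='b')
    c = tf.matmul(a, b)

# log_device_placement=True prints the device chosen for each op,
# so the output shows whether the matmul really ran on /gpu:0.
with tf.Session(config=tf.ConfigProto(log_device_placement=True)) as sess:
    print(sess.run(c))
```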

This updated release also incorporates Google’s Bazel build tool, which is used in the development of many advanced TensorFlow models.

Running the Inception v3 model, a popular image recognition network, TensorFlow on the IBM Power Systems S822LC for HPC with four NVLink-attached Tesla P100 GPUs shows a 30 percent performance advantage over a system with four Tesla P100 GPUs attached through conventional PCIe.

World’s most popular deep learning framework, faster

TensorFlow is the most popular deep learning framework on GitHub, and it has been embraced around the world by deep learning users in organizations of every kind. With PowerAI, TensorFlow becomes straightforward to deploy in the enterprise.

Rajat Monga, a Google engineering leader for TensorFlow, says: “TensorFlow is quickly becoming a viable option for companies interested in deploying deep learning for tasks ranging from computer vision, to speech recognition, to text analytics. IBM’s enterprise offering of TensorFlow will help organizations deploy this framework — we’re glad to see this support.”

How to get started

  • On existing hardware or a new system

We encourage you to try IBM PowerAI and develop your deep learning applications on Power. PowerAI is available as an easy-to-install binary for fast, efficient deployment with regular validated updates.

  • In the cloud

If you are just beginning your deep learning journey, you can now accelerate deep learning with the power of IBM POWER8 and NVIDIA GPUs in the cloud. The most recent release of PowerAI is available in the Nimbix Cloud; whichever route you take, the short sanity check sketched below will confirm the installation is working.
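
The following is a minimal post-install sanity check rather than part of PowerAI itself. It assumes the bundled TensorFlow package is on your Python path and that the device_lib helper is available in this release (its module location has varied across TensorFlow versions).

```python
# Minimal sketch: confirm the TensorFlow build and list visible devices.
# Each NVLink-attached Tesla P100 should show up as its own '/gpu:N' entry.
import tensorflow as tf
from tensorflow.python.client import device_lib  # assumed available in this release

print('TensorFlow version: %s' % tf.__version__)

for device in device_lib.list_local_devices():
    print('%s (%s)' % (device.name, device.device_type))
```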

The entire PowerAI team looks forward to your feedback. Share in the comments how you are unleashing the power of deep learning to transform the future of computing.
