This document is a practical guide to porting TensorFlow 1 models to the Poplar SDK for running on the IPU. It assumes familiarity with the user guide Targeting the IPU from TensorFlow 1, which serves as the primary introduction to developing TensorFlow models for the IPU. That guide provides a conceptual introduction to developing models at the framework level and details a number of aspects of the TensorFlow-to-Poplar API that are pivotal to running models on the IPU; it covers the topics that arise in this porting guide in greater depth.
This porting guide, in turn, focuses on the practical considerations of developing a model for the IPU and provides guidance on best practices. It identifies the key elements that will help you transition to using TensorFlow on the IPU.
This document applies to the Graphcore port of TensorFlow 1.15.
The scope of this document includes:
How to approach porting in general and which questions to ask up front
Code examples that highlight IPU-specific API functions
Preliminaries such as Bash environment setup and Python import statements
The role of infeeds and outfeeds in boosting computational throughput
Profile report generation to help you identify compute or memory inefficiencies
The IPUEstimator, a TensorFlow abstraction that facilitates session handling
A working knowledge of the above elements will allow you to take your first steps in transitioning to the IPU/Poplar/TensorFlow compute stack.
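To give a flavour of these elements, the sketch below shows a minimal IPU program in the style described by the user guide. It is illustrative only: it assumes the Poplar SDK's TensorFlow 1.15 wheel is installed and the SDK environment is enabled, the model function is a placeholder, and the exact configuration calls (here `IPUConfig` and `ipu_compiler.compile`, as used in SDK 2.x) are covered in detail in the user guide.

```python
# Minimal sketch of running a computation on one IPU with the Graphcore
# port of TensorFlow 1.15. Assumes the Poplar SDK is installed and
# enabled in the Bash environment; see the user guide for details.
import tensorflow as tf
from tensorflow.python import ipu

# Configure the IPU system: request one automatically selected IPU.
cfg = ipu.config.IPUConfig()
cfg.auto_select_ipus = 1
cfg.configure_ipu_system()

def model(x):
    # Placeholder computation standing in for a real model.
    return x * x

# Build the graph for the IPU device and compile it with XLA.
with ipu.scopes.ipu_scope("/device:IPU:0"):
    x = tf.placeholder(tf.float32, shape=[4])
    result = ipu.ipu_compiler.compile(model, inputs=[x])

with tf.Session() as sess:
    print(sess.run(result, feed_dict={x: [1.0, 2.0, 3.0, 4.0]}))
```

Later sections build on this pattern, replacing the plain placeholder feed with infeed and outfeed queues and, optionally, wrapping session handling in the IPUEstimator.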
Tutorials on how to port an existing TensorFlow 1 model to the IPU and speed up computation using loops and data pipelines are available in the Graphcore tutorials repository on GitHub: https://github.com/graphcore/tutorials/tree/sdk-release-2.6/tutorials/tensorflow1/basics.
An example application that demonstrates the use of IPUs to train CNNs, including ResNet, ResNeXt and EfficientNet, is available in the Graphcore examples repository on GitHub: https://github.com/graphcore/examples/tree/master/vision/cnns/tensorflow1/training