Examples and Tutorials

Overview

Tutorials repository

Simple example programs and tutorials are available in the Graphcore GitHub tutorials repository. The repository includes:

  • Tutorials to help you get started using the Poplar SDK and Graphcore tools to run code on the IPU.

  • Feature examples: small code examples showing you how to use various software features when developing for IPUs.

  • Simple application examples: basic applications written in different frameworks targeting the IPU.

  • Kernel benchmarks: code for benchmarking the performance of some selected types of neural network layers on the IPU, using TensorFlow or our PopART framework.

The tutorials repository also contains code used in technical notes, videos and blogs.

Examples repository

Sample applications are available in the Graphcore GitHub examples repository.

TensorFlow 1

The following examples are provided for TensorFlow 1. These include full models as well as examples of how to measure performance, use multiple IPUs and implement custom ops in Poplar, among other things.

These examples are tested against TensorFlow 1. Some of them will run with TensorFlow 2 without any changes.

Feature examples

These include examples of how to measure performance, use multiple IPUs and implement custom ops in Poplar.

Performance

  • Inspecting tensors

    This example trains simple pipelined and non-pipelined models on the MNIST numeral data set and shows how tensors (containing activations and gradients) can be returned to the host for inspection using outfeed queues. This can be useful for debugging a model.

  • I/O benchmarking

    The MNIST simple application shows how to use dataset_benchmark() to determine the maximum achievable throughput for a given dataset.
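The dataset_benchmark() utility belongs to Graphcore's port of TensorFlow, so its exact signature is best checked in the Graphcore TensorFlow API documentation. The measurement it performs can, however, be sketched framework-agnostically: iterate the input pipeline on its own, with no model attached, and time it. Below is a minimal pure-Python sketch of that idea; the generator is a stand-in for a real input pipeline, not the MNIST one.

```python
import time

def benchmark_dataset(dataset, num_elements):
    """Time iteration over `dataset` alone (no model attached) to
    estimate the maximum throughput the input pipeline could feed."""
    it = iter(dataset)
    start = time.perf_counter()
    for _ in range(num_elements):
        next(it)
    elapsed = time.perf_counter() - start
    return num_elements / elapsed  # elements per second

# Stand-in dataset: an endless generator of dummy "batches".
def dummy_batches():
    while True:
        yield [0] * 1024

throughput = benchmark_dataset(dummy_batches(), 10_000)
print(f"{throughput:.0f} batches/sec")
```

If the throughput measured this way is lower than the model's compute throughput, the input pipeline is the bottleneck, which is exactly the situation dataset_benchmark() is designed to reveal.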

Using multiple IPUs

Simple examples demonstrating and explaining different ways of using multiple IPUs are provided.

Custom ops

  • Custom op example

    Code that demonstrates how to define your own custom op using Poplar and PopLibs and use it in TensorFlow 1.

  • Custom op example with gradient

    Code that demonstrates how to define your own custom op using Poplar and PopLibs, and use it in TensorFlow 1. Also shows how to define the gradient of your custom op so that you can use automatic differentiation and operations that depend on it, such as the minimize() method of an optimizer.

Other examples

  • IPUEstimator

    Example of using the IPU implementation of the TensorFlow Estimator API.

  • Configuring IPU connections

    A code example which demonstrates how to use set_ipu_connection_type() to control if and when the IPU device is acquired.

Simple application examples

Kernel benchmarks

  • Kernel benchmarking

    Code for benchmarking the performance of some selected neural network layers.

Example applications

TensorFlow 2

Feature examples

  • CIFAR-10 with IPUEstimator

    This example shows how to train a model to classify images from the CIFAR-10 dataset using the IPU implementation of the TensorFlow Estimator API.

  • IMDB Sentiment Prediction

    These examples train an IPU model with an embedding layer and an LSTM to predict the sentiment of an IMDB review.

  • Inspecting tensors using custom outfeed layers and a custom optimizer

    This example trains a choice of simple fully connected models on the MNIST numeral dataset and shows how tensors (containing activations and gradients) can be returned to the host via outfeeds for inspection.

  • PopDist training example

    This shows how to enable distributed training of a TensorFlow 2 model by using the PopDist API.

  • Re-computation Checkpoints

    This example demonstrates how checkpointing of intermediate values can reduce peak live memory, using a simple Keras LSTM model.
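The example above uses Graphcore's Keras recomputation support; the trade-off it exploits can be illustrated in plain Python, independent of any framework (the functions below are illustrative, not part of any Graphcore API): store only every k-th activation during the forward pass, and recompute the others from the nearest stored checkpoint when they are needed again.

```python
def forward_with_checkpoints(layers, x, k):
    """Run a chain of layers, storing only every k-th activation."""
    checkpoints = {0: x}
    for i, f in enumerate(layers):
        x = f(x)
        if (i + 1) % k == 0:
            checkpoints[i + 1] = x
    return x, checkpoints

def activation_at(layers, checkpoints, i):
    """Recompute the input to layer i from the nearest stored checkpoint."""
    j = max(c for c in checkpoints if c <= i)
    x = checkpoints[j]
    for f in layers[j:i]:
        x = f(x)
    return x

# Toy "layers": layer a adds the constant a to its input.
layers = [lambda x, a=a: x + a for a in range(8)]
out, ckpts = forward_with_checkpoints(layers, 0, k=4)
# Input to layer 5 is 0+1+2+3+4 = 10, recomputed from checkpoint 4.
assert activation_at(layers, ckpts, 5) == 10
```

With k = 4 the forward pass stores roughly n/k activations instead of n, at the cost of up to k − 1 extra forward steps each time a missing activation is needed, which is the memory/compute trade that recomputation makes.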

Simple application examples

Example applications

  • Adversarial Generalized Method of Moments

    This example is an implementation of Adversarial Generalized Method of Moments, an approach for solving statistical problems based on generative adversarial networks with a wide variety of applications.

  • Graph Neural Network Example

    This example uses the Spektral GNN library to predict the heat capacity of various molecules in the QM9 dataset.

PyTorch

Feature examples

  • Octave Convolutions

    Octave convolutions are a novel type of convolutional layer. This example shows how to train a model that uses them and how to run it for inference.

  • Using a custom op in a PyTorch model

    This example describes how to use a custom op created in Poplar within a PopTorch model. It does not describe the creation of a custom op with Poplar in C++, only the use of the op within a PyTorch model.

  • PopDist training example

    This shows how to enable distributed training of a PyTorch model by using the PopDist API.

Simple application examples

Complete examples of models

Pre-trained models

  • Hugging Face BERT

    A pre-trained BERT model made available by Hugging Face, implemented in PyTorch. This example runs one of the pre-trained BERT models on an IPU for inference.

Other examples

  • PopART MNIST

    An example of how to export a PyTorch model as an ONNX file and reuse this file with Graphcore’s PopART.

Example applications

PopART / ONNX

There are several examples demonstrating how to use the Poplar Advanced Runtime (PopART). These include full models as well as examples of how to use multiple IPUs, implement custom ops in Poplar and other key features provided by PopART.

Feature examples

Multi-IPU examples

Custom operators

  • Custom Operators

    This directory contains two example implementations of custom operators for PopART (Cube and LeakyReLU). Both examples create an operation definition with forward and backward parts, and include a simple inference script to demonstrate using the operators.
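The actual implementations in that example are Poplar C++ codelets; numerically, though, the forward and backward parts of the LeakyReLU operator can be sketched in plain Python as below. The leak coefficient used here is an assumption for illustration, not necessarily the example's default.

```python
ALPHA = 0.01  # leak coefficient; illustrative value, not the example's default

def leaky_relu_forward(xs):
    """Forward part: y = x for x > 0, alpha * x otherwise."""
    return [x if x > 0 else ALPHA * x for x in xs]

def leaky_relu_backward(xs, grad_out):
    """Backward part: dL/dx = dL/dy * (1 if x > 0 else alpha)."""
    return [g * (1.0 if x > 0 else ALPHA) for x, g in zip(xs, grad_out)]

xs = [-2.0, 0.5, 3.0]
ys = leaky_relu_forward(xs)
grads = leaky_relu_backward(xs, [1.0, 1.0, 1.0])
```

Defining both parts is what lets the framework's automatic differentiation chain through the custom op during training, just as with the TensorFlow custom-op-with-gradient example above.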

Other examples

  • Data Callbacks

    This example creates a simple computation graph and uses callbacks to feed data and retrieve the results. Time between host-device transfer and receipt of the result on the host is computed and displayed for a range of different data sizes.

  • Automatic and Manual Recomputing

    This example shows how to use manual and automatic recomputation in PopART with a seven-layer DNN and generated data.
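In the Data Callbacks example above, the callbacks are registered with PopART's session I/O; the host-side timing pattern it describes can be sketched generically as below, where run_session is a stand-in for the PopART runtime and the identity step stands in for the compiled graph.

```python
import time

def run_session(step, input_cb, output_cb, steps):
    """Stand-in runtime: pull a buffer from the host via one callback,
    run the graph, and push the result back via another callback."""
    for _ in range(steps):
        data = input_cb()
        output_cb(step(data))

for size in (1_000, 100_000):
    sent_at = []
    latencies = []

    def input_cb():
        sent_at.append(time.perf_counter())  # timestamp at submission
        return bytes(size)

    def output_cb(result):
        # timestamp at receipt; the difference is the round-trip latency
        latencies.append(time.perf_counter() - sent_at[-1])

    run_session(lambda d: d, input_cb, output_cb, steps=3)
    print(f"{size} bytes: best of 3 = {min(latencies) * 1e6:.1f} us")
```

Sweeping the payload size, as the real example does, separates fixed per-transfer overhead from the bandwidth-dependent component of the latency.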

Simple application examples

  • Simple MNIST Examples

    Contains two simple models trained on the MNIST dataset: one linear and one using convolutions.

Kernel benchmarks

  • Kernel Synthetic Benchmarks

    Contains synthetic benchmarks for models with two types of layer (LSTM and 2D convolution), run with synthetic data for both training and inference.

Example applications

Poplar and PopLibs

Tutorials and examples that show how to use the Poplar and PopLibs libraries that underlie the other frameworks. Particularly useful if you need to extend the existing frameworks by writing a custom operation, for example.

Feature examples

  • Advanced example

    This example performs a simple computation but demonstrates some of the advanced features of Poplar (for example, using dedicated programs for I/O, using PopLibs, saving and reusing executables, and choosing the number of IPUs). It is a good follow-up to the Poplar tutorials.

  • Prefetch

    This example shows how to implement prefetching of the input stream of a Poplar program using a callback, in order to maximise I/O performance.
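Poplar's actual prefetch hook is a C++ callback attached to a data stream; the underlying idea, producing the next inputs on a background thread into a bounded buffer while the device consumes the current ones, can be sketched in Python:

```python
import queue
import threading

def prefetch(iterable, buffer_size=4):
    """Yield the items of `iterable`, producing them on a background
    thread into a bounded buffer so production overlaps consumption."""
    q = queue.Queue(maxsize=buffer_size)
    done = object()  # sentinel marking the end of the stream

    def producer():
        for item in iterable:
            q.put(item)  # blocks when the buffer is full
        q.put(done)

    threading.Thread(target=producer, daemon=True).start()
    while True:
        item = q.get()
        if item is done:
            return
        yield item

# The consumer sees the same elements, but each next item is already
# being prepared while the current one is processed.
result = list(prefetch(range(10)))
```

The bounded buffer is the key design choice: it hides input latency without letting the producer run arbitrarily far ahead of the consumer, which is the same role the callback-filled stream buffer plays in the Poplar example.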

Simple application examples

Example applications