# Examples and Tutorials

Example programs and tutorials are available in the Graphcore GitHub repository.

## Simple examples

These are code examples that demonstrate some of the features of each of the frameworks on the IPU.

### TensorFlow 2

There are several examples showing how to use TensorFlow 2 on the IPU:

Adversarial Generalized Method of Moments: This example is an implementation of Adversarial Generalized Method of Moments, an approach for solving statistical problems based on generative adversarial networks with a wide variety of applications.

CIFAR-10 with IPUEstimator: This example shows how to train a model to sort images from the CIFAR-10 dataset using the IPU implementation of the TensorFlow Estimator API.

Graph Neural Network Example: This example uses the Spektral GNN library to predict the heat capacity of various molecules in the QM9 dataset.

IMDB Sentiment Prediction: These examples train an IPU model with an embedding layer and an LSTM to predict the sentiment of an IMDB review.

Inspecting tensors using custom outfeed layers and a custom optimizer: This example trains a choice of simple fully connected models on the MNIST numeral dataset and shows how tensors (containing activations and gradients) can be returned to the host via outfeeds for inspection.

Simple MNIST training example: This example trains a simple 2-layer fully connected model on the MNIST numeral dataset.

Shakespeare corpus reader: This example learns to predict the next character in the corpus of William Shakespeare.
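The repository example uses a recurrent network, but the core task of next-character prediction can be illustrated with a much simpler bigram frequency model in plain Python. This is a conceptual sketch, not the example's code:

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """Count, for each character, which characters follow it."""
    follow = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        follow[a][b] += 1
    return follow

def predict_next(follow, char):
    """Return the most frequent successor of `char` seen in training."""
    if char not in follow:
        return None
    return follow[char].most_common(1)[0][0]

corpus = "to be or not to be"
model = train_bigram(corpus)
print(predict_next(model, "t"))  # 'o', since 'o' follows 't' most often here
```

A neural model generalises this idea: instead of a lookup table of counts, it learns a parametric distribution over the next character conditioned on a longer context.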

### TensorFlow 1

The following examples are provided for TensorFlow 1. These include full models as well as examples of how to measure performance, use multiple IPUs and implement custom ops in Poplar, among other things.

These examples are tested against TensorFlow 1. Some of them will run with TensorFlow 2 without any changes.

#### Complete models

A number of complete examples of models implemented on the IPU are available.

Classifying hand-written digits: The MNIST dataset is a well-known example of a basic machine learning task. This is an example of its implementation on IPUs. This example also shows how to use `ipu.dataset_benchmark` to determine the maximum achievable throughput for a given dataset.

UNet Industrial: This is an image segmentation model for industrial use cases.

Markov chain Monte Carlo methods: These are well-known techniques for solving integration and optimisation problems in high-dimensional spaces. Their applications include algorithmic trading and computational biology. This example uses the TensorFlow Probability library.
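To make the underlying technique concrete, here is a minimal random-walk Metropolis sampler targeting a standard normal distribution, written in plain Python. This is an illustration of the MCMC idea only, not the TensorFlow Probability implementation used in the example:

```python
import math
import random

def metropolis(log_prob, steps, start=0.0, scale=1.0, seed=0):
    """Random-walk Metropolis: propose x + N(0, scale), accept with
    probability min(1, p(proposal) / p(current))."""
    rng = random.Random(seed)
    x = start
    samples = []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, scale)
        if math.log(rng.random()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples.append(x)
    return samples

# Target: standard normal density, up to an additive constant in log space.
samples = metropolis(lambda x: -0.5 * x * x, steps=20000)
mean = sum(samples) / len(samples)
```

The empirical mean and variance of the chain approach those of the target (0 and 1) as the number of steps grows, which is the property such methods exploit for integration in high dimensions.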

CosmoFlow: This is a deep learning model for calculating cosmological parameters. The model primarily consists of 3D convolutions, pooling operations, and dense layers.

#### Performance and profiling

Kernel benchmarking: Code for benchmarking the performance of some selected neural network layers.

I/O benchmarking: The MNIST example shows how to use `ipu.dataset_benchmark` to determine the maximum achievable throughput for a given dataset.

Profiling: Code demonstrating how to generate text-based reports on the performance of your model.

#### Using multiple IPUs

Simple examples demonstrating and explaining different ways of using multiple IPUs are provided.

Pipelining and replication are used to parallelise and speed up training, whereas sharding is generally used to simply fit a model in memory.
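As a plain-Python sketch of that distinction (the real mechanisms are configured through the TensorFlow IPU API; this only illustrates how the work is divided):

```python
def shard(layers, num_devices):
    """Sharding: split the layer list into contiguous groups, one per
    device, so a model too large for one IPU fits across several."""
    per_device = -(-len(layers) // num_devices)  # ceiling division
    return [layers[i:i + per_device]
            for i in range(0, len(layers), per_device)]

def pipeline_schedule(num_stages, num_microbatches):
    """Pipelining: at step t, stage s processes micro-batch t - s, so all
    stages work in parallel once the pipeline is full (None = idle)."""
    total_steps = num_stages + num_microbatches - 1
    return [[t - s if 0 <= t - s < num_microbatches else None
             for s in range(num_stages)]
            for t in range(total_steps)]

layers = ["conv1", "conv2", "conv3", "fc1", "fc2", "fc3"]
print(shard(layers, 2))         # [['conv1', 'conv2', 'conv3'], ['fc1', 'fc2', 'fc3']]
print(pipeline_schedule(2, 3))  # [[0, None], [1, 0], [2, 1], [None, 2]]
```

The schedule shows why pipelining speeds up training: after a fill phase, every stage is busy on a different micro-batch, whereas with plain sharding only one device is active at a time.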

#### Custom ops

Custom op example: Code that demonstrates how to define your own custom op using Poplar and PopLibs and use it in TensorFlow 1.

Custom op example with gradient: Code that demonstrates how to define your own custom op using Poplar and PopLibs, and use it in TensorFlow 1. It also shows how to define the gradient of your custom op so that you can use automatic differentiation and operations that depend on it, such as the `minimize` method of an optimizer.

#### Other examples

IPUEstimator: Example of using the IPU implementation of the TensorFlow Estimator API.

Block sparsity: Examples for performing block-sparse computations on the IPU, including the Poplar code for two block-sparse custom ops (matrix multiplication and softmax), which are used to construct a simple MNIST example.

Configuring IPU connections: A code example which demonstrates how to use `ipu.utils.set_ipu_connection_type` to control if and when the IPU device is acquired.

### Poplar

Advanced example: This example performs a simple computation but demonstrates some of the advanced features of Poplar (for example, using dedicated programs for I/O, using PopLibs, saving and reusing executables, and choosing the number of IPUs). It is a good follow-up to the Poplar tutorials.

Prefetch: This example shows how to implement prefetching of the input stream of a Poplar program using a callback, in order to maximise I/O performance.
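The same prefetching idea can be sketched host-side in plain Python with a background thread filling a bounded queue (the Poplar example achieves this with device stream callbacks; this is only an analogy of the pattern):

```python
import queue
import threading

def prefetcher(produce, num_items, depth=2):
    """Fill a bounded queue from a background thread, so the consumer
    never waits for the producer as long as it keeps up."""
    q = queue.Queue(maxsize=depth)

    def worker():
        for i in range(num_items):
            q.put(produce(i))
        q.put(None)  # sentinel: no more data

    threading.Thread(target=worker, daemon=True).start()
    return q

q = prefetcher(lambda i: i * i, num_items=4)
results = []
while (item := q.get()) is not None:
    results.append(item)
print(results)  # [0, 1, 4, 9]
```

The bounded queue depth plays the same role as the number of prefetch buffers: it limits memory use while hiding producer latency from the consumer.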

### PyTorch

There are also examples showing how to use PyTorch with Graphcore’s IPUs. These include full models as well as examples of how to use pre-trained models.

#### Complete models

Complete examples of models implemented on the IPU.

Classifying hand-written digits: The MNIST dataset is a well-known example of a basic machine learning task. This example shows its implementation on the IPU.

Octave Convolutions: A novel convolutional layer for neural networks. This example shows how to train a model using Octave convolutions and run it for inference.

#### Pre-trained models

Hugging Face’s BERT: A pre-trained BERT model, made available by Hugging Face and implemented in PyTorch. This example runs one of the pre-trained BERT models on an IPU for inference.

#### Other examples

PopART’s MNIST: An example of how to export a PyTorch model as an ONNX file and reuse this file with Graphcore’s PopART.

### PopART

There are several examples demonstrating how to use the Poplar Advanced Runtime (PopART). These include full models as well as examples of how to use multiple IPUs, implement custom ops in Poplar and other key features provided by PopART.

#### Simple models and benchmarks

Simple MNIST Examples: Contains two simple models trained on the MNIST dataset: one linear and one using convolutions.

Kernel Synthetic Benchmarks: Contains synthetic benchmarks for models with two types of layer (LSTM and 2D convolution), using synthetic data in training and inference.

#### Multi-IPU examples

Sharding a Model over Multiple IPUs: This demo shows how to “shard” (split) a model over multiple IPUs using PopART.

Pipelining a Model over Multiple IPUs: This demo shows how to use pipelining in PopART on a very simple model consisting of two dense layers.

Utilising Streaming Memory with Phased Execution: This example runs a network in inference mode over two IPUs by splitting it in several execution phases and keeping the weights in Streaming Memory.

#### Further examples

Custom Operators: This directory contains two example implementations of custom operators for PopART (Cube and LeakyReLU). Both examples create an operation definition with forward and backward parts, and include a simple inference script to demonstrate using the operators.

Block Sparsity: Examples of performing block-sparse computations on the IPU, including the Poplar code for two block-sparse custom ops (matrix multiplication and softmax), which are used to construct a number of examples and test scripts, including a simple MNIST example.

Data Callbacks: This example creates a simple computation graph and uses callbacks to feed data and retrieve the results. Time between host-device transfer and receipt of the result on the host is computed and displayed for a range of different data sizes.
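The host-side shape of that pattern is: an input callback supplies each batch on demand, and an output callback receives each result as it arrives. A plain-Python sketch of the control flow (not the PopART API itself):

```python
def run_with_callbacks(compute, input_cb, output_cb, num_batches):
    """Drive a computation with callbacks: ask input_cb for each batch,
    run the computation, and hand the result to output_cb."""
    for i in range(num_batches):
        batch = input_cb(i)
        output_cb(i, compute(batch))

received = {}
run_with_callbacks(
    compute=lambda xs: sum(xs),
    input_cb=lambda i: [i, i + 1],   # feed data on demand
    output_cb=received.__setitem__,  # retrieve results as they arrive
    num_batches=3,
)
print(received)  # {0: 1, 1: 3, 2: 5}
```

Because data is pulled and results are pushed through callbacks, the host never needs to hold the full input or output in memory at once, which is what makes the transfer-time measurement in the example meaningful across data sizes.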

Distributed MNIST with Horovod Training Demo: This example uses distributed training with a Horovod PopART extension to train a network on the MNIST dataset.

Automatic and Manual Recomputing: This example shows how to use manual and automatic recomputation in PopART with a seven-layer DNN and generated data.

## Application examples

A number of complete application examples are available. The most notable are listed below, grouped by framework. You can find the others by browsing the repository.

### TensorFlow 1

### PopART

## Tutorials

The tutorials directory contains tutorials to help you get started with the Poplar SDK and Graphcore tools. It includes:

PyTorch tutorials: A tutorial to introduce the PyTorch framework support for the IPU.

TensorFlow 1 tutorials: A set of tutorials to introduce the TensorFlow 1 framework support for the IPU.

TensorFlow 2 tutorials: A set of tutorials to introduce the TensorFlow 2 framework support for the IPU.

Poplar tutorials: A set of tutorials to introduce the Poplar graph programming framework and the PopLibs libraries.

PopVision tutorials: A set of tutorials to introduce PopVision, our suite of graphical application analysis tools.

A complete list of available tutorials can be found in the tutorials directory.