Pre-Training and Fine-Tuning BERT for the IPU

5. Trademarks & copyright

Graphcloud®, Graphcore®, Poplar® and PopVision® are registered trademarks of Graphcore Ltd.

Bow™, Bow-2000™, Bow Pod™, Colossus™, In-Processor-Memory™, IPU-Core™, IPU-Exchange™, IPU-Fabric™, IPU-Link™, IPU-M2000™, IPU-Machine™, IPU-POD™, IPU-Tile™, PopART™, PopDist™, PopLibs™, PopRun™, PopTorch™, Streaming Memory™ and Virtual-IPU™ are trademarks of Graphcore Ltd.

All other trademarks are the property of their respective owners.

Copyright © 2016-2020 Graphcore Ltd. All rights reserved.
