Mixed-Precision Arithmetic for AI: A Hardware Perspective

10. Trademarks & copyright

Graphcloud®, Graphcore® and Poplar® are registered trademarks of Graphcore Ltd.

Bow™, Bow-2000™, Bow Pod™, Colossus™, In-Processor-Memory™, IPU-Core™, IPU-Exchange™, IPU-Fabric™, IPU-Link™, IPU-M2000™, IPU-Machine™, IPU-POD™, IPU-Tile™, PopART™, PopDist™, PopLibs™, PopRun™, PopVision™, PopTorch™, Streaming Memory™ and Virtual-IPU™ are trademarks of Graphcore Ltd.

All other trademarks are the property of their respective owners.

Copyright © 2016-2022 Graphcore Ltd. All rights reserved.
