Targeting the IPU from TensorFlow 1
Version: 1.4.0

18. Trademarks & copyright

Graphcore® and Poplar® are registered trademarks of Graphcore Ltd.

AI-Float™, Colossus™, Exchange Memory™, In-Processor-Memory™, IPU-Core™, IPU-Exchange™, IPU-Fabric™, IPU-Link™, IPU-M2000™, IPU-Machine™, IPU-POD™, IPU-Tile™, PopART™, PopLibs™, PopVision™, PopTorch™, Streaming Memory™ and Virtual-IPU™ are trademarks of Graphcore Ltd.

All other trademarks are the property of their respective owners.

Copyright © 2016-2020 Graphcore Ltd. All rights reserved.


Revision 06374276.