Targeting the IPU from TensorFlow 2
Version: 1.4.0

19. Trademarks & copyright

Graphcore® and Poplar® are registered trademarks of Graphcore Ltd.

AI-Float™, Colossus™, Exchange Memory™, In-Processor-Memory™, IPU-Core™, IPU-Exchange™, IPU-Fabric™, IPU-Link™, IPU-M2000™, IPU-Machine™, IPU-POD™, IPU-Tile™, PopART™, PopLibs™, PopVision™, PopTorch™, Streaming Memory™ and Virtual-IPU™ are trademarks of Graphcore Ltd.

All other trademarks are the property of their respective owners.

Copyright © 2016-2020 Graphcore Ltd. All rights reserved.
