Mixed-Precision Arithmetic for AI: A Hardware Perspective
  • 1. Introduction
  • 2. Floating-point number representations
  • 3. The IPU 16-bit floating point format
  • 4. Lower precision for higher power efficiency
  • 5. Mixed-precision arithmetic
  • 6. Mixed-precision training
  • 7. Deterministic versus stochastic rounding
  • 8. Loss scaling
  • 9. Summary
  • 10. Trademarks & copyright

Revision a99e351b.
