Graphcore’s Bow-2000 IPU-Machine is designed to support both scale-up and scale-out machine intelligence compute. The Bow Pod reference designs, based on the Bow-2000, provide scalable building blocks for the Bow Pod product range: Bow Pod16 (4 Bow-2000 machines directly attached to a single host server), Bow Pod64 (16 Bow-2000 machines in a switched system with 1-4 host servers), and Bow Pod256 (64 Bow-2000 machines in a switched system with 4-16 host servers). Bow Pod1024 is currently available for early access.
Virtualization and provisioning software allows AI compute resources to be elastically allocated to users and grouped for both model-parallel and data-parallel AI compute in all Bow Pod systems, supporting multiple users and mixed workloads as well as dedicating an entire system to a single large model.
Bow Pod system-level products, including Bow-2000 machines, host servers, and network switches, are available from Graphcore channel partners globally. Customers can select their preferred server brand from a range of leading server vendors; multiple host servers from different vendors are approved for use in Bow Pod systems (see the approved server list for details). The disaggregated host architecture allows the server specification to be matched to the workload.
The Bow-2000 is backwards compatible with the IPU-M2000™ IPU-Machine, delivering up to 40% higher performance and up to 16% better power efficiency on real-world AI workloads compared with the IPU-M2000, with no code changes required.