5.1. PopTorch

3.4.0

New features

None

Bug Fixes

  • Fixed a corner case where model parameters with requires_grad=False were not handled correctly during training.

  • Propagated the requires_grad setting for parameters that are set per replica.

Other improvements

None

Known issues

None

Compatibility changes

None

3.3.0

New features

  • Added support for the following ops:

    • torch.nn.Mish

    • torch.bucketize

    • torch.cdist

    • torch.sort

    • torch.take_along_dim

    • torch.bincount
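
To illustrate, the following is a minimal sketch (assuming a working poptorch installation with IPU access or the IPU Model emulator) of two of the newly supported ops, torch.bucketize and torch.sort, running through poptorch.inferenceModel:

    import torch
    import poptorch

    class BucketSort(torch.nn.Module):
        def forward(self, values, boundaries):
            # Map each value to its bucket index, then sort the bucket indices.
            buckets = torch.bucketize(values, boundaries)
            sorted_buckets, _ = torch.sort(buckets)
            return sorted_buckets

    model = poptorch.inferenceModel(BucketSort())
    values = torch.tensor([0.2, 1.7, 0.9, 2.4])
    boundaries = torch.tensor([0.5, 1.0, 2.0])
    print(model(values, boundaries))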

  • Added support for the following torch_cluster ops:

    • radius

    • grid

    • knn
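
As a rough sketch (assuming torch_cluster is installed alongside poptorch), a torch_cluster op such as knn can now be called inside a model wrapped with poptorch.inferenceModel:

    import torch
    import poptorch
    from torch_cluster import knn

    class KnnModel(torch.nn.Module):
        def forward(self, x, y):
            # For each point in y, find the indices of its 3 nearest neighbours in x.
            return knn(x, y, k=3)

    model = poptorch.inferenceModel(KnnModel())
    x = torch.randn(16, 2)
    y = torch.randn(4, 2)
    print(model(x, y))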

  • Added drop-in replacements for the following torch_cluster operations (these replace the torch_cluster ops with the corresponding PopTorch ops):

    • fps

    • nearest

  • Added a cond op for inference models only. This op conditionally executes one of two branches depending on the input condition.
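
The following is a hedged sketch of the cond op. The signature assumed here, poptorch.cond(condition, then_body, then_inputs, else_body, else_inputs) returning the outputs of whichever branch ran, is an assumption and should be checked against the PopTorch API reference:

    import torch
    import poptorch

    class CondModel(torch.nn.Module):
        def forward(self, x, condition):
            def then_body(x):
                return x * 2
            def else_body(x):
                return x - 2
            # Assumed signature: (condition, then_body, then_inputs, else_body, else_inputs).
            out, = poptorch.cond(condition, then_body, [x], else_body, [x])
            return out

    model = poptorch.inferenceModel(CondModel())
    print(model(torch.ones(3), torch.tensor(True)))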

  • Extended the API for copying data structures to the IPU: copyNamedBuffersToDevice allows named buffers to be copied to the device.

  • FixedSizeCollator can now pad to different node and edge types.

  • The PopTorch wheel now has dependencies on torchvision and torchaudio. This prevents an upgrade to an unsupported PyTorch version when installing other third-party packages that depend on torchvision or torchaudio.

  • Added support for the largest=False option in the torch.topk op (see the sketch at the end of this list).

  • Added the compilationTime function to extract the total compilation time from the compiled PopTorch model.
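
A minimal sketch (assuming a working poptorch installation) of the new largest=False support in torch.topk, which selects the k smallest elements instead of the k largest:

    import torch
    import poptorch

    class SmallestK(torch.nn.Module):
        def forward(self, x):
            # Return the 3 smallest values and their indices.
            return torch.topk(x, k=3, largest=False)

    model = poptorch.inferenceModel(SmallestK())
    print(model(torch.randn(10)))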

Bug Fixes

  • Fixed compilation of models with torch.norm inside the for_loop op.

  • Fixed a dtype mismatch in torch.clamp, which previously raised an incompatible type error.

  • Fixed torch.var, which used to raise an error when a negative dimension was passed. The fix converts the negative dimension to the equivalent positive index so it can be used to index the input shape (see the sketch after this list).

  • Fixed the torch.round behaviour in PopTorch to use a “round half to even” method to match the behaviour in PyTorch. PopTorch previously used a “round half up” method.

  • Fixed an issue where an op with mixed Int32 and Float32 operands was not processed.
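
As a small sketch of the torch.var fix (hypothetical model name, assuming a working poptorch installation), a negative dimension such as -1 is now normalised to the equivalent positive axis, so the following compiles:

    import torch
    import poptorch

    class VarLastDim(torch.nn.Module):
        def forward(self, x):
            # dim=-1 is converted internally to the last positive axis index.
            return torch.var(x, dim=-1)

    model = poptorch.inferenceModel(VarLastDim())
    print(model(torch.randn(4, 8)))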

Other improvements

None

Known issues

None

Compatibility changes

None