5.14. PopRT frontend

The PopRT frontend loads model files generated by different frameworks. Currently, only ONNX and TensorFlow models can be loaded. You can specify the frontend for loading models with the PopRT CLI or with the Python API (poprt.frontend).

5.14.1. ONNX frontend

The PopRT ONNX frontend is the default frontend in PopRT, so no additional parameters are needed to use it.

For more information on the ONNX frontend, refer to the poprt.frontend.OnnxFrontend class API description.

5.14.2. TensorFlow frontend

The PopRT TensorFlow frontend uses the API provided by tf2onnx to extend PopRT’s support for TensorFlow models.

For more information on the TensorFlow frontend, refer to the poprt.frontend.TensorflowFrontend class API description.

Currently, only the TensorFlow SavedModel format is supported, along with the following CLI parameters. For detailed information on these parameters, refer to the tf2onnx subcommand.

Note

Except for saved_model, all parameters have the same meaning as those provided by tf2onnx. saved_model indicates whether the loaded model is a TensorFlow SavedModel; the path to the SavedModel itself is specified with input_model.

  • saved_model

  • signature_def

  • tag

  • outputs

  • opset

  • inputs_as_nchw

  • outputs_as_nchw
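The inputs_as_nchw and outputs_as_nchw parameters tell tf2onnx to convert the listed tensors so that the resulting ONNX model exchanges data in NCHW layout, even though TensorFlow models typically use NHWC. The layout change itself is just an axis permutation; the following NumPy sketch illustrates the relationship (the tensor shape here is illustrative only, not taken from any specific model):

```python
import numpy as np

# A TensorFlow model typically consumes images in NHWC layout:
# (batch, height, width, channels)
nhwc = np.arange(12, dtype=np.float32).reshape(1, 2, 2, 3)

# Permuting the axes as (0, 3, 1, 2) converts NHWC to NCHW, which is
# the layout the converted ONNX model uses for tensors listed in
# inputs_as_nchw / outputs_as_nchw:
nchw = np.transpose(nhwc, (0, 3, 1, 2))  # -> (batch, channels, height, width)

assert nchw.shape == (1, 3, 2, 2)
# The same element moves from index [n, h, w, c] to [n, c, h, w]:
assert nhwc[0, 1, 0, 2] == nchw[0, 2, 1, 0]
```

In practice this means that if your deployment pipeline already produces NCHW data (as is common for ONNX models), listing the input tensors in inputs_as_nchw avoids having to transpose data on the host before each inference.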

Loading a TensorFlow model with the PopRT CLI

The PopRT CLI uses the tf2onnx subcommand to process the parameters related to converting a TensorFlow model to ONNX. Pass these command-line parameters immediately after the tf2onnx subcommand, for example:

poprt \
    --input_model resnet_v2_50 \
    --framework tensorflow \
    --input_shape inputs=1,224,224,3 \
    tf2onnx \
    --saved_model \
    --outputs Identity:0 \
    --tag serve \
    --signature_def serving_default \
    --inputs_as_nchw inputs:0 \
    --outputs_as_nchw Identity:0

Loading a TensorFlow model with the Python API

You can also load a TensorFlow model with the poprt.frontend.TensorflowFrontend module, for example:

frontend = poprt.frontend.get_frontend(
    saved_model_path,
    framework='tensorflow',
    saved_model=True,
    signature_def=signature,
    tag=tag,
    opset=opset,
    inputs_as_nchw=inputs_as_nchw,
    outputs_as_nchw=outputs_as_nchw,
    input_shape=input_shape,
)
model = frontend.load_model()