ZETIC.MLange

Prepare Model and Input(s)

Guide to preparing TorchScript or ONNX models and NumPy inputs for ZETIC.MLange

Save Model and Input(s)

The inputs to MLange are:

  1. Model: TorchScript or ONNX format
  2. Input(s): NumPy array format

If your model is a PyTorch nn.Module, trace it to TorchScript first:

import torch
import numpy as np

# Your model, an instance of torch.nn.Module (YourModel is a placeholder)
torch_model = YourModel(...)
torch_model.eval()

# Trace your PyTorch model with a sample input tensor
torchscript_model = torch.jit.trace(torch_model, TORCH_INPUT)

# (1) Save your traced model
torch.jit.save(torchscript_model, OUTPUT_TORCHSCRIPT_MODEL_PATH)

# (2) Save your sample input(s) as NumPy array(s)
np_input = TORCH_INPUT.detach().numpy()
np.save("INPUT.npy", np_input)

For more details, refer to the torch.jit.save documentation.
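
As an optional sanity check, you can reload the saved TorchScript model and confirm it matches the original module on the sample input (a minimal sketch reusing the placeholder names above):

# Reload the traced model and compare its output with the original model
loaded_model = torch.jit.load(OUTPUT_TORCHSCRIPT_MODEL_PATH)
torch.testing.assert_close(loaded_model(TORCH_INPUT), torch_model(TORCH_INPUT))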

Alternatively, you can convert your model to ONNX format from most major frameworks.

For guidance on converting to ONNX format, refer to the ONNX Tutorials.
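
If your source model is in PyTorch, a minimal export sketch using torch.onnx.export could look like the following (the output file name, tensor names, and opset version are placeholders, not MLange requirements):

import torch

# Export the model to ONNX using the same sample input tensor
torch.onnx.export(
    torch_model,              # the torch.nn.Module instance from above
    TORCH_INPUT,              # sample input that defines the traced graph
    "model.onnx",             # output file path (placeholder)
    input_names=["input"],    # optional graph input name(s)
    output_names=["output"],  # optional graph output name(s)
    opset_version=17,         # target ONNX opset (adjust as needed)
)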

Check the Order of Model Input(s)

You can verify the order of the model inputs using Netron. Provide input data in the same order both when generating a new model key and when running the model after deployment; otherwise the model will not function correctly.
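
For ONNX models, you can also list the graph inputs in order programmatically with the onnx Python package (a minimal sketch; "model.onnx" is a placeholder path):

import onnx

# Print the model's inputs in graph order, i.e. the order in which data should be provided
onnx_model = onnx.load("model.onnx")
for graph_input in onnx_model.graph.input:
    print(graph_input.name)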

Even if the original model supports dynamic input sizes, the input and output shapes of the generated model are fixed to the shapes of the sample input tensors used when creating the model.
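
For example, the shape of the sample input saved earlier is the shape the generated model will expect at runtime (a minimal sketch; the printed shape is illustrative only):

import numpy as np

# The saved sample input's shape becomes the fixed input shape of the generated model
np_input = np.load("INPUT.npy")
print(np_input.shape)  # e.g. (1, 3, 224, 224), example shape only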

(Figure: checking the input order with Netron)