AxonOnnx (AxonOnnx v0.4.0)
Library for converting to and from Axon/ONNX.
ONNX is an open neural network specification supported by most popular deep learning frameworks such as PyTorch and TensorFlow. AxonOnnx allows you to convert to and from ONNX models via a simple import/export API.
You can import supported ONNX models using AxonOnnx.import/2:
{model, params} = AxonOnnx.import("model.onnx")
model will be an Axon struct and params will be a compatible model state.
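As a rough sketch, the imported model can then be used like any other Axon model. The file name and input shape below are placeholders; use the shapes your model actually expects:
{model, params} = AxonOnnx.import("model.onnx")

# Dummy input matching the model's expected input shape (placeholder shape).
input = Nx.broadcast(0.0, {1, 3, 224, 224})

# Run inference with the imported model and parameters.
Axon.predict(model, params, input)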
You can export supported models using AxonOnnx.export/4:
AxonOnnx.export(model, templates, params)
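Here, templates describes the shapes and types of the model's inputs. A minimal sketch, assuming a model with a single input named "input" and a placeholder shape, might build them with Nx.template/2:
# Placeholder input name and shape; use your model's real inputs.
templates = %{"input" => Nx.template({1, 784}, :f32)}

AxonOnnx.export(model, templates, params)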
Summary
Functions
Dumps an Axon model and parameters into a binary representing an ONNX model.
Exports an Axon model and parameters to an ONNX model with the given input templates.
Imports an ONNX model from the given path.
Loads an ONNX model into an Axon model from the given binary.
Functions
Dumps an Axon model and parameters into a binary representing an ONNX model.
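A minimal sketch, assuming dump accepts the same model, templates, and params arguments as export/4 and returns the ONNX binary:
# Serialize the model to an in-memory ONNX binary.
onnx_binary = AxonOnnx.dump(model, templates, params)

# The binary can then be written to disk or sent over the wire.
File.write!("model.onnx", onnx_binary)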
Exports an Axon model and parameters to an ONNX model with the given input templates.
You may optionally pass the path option to export the model to a specific file path:
AxonOnnx.export(model, templates, params, path: "resnet.onnx")
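For example, an end-to-end sketch; the model architecture and file name are illustrative, and initialization details may differ across Axon versions:
model =
  Axon.input("input", shape: {nil, 784})
  |> Axon.dense(128, activation: :relu)
  |> Axon.dense(10, activation: :softmax)

# Initialize (or train) parameters for the model.
{init_fn, _predict_fn} = Axon.build(model)
params = init_fn.(Nx.template({1, 784}, :f32), %{})

templates = %{"input" => Nx.template({1, 784}, :f32)}
AxonOnnx.export(model, templates, params, path: "mlp.onnx")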
Imports an ONNX model from the given path.
Some models support ONNX dim_params, which you may specify by providing dimension names as a keyword list:
AxonOnnx.import("model.onnx", batch: 1)
The imported model will be in the form:
{model, params} = AxonOnnx.import("model.onnx")
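Models with several dynamic dimensions can be pinned the same way. The file and dimension names below are only examples and must match the dim_param names in the ONNX file:
# Pin both dynamic dimensions of a hypothetical transformer export.
{model, params} = AxonOnnx.import("transformer.onnx", batch: 1, sequence: 128)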
Loads an ONNX model into an Axon model from the given binary.
Some models support ONNX dim_params, which you may specify by providing dimension names as a keyword list:
onnx = File.read!("model.onnx")
AxonOnnx.load(onnx, batch: 1)
The imported model will be in the form:
{model, params} = AxonOnnx.load(onnx)
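load/2 is useful when the ONNX binary is already in memory, for example when round-tripping a model without touching disk. A sketch, assuming dump/3 mirrors export/4 without the path option:
# Serialize an Axon model to an in-memory ONNX binary...
onnx = AxonOnnx.dump(model, templates, params)

# ...and load it straight back without writing a file.
{loaded_model, loaded_params} = AxonOnnx.load(onnx)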