NNInterp (nn_interp v0.1.1)

Integrated DNN interpreter for Elixir, a deep-learning inference framework.

Summary

Functions

Adjust the NMS result to the aspect ratio of the letterboxed input image.

Get the name of the backend NN framework.

Ensure that the back-end framework is as expected.

Get the flat binary from the output tensor on the interpreter.

Get a list of flat binaries from the output tensors on the interpreter.

Get the property of the model.

Invoke prediction.

run(x) deprecated

Put a flat binary to the input tensor on the interpreter.

Put flat binaries to the input tensors on the interpreter.

Stop the interpreter.

Ensure that the model matches the back-end framework.

Functions

adjust2letterbox(nms_result, aspect \\ [1.0, 1.0])

Adjust the NMS result to the aspect ratio of the letterboxed input image.

Parameters

  • nms_result - the NMS result {:ok, result}
  • aspect - [rx, ry], aspect ratio of the input image
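
A minimal usage sketch; the nms_result binding and the aspect value [1.0, 0.75] are assumptions for illustration:

    # nms_result: the {:ok, result} tuple returned by NMS (assumed available)
    # [1.0, 0.75]: hypothetical aspect ratio of the letterboxed input image
    adjusted = NNInterp.adjust2letterbox(nms_result, [1.0, 0.75])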

Get the name of the backend NN framework.

Ensure that the back-end framework is as expected.

get_output_tensor(mod, index, opts \\ [])

Get the flat binary from the output tensor on the interpreter.

Parameters

  • mod - module name or session.
  • index - index of the output tensor in the model
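
A hedged sketch; the module name MyModel and the little-endian float32 output layout are assumptions, not part of this API description:

    # fetch output tensor 0 as a flat binary, then decode it into a list of floats
    bin = NNInterp.get_output_tensor(MyModel, 0)
    values = for <<x::float-32-little <- bin>>, do: x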

get_output_tensors(mod, range)

Get a list of flat binaries from the output tensors on the interpreter.

Parameters

  • mod - module name or session.
  • range - range of output tensors in the model
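
For example, assuming a module named MyModel and a model with three output tensors (both assumptions):

    # fetch output tensors 0 through 2 as a list of flat binaries
    [out0, out1, out2] = NNInterp.get_output_tensors(MyModel, 0..2)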

Get the property of the model.

Parameters

  • mod - module name

Invoke prediction.

The function runs in one of two modes, selected by the type of its argument (a module name or a session structure). In stateful mode, input/output data are stored as the model's state. In stateless mode, input/output data are kept in a session structure held by the application.

Parameters

  • mod/session - module name (stateful) or session structure (stateless).

Examples

    output_bin =
      session()  # stateless mode
      |> NNInterp.set_input_tensor(0, input_bin)
      |> NNInterp.invoke()
      |> NNInterp.get_output_tensor(0)
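
A counterpart sketch for the stateful mode; the module name MyModel and the input_bin binding are assumptions for illustration:

    NNInterp.set_input_tensor(MyModel, 0, input_bin)  # stateful mode
    NNInterp.invoke(MyModel)
    output_bin = NNInterp.get_output_tensor(MyModel, 0)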

non_max_suppression_multi_class(mod, arg, boxes, scores, opts \\ [])

Execute post-processing: non-maximum suppression (NMS).

Parameters

  • mod - module name
  • num_boxes - number of candidate boxes
  • num_class - number of category class
  • boxes - binaries, serialized boxes tensor[num_boxes][4]; dtype: float32
  • scores - binaries, serialized score tensor[num_boxes][num_class]; dtype: float32
  • opts
    • iou_threshold: - IOU threshold
    • score_threshold: - score cutoff threshold
    • sigma: - soft IOU parameter
    • boxrepr: - type of box representation
      • :center - center pos and width/height
      • :topleft - top-left pos and width/height
      • :corner - top-left and bottom-right corner pos
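
A hedged sketch of a call; treating the second argument as the {num_boxes, num_class} pair, the module name, and all numeric values are assumptions for illustration:

    {num_boxes, num_class} = {1917, 80}  # hypothetical, model-dependent counts
    # boxes/scores: flat float32 binaries taken from the model outputs (assumed available)
    result =
      NNInterp.non_max_suppression_multi_class(MyModel, {num_boxes, num_class}, boxes, scores,
        iou_threshold: 0.5, score_threshold: 0.25, boxrepr: :center)
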
run(x)

This function is deprecated. Use invoke/1 instead.

set_input_tensor(mod, index, bin, opts \\ [])

Put a flat binary to the input tensor on the interpreter.

Parameters

  • mod - module name or session.
  • index - index of the input tensor in the model
  • bin - input data, a flat binary (serialized tensor)
  • opts - data conversion options
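
A minimal sketch, assuming a module named MyModel and a float32 input tensor; the dummy pixel data and its size are purely illustrative:

    # pack dummy float32 values into a flat binary (shape and size are model-dependent)
    pixels = List.duplicate(0.0, 224 * 224 * 3)
    input_bin = for x <- pixels, into: <<>>, do: <<x::float-32-little>>
    NNInterp.set_input_tensor(MyModel, 0, input_bin)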

set_input_tensors(mod, from, items)

Put flat binaries to the input tensors on the interpreter.

Parameters

  • mod - module name or session.
  • from - first index of the input tensors in the model
  • items - list of input data, each a flat binary (serialized tensor)
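
For instance, assuming a model with two input tensors and pre-built flat binaries input_bin0 and input_bin1 (all assumptions):

    # put two consecutive input tensors, starting at index 0
    NNInterp.set_input_tensors(MyModel, 0, [input_bin0, input_bin1])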

Stop the interpreter.

Parameters

  • mod - module name
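
For example, assuming the interpreter was started under a module named MyModel:

    NNInterp.stop(MyModel)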

validate_model(model, url)

Ensure that the model matches the back-end framework.

Parameters

  • model - path to the model file
  • url - URL of the download site
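
A hedged sketch; both the local path and the URL are placeholders for illustration:

    # verify that the local model file matches the back-end framework
    NNInterp.validate_model("./model/mobilenet_v2.onnx",
      "https://example.com/models/mobilenet_v2.onnx")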