Evision.DNNSuperRes.DNNSuperResImpl (Evision v0.1.28)

Summary

Types

t()

Type that represents a DNNSuperRes.DNNSuperResImpl struct.

Functions

Empty constructor for Python

Returns the current algorithm of the model

Returns the scale factor of the model

Read the model from the given path

Set desired model

Set computation backend

Set computation target

Upsample via neural network

Upsample via neural network

Upsample via neural network of multiple outputs

Types

@type t() :: %Evision.DNNSuperRes.DNNSuperResImpl{ref: reference()}

Type that represents a DNNSuperRes.DNNSuperResImpl struct.

  • ref: reference()

    The underlying erlang resource variable.

Functions

@spec create() :: t() | {:error, String.t()}

Empty constructor for Python

Return
  • retval: Evision.DNNSuperRes.DNNSuperResImpl.t()

Python prototype (for reference only):

create() -> retval
@spec getAlgorithm(t()) :: binary() | {:error, String.t()}

Returns the current algorithm of the model.

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()
Return
  • retval: String

@return Current algorithm.

Python prototype (for reference only):

getAlgorithm() -> retval
@spec getScale(t()) :: integer() | {:error, String.t()}

Returns the scale factor of the model.

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()
Return
  • retval: int

@return Current scale factor.

Python prototype (for reference only):

getScale() -> retval
@spec readModel(t(), binary()) :: t() | {:error, String.t()}

Read the model from the given path

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()

  • path: String.

    Path to the model file.

Python prototype (for reference only):

readModel(path) -> None
setModel(self, algo, scale)
@spec setModel(t(), binary(), integer()) :: t() | {:error, String.t()}

Set desired model

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()

  • algo: String.

    String containing one of the desired models:

    • edsr
    • espcn
    • fsrcnn
    • lapsrn
  • scale: int.

    Integer specifying the upscale factor

Python prototype (for reference only):

setModel(algo, scale) -> None
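A typical configuration pairs readModel with setModel. The sketch below is illustrative only: the weights path is an assumption (FSRCNN weights must be downloaded separately, e.g. from the OpenCV contrib model zoo), and running it requires the native OpenCV NIF.

```elixir
alias Evision.DNNSuperRes.DNNSuperResImpl, as: SR

# Create an empty instance, then load weights and declare which
# architecture/scale the file contains. Path is hypothetical.
sr = SR.create()
sr = SR.readModel(sr, "models/FSRCNN_x2.pb")
sr = SR.setModel(sr, "fsrcnn", 2)

# The configured parameters can be queried back:
SR.getScale(sr)
SR.getAlgorithm(sr)
```

Note that `algo` and `scale` must match the loaded weights file; the class does not infer them from the model.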
setPreferableBackend(self, backendId)
@spec setPreferableBackend(t(), integer()) :: t() | {:error, String.t()}

Set computation backend

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()
  • backendId: int

Python prototype (for reference only):

setPreferableBackend(backendId) -> None
setPreferableTarget(self, targetId)
@spec setPreferableTarget(t(), integer()) :: t() | {:error, String.t()}

Set computation target

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()
  • targetId: int

Python prototype (for reference only):

setPreferableTarget(targetId) -> None
@spec upsample(t(), Evision.Mat.maybe_mat_in()) ::
  Evision.Mat.t() | {:error, String.t()}

Upsample via neural network

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()

  • img: Evision.Mat.t().

    Image to upscale

Return
  • result: Evision.Mat.t().

    Destination upscaled image

Python prototype (for reference only):

upsample(img[, result]) -> result
upsample(self, img, opts)
@spec upsample(t(), Evision.Mat.maybe_mat_in(), [{atom(), term()}, ...] | nil) ::
  Evision.Mat.t() | {:error, String.t()}

Upsample via neural network

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()

  • img: Evision.Mat.t().

    Image to upscale

Return
  • result: Evision.Mat.t().

    Destination upscaled image

Python prototype (for reference only):

upsample(img[, result]) -> result
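An end-to-end upscale can then be sketched as follows. The model path and image file names are assumptions for illustration; since every setter returns the updated struct, the calls pipe naturally.

```elixir
alias Evision.DNNSuperRes.DNNSuperResImpl, as: SR

# Load and configure an ESPCN 3x model (hypothetical path).
sr =
  SR.create()
  |> SR.readModel("models/ESPCN_x3.pb")
  |> SR.setModel("espcn", 3)

# Read an image, upscale it, and write the result.
img = Evision.imread("input.png")
result = SR.upsample(sr, img)
Evision.imwrite("output.png", result)
```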
upsampleMultioutput(self, img, imgs_new, scale_factors, node_names)
@spec upsampleMultioutput(
  t(),
  Evision.Mat.maybe_mat_in(),
  [Evision.Mat.maybe_mat_in()],
  [integer()],
  [binary()]
) :: t() | {:error, String.t()}

Upsample via neural network of multiple outputs

Positional Arguments
  • self: Evision.DNNSuperRes.DNNSuperResImpl.t()

  • img: Evision.Mat.t().

    Image to upscale

  • imgs_new: [Evision.Mat].

    Destination upscaled images

  • scale_factors: [int].

    Scaling factors of the output nodes

  • node_names: [String].

    Names of the output nodes in the neural network

Python prototype (for reference only):

upsampleMultioutput(img, imgs_new, scale_factors, node_names) -> None