Evision.DNNSuperRes.DNNSuperResImpl (Evision v0.2.9)
Summary
Functions
Empty constructor for python
Returns the current algorithm of the model
Returns the scale factor of the model
Read the model from the given path
Set desired model
Set computation backend
Set computation target
Upsample via neural network
Upsample via neural network
Upsample via neural network of multiple outputs
Types
@type t() :: %Evision.DNNSuperRes.DNNSuperResImpl{ref: reference()}
Type that represents a DNNSuperRes.DNNSuperResImpl struct.
ref.
reference()
The underlying erlang resource variable.
Functions
Empty constructor for python
Return
- retval:
Evision.DNNSuperRes.DNNSuperResImpl.t()
Python prototype (for reference only):
create() -> retval
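A minimal sketch of constructing an instance from Elixir (the alias is only for brevity):

```elixir
# Construct an empty super-resolution object; it must still be configured
# with readModel/2 and setModel/3 before it can upsample anything.
alias Evision.DNNSuperRes.DNNSuperResImpl, as: SR

sr = SR.create()
```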
@spec getAlgorithm(Keyword.t()) :: any() | {:error, String.t()}
@spec getAlgorithm(t()) :: binary() | {:error, String.t()}
Returns the current algorithm of the model
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
Return
- retval:
String
@return Current algorithm.
Python prototype (for reference only):
getAlgorithm() -> retval
@spec getScale(Keyword.t()) :: any() | {:error, String.t()}
@spec getScale(t()) :: integer() | {:error, String.t()}
Returns the scale factor of the model:
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
Return
- retval:
integer()
@return Current scale factor.
Python prototype (for reference only):
getScale() -> retval
Read the model from the given path
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
- path:
String
Path to the model file.
Python prototype (for reference only):
readModel(path) -> None
Set desired model
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
- algo:
String
String containing one of the desired models:
- edsr
- espcn
- fsrcnn
- lapsrn
- scale:
integer()
Integer specifying the upscale factor
Python prototype (for reference only):
setModel(algo, scale) -> None
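A hedged sketch of the typical configuration sequence. The model file name below is a placeholder: the pre-trained weights (e.g. an ESPCN `.pb` file) are not bundled with Evision and must be downloaded separately, and this assumes the usual Evision convention that void-returning methods hand back the updated struct:

```elixir
alias Evision.DNNSuperRes.DNNSuperResImpl, as: SR

sr = SR.create()
# "ESPCN_x4.pb" is an illustrative local path to pre-trained weights.
sr = SR.readModel(sr, "ESPCN_x4.pb")
sr = SR.setModel(sr, "espcn", 4)

# After setModel/3, the getters should reflect the chosen configuration.
SR.getAlgorithm(sr)
SR.getScale(sr)
```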
Set computation backend
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
- backendId:
integer()
Python prototype (for reference only):
setPreferableBackend(backendId) -> None
Set computation target
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
- targetId:
integer()
Python prototype (for reference only):
setPreferableTarget(targetId) -> None
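Both setters take the integer backend/target constants from OpenCV's DNN module. A sketch; the `Evision.Constant` helper names below are assumptions to verify against your installed Evision version:

```elixir
alias Evision.DNNSuperRes.DNNSuperResImpl, as: SR

sr = SR.create()
# Constant names are assumed; check Evision.Constant in your version.
sr = SR.setPreferableBackend(sr, Evision.Constant.cv_DNN_BACKEND_OPENCV())
sr = SR.setPreferableTarget(sr, Evision.Constant.cv_DNN_TARGET_CPU())
```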
@spec upsample(t(), Evision.Mat.maybe_mat_in()) :: Evision.Mat.t() | {:error, String.t()}
Upsample via neural network
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
- img:
Evision.Mat
Image to upscale
Return
- result:
Evision.Mat.t()
Destination upscaled image
Python prototype (for reference only):
upsample(img[, result]) -> result
@spec upsample(t(), Evision.Mat.maybe_mat_in(), [{atom(), term()}, ...] | nil) :: Evision.Mat.t() | {:error, String.t()}
Upsample via neural network
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
- img:
Evision.Mat
Image to upscale
Return
- result:
Evision.Mat.t()
Destination upscaled image
Python prototype (for reference only):
upsample(img[, result]) -> result
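Putting the pieces together, a hedged end-to-end sketch. The model path and image paths are placeholders, and the struct-returning style of the setters is the assumed Evision convention:

```elixir
alias Evision.DNNSuperRes.DNNSuperResImpl, as: SR

sr = SR.create()
sr = SR.readModel(sr, "FSRCNN_x3.pb")   # placeholder model path
sr = SR.setModel(sr, "fsrcnn", 3)

img = Evision.imread("input.png")       # placeholder input image
result = SR.upsample(sr, img)           # Evision.Mat, 3x the input size
Evision.imwrite("upscaled.png", result)
```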
upsampleMultioutput(self, img, imgs_new, scale_factors, node_names)
@spec upsampleMultioutput(t(), Evision.Mat.maybe_mat_in(), [Evision.Mat.maybe_mat_in()], [integer()], [binary()]) :: t() | {:error, String.t()}
Upsample via neural network of multiple outputs
Positional Arguments
- self:
Evision.DNNSuperRes.DNNSuperResImpl.t()
- img:
Evision.Mat
Image to upscale
- imgs_new:
[Evision.Mat]
Destination upscaled images
- scale_factors:
[integer()]
Scaling factors of the output nodes
- node_names:
[String]
Names of the output nodes in the neural network
Python prototype (for reference only):
upsampleMultioutput(img, imgs_new, scale_factors, node_names) -> None
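Of the four supported models, LapSRN is the one built with multiple output nodes, so it is the natural candidate here. A sketch only: the model path, scale factors, and node names below are hypothetical placeholders that must match the actual network you load, and `imgs_new` is passed as an empty destination list per the signature above:

```elixir
alias Evision.DNNSuperRes.DNNSuperResImpl, as: SR

sr = SR.create()
sr = SR.readModel(sr, "LapSRN_x8.pb")   # placeholder model path
sr = SR.setModel(sr, "lapsrn", 8)

img = Evision.imread("input.png")       # placeholder input image
# Scale factors and node names depend on the loaded network; the node
# names here are hypothetical placeholders.
SR.upsampleMultioutput(sr, img, [], [2, 4, 8], ["out_2x", "out_4x", "out_8x"])
```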