Evision.CUDA.ORB (Evision v0.2.9)

Summary

Types

t()

  Type that represents a CUDA.ORB struct.

Functions

compute(self, images, keypoints), compute(self, images, keypoints, opts)

  Computes the descriptors for a set of keypoints detected in an image or image set.

computeAsync(self, image), computeAsync(self, image, opts)

  Computes the descriptors for a set of keypoints detected in an image (CUDA, asynchronous).

convert(self, gpu_keypoints)

  Converts keypoints from the internal GPU representation to a standard vector of keypoints.

create(), create(opts)

  Creates the CUDA ORB detector/descriptor extractor.

defaultNorm(self), descriptorSize(self), descriptorType(self)

  Report the descriptor norm, size, and type.

detect(self, images), detect(self, images, opts)

  Detects keypoints in an image or image set.

detectAndCompute(self, image, mask), detectAndCompute(self, image, mask, opts)

  Detects keypoints and computes the descriptors.

detectAndComputeAsync(self, image, mask), detectAndComputeAsync(self, image, mask, opts)

  Detects keypoints and computes the descriptors (CUDA, asynchronous).

detectAsync(self, image), detectAsync(self, image, opts)

  Detects keypoints in an image (CUDA, asynchronous).

empty(self)

  Checks whether the detector is empty.

getBlurForDescriptor(self), getDefaultName(self), getEdgeThreshold(self), getFastThreshold(self), getFirstLevel(self), getMaxFeatures(self), getNLevels(self), getPatchSize(self), getScaleFactor(self), getScoreType(self), getWTA_K(self)

  Getters for the detector parameters.

setBlurForDescriptor(self, blurForDescriptor), setEdgeThreshold(self, edgeThreshold), setFastThreshold(self, fastThreshold), setFirstLevel(self, firstLevel), setMaxFeatures(self, maxFeatures), setNLevels(self, nlevels), setPatchSize(self, patchSize), setScaleFactor(self, scaleFactor), setScoreType(self, scoreType), setWTA_K(self, wta_k)

  Setters for the detector parameters.

read(self, fileName), write(self, fileName), write(self, fs, name)

  Read and write the detector parameters from and to a file or file storage.

Types

@type t() :: %Evision.CUDA.ORB{ref: reference()}

Type that represents a CUDA.ORB struct.

  • ref: reference()

    The underlying erlang resource variable.

Functions

@spec compute(Keyword.t()) :: any() | {:error, String.t()}
compute(self, images, keypoints)
@spec compute(t(), [Evision.Mat.maybe_mat_in()], [[Evision.KeyPoint.t()]]) ::
  {[[Evision.KeyPoint.t()]], [Evision.Mat.t()]} | {:error, String.t()}
@spec compute(t(), Evision.Mat.maybe_mat_in(), [Evision.KeyPoint.t()]) ::
  {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}

Variant 1:

compute

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • images: [Evision.Mat].

    Image set.

Return
  • keypoints: [[Evision.KeyPoint]].

    Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added, for example: SIFT duplicates a keypoint with several dominant orientations (one for each orientation).

  • descriptors: [Evision.Mat].

    Computed descriptors. descriptors[i] are the descriptors computed for keypoints[i]; row j of descriptors[i] is the descriptor for the j-th keypoint in keypoints[i].

Has overloading in C++

Python prototype (for reference only):

compute(images, keypoints[, descriptors]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image (first variant) or image set (second variant).

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Return
  • keypoints: [Evision.KeyPoint].

    Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added, for example: SIFT duplicates a keypoint with several dominant orientations (one for each orientation).

  • descriptors: Evision.Mat.t().

    Computed descriptors. Row j is the descriptor for the j-th keypoint.

Python prototype (for reference only):

compute(image, keypoints[, descriptors]) -> keypoints, descriptors
compute(self, images, keypoints, opts)
@spec compute(
  t(),
  [Evision.Mat.maybe_mat_in()],
  [[Evision.KeyPoint.t()]],
  [{atom(), term()}, ...] | nil
) :: {[[Evision.KeyPoint.t()]], [Evision.Mat.t()]} | {:error, String.t()}
@spec compute(
  t(),
  Evision.Mat.maybe_mat_in(),
  [Evision.KeyPoint.t()],
  [{atom(), term()}, ...] | nil
) ::
  {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}

Variant 1:

compute

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • images: [Evision.Mat].

    Image set.

Return
  • keypoints: [[Evision.KeyPoint]].

    Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added, for example: SIFT duplicates a keypoint with several dominant orientations (one for each orientation).

  • descriptors: [Evision.Mat].

    Computed descriptors. descriptors[i] are the descriptors computed for keypoints[i]; row j of descriptors[i] is the descriptor for the j-th keypoint in keypoints[i].

Has overloading in C++

Python prototype (for reference only):

compute(images, keypoints[, descriptors]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image (first variant) or image set (second variant).

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Return
  • keypoints: [Evision.KeyPoint].

    Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added, for example: SIFT duplicates a keypoint with several dominant orientations (one for each orientation).

  • descriptors: Evision.Mat.t().

    Computed descriptors. Row j is the descriptor for the j-th keypoint.

Python prototype (for reference only):

compute(image, keypoints[, descriptors]) -> keypoints, descriptors
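
Example: a minimal sketch combining detect/2 with the single-image variant of compute/3; `img` is assumed to be an 8-bit grayscale Evision.Mat prepared elsewhere:

    orb = Evision.CUDA.ORB.create()
    # detect keypoints on the CPU-side Mat
    keypoints = Evision.CUDA.ORB.detect(orb, img)
    # compute descriptors for those keypoints; the keypoint list may be filtered
    {keypoints, descriptors} = Evision.CUDA.ORB.compute(orb, img, keypoints)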
computeAsync(named_args)
@spec computeAsync(Keyword.t()) :: any() | {:error, String.t()}
computeAsync(self, image)
@spec computeAsync(t(), Evision.Mat.maybe_mat_in()) ::
  {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec computeAsync(t(), Evision.CUDA.GpuMat.t()) ::
  {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}

Variant 1:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Keyword Arguments
  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.Mat.t().

    Input collection of keypoints.

  • descriptors: Evision.Mat.t().

    Computed descriptors. Row j is the descriptor for j-th keypoint.

Python prototype (for reference only):

computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.CUDA.GpuMat.t().

    Image.

Keyword Arguments
  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.CUDA.GpuMat.t().

    Input collection of keypoints.

  • descriptors: Evision.CUDA.GpuMat.t().

    Computed descriptors. Row j is the descriptor for j-th keypoint.

Python prototype (for reference only):

computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors
computeAsync(self, image, opts)
@spec computeAsync(t(), Evision.Mat.maybe_mat_in(), [{:stream, term()}] | nil) ::
  {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec computeAsync(t(), Evision.CUDA.GpuMat.t(), [{:stream, term()}] | nil) ::
  {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}

Variant 1:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Keyword Arguments
  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.Mat.t().

    Input collection of keypoints.

  • descriptors: Evision.Mat.t().

    Computed descriptors. Row j is the descriptor for j-th keypoint.

Python prototype (for reference only):

computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.CUDA.GpuMat.t().

    Image.

Keyword Arguments
  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.CUDA.GpuMat.t().

    Input collection of keypoints.

  • descriptors: Evision.CUDA.GpuMat.t().

    Computed descriptors. Row j is the descriptor for j-th keypoint.

Python prototype (for reference only):

computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors
@spec convert(Keyword.t()) :: any() | {:error, String.t()}
convert(self, gpu_keypoints)
@spec convert(t(), Evision.Mat.maybe_mat_in()) ::
  [Evision.KeyPoint.t()] | {:error, String.t()}
@spec convert(t(), Evision.CUDA.GpuMat.t()) ::
  [Evision.KeyPoint.t()] | {:error, String.t()}

Variant 1:

convert

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • gpu_keypoints: Evision.Mat.t()
Return
  • keypoints: [Evision.KeyPoint]

Converts keypoints array from internal representation to standard vector.

Python prototype (for reference only):

convert(gpu_keypoints) -> keypoints

Variant 2:

convert

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • gpu_keypoints: Evision.CUDA.GpuMat.t()
Return
  • keypoints: [Evision.KeyPoint]

Converts keypoints array from internal representation to standard vector.

Python prototype (for reference only):

convert(gpu_keypoints) -> keypoints
@spec create() :: t() | {:error, String.t()}

create

Keyword Arguments
  • nfeatures: integer().
  • scaleFactor: float.
  • nlevels: integer().
  • edgeThreshold: integer().
  • firstLevel: integer().
  • wTA_K: integer().
  • scoreType: integer().
  • patchSize: integer().
  • fastThreshold: integer().
  • blurForDescriptor: bool.
Return
  • retval: Evision.CUDA.ORB.t()

Python prototype (for reference only):

create([, nfeatures[, scaleFactor[, nlevels[, edgeThreshold[, firstLevel[, WTA_K[, scoreType[, patchSize[, fastThreshold[, blurForDescriptor]]]]]]]]]]) -> retval
@spec create(Keyword.t()) :: any() | {:error, String.t()}
@spec create(
  [
    blurForDescriptor: term(),
    edgeThreshold: term(),
    fastThreshold: term(),
    firstLevel: term(),
    nfeatures: term(),
    nlevels: term(),
    patchSize: term(),
    scaleFactor: term(),
    scoreType: term(),
    wTA_K: term()
  ]
  | nil
) :: t() | {:error, String.t()}

create

Keyword Arguments
  • nfeatures: integer().
  • scaleFactor: float.
  • nlevels: integer().
  • edgeThreshold: integer().
  • firstLevel: integer().
  • wTA_K: integer().
  • scoreType: integer().
  • patchSize: integer().
  • fastThreshold: integer().
  • blurForDescriptor: bool.
Return
  • retval: Evision.CUDA.ORB.t()

Python prototype (for reference only):

create([, nfeatures[, scaleFactor[, nlevels[, edgeThreshold[, firstLevel[, WTA_K[, scoreType[, patchSize[, fastThreshold[, blurForDescriptor]]]]]]]]]]) -> retval
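
Example: a minimal sketch of the keyword constructor; the parameter values are illustrative, not library defaults:

    # limit the detector to 1000 features over a 6-level pyramid
    orb =
      Evision.CUDA.ORB.create(
        nfeatures: 1000,
        scaleFactor: 1.2,
        nlevels: 6,
        blurForDescriptor: true
      )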
@spec defaultNorm(Keyword.t()) :: any() | {:error, String.t()}
@spec defaultNorm(t()) :: integer() | {:error, String.t()}

defaultNorm

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

defaultNorm() -> retval
descriptorSize(named_args)
@spec descriptorSize(Keyword.t()) :: any() | {:error, String.t()}
@spec descriptorSize(t()) :: integer() | {:error, String.t()}

descriptorSize

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

descriptorSize() -> retval
descriptorType(named_args)
@spec descriptorType(Keyword.t()) :: any() | {:error, String.t()}
@spec descriptorType(t()) :: integer() | {:error, String.t()}

descriptorType

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

descriptorType() -> retval
@spec detect(Keyword.t()) :: any() | {:error, String.t()}
@spec detect(t(), [Evision.Mat.maybe_mat_in()]) ::
  [[Evision.KeyPoint.t()]] | {:error, String.t()}
@spec detect(t(), Evision.Mat.maybe_mat_in()) ::
  [Evision.KeyPoint.t()] | {:error, String.t()}

Variant 1:

detect

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • images: [Evision.Mat].

    Image set.

Keyword Arguments
  • masks: [Evision.Mat].

    Masks for each input image specifying where to look for keypoints (optional). masks[i] is a mask for images[i].

Return
  • keypoints: [[Evision.KeyPoint]].

    The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Has overloading in C++

Python prototype (for reference only):

detect(images[, masks]) -> keypoints

Variant 2:

Detects keypoints in an image (first variant) or image set (second variant).

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Keyword Arguments
  • mask: Evision.Mat.

    Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

Return
  • keypoints: [Evision.KeyPoint].

    The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Python prototype (for reference only):

detect(image[, mask]) -> keypoints
detect(self, images, opts)
@spec detect(t(), [Evision.Mat.maybe_mat_in()], [{:masks, term()}] | nil) ::
  [[Evision.KeyPoint.t()]] | {:error, String.t()}
@spec detect(t(), Evision.Mat.maybe_mat_in(), [{:mask, term()}] | nil) ::
  [Evision.KeyPoint.t()] | {:error, String.t()}

Variant 1:

detect

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • images: [Evision.Mat].

    Image set.

Keyword Arguments
  • masks: [Evision.Mat].

    Masks for each input image specifying where to look for keypoints (optional). masks[i] is a mask for images[i].

Return
  • keypoints: [[Evision.KeyPoint]].

    The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Has overloading in C++

Python prototype (for reference only):

detect(images[, masks]) -> keypoints

Variant 2:

Detects keypoints in an image (first variant) or image set (second variant).

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Keyword Arguments
  • mask: Evision.Mat.

    Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

Return
  • keypoints: [Evision.KeyPoint].

    The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Python prototype (for reference only):

detect(image[, mask]) -> keypoints
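
Example: a minimal sketch of the single-image variant; `img` is assumed to be an Evision.Mat and `mask` an 8-bit Evision.Mat with non-zero values over the region of interest, both prepared elsewhere:

    orb = Evision.CUDA.ORB.create()
    # detect over the whole image
    keypoints = Evision.CUDA.ORB.detect(orb, img)
    # detect only inside the region selected by the mask
    roi_keypoints = Evision.CUDA.ORB.detect(orb, img, mask: mask)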
detectAndCompute(named_args)
@spec detectAndCompute(Keyword.t()) :: any() | {:error, String.t()}
detectAndCompute(self, image, mask)
@spec detectAndCompute(t(), Evision.Mat.maybe_mat_in(), Evision.Mat.maybe_mat_in()) ::
  {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}

detectAndCompute

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • image: Evision.Mat.t()
  • mask: Evision.Mat.t()
Keyword Arguments
  • useProvidedKeypoints: bool.
Return
  • keypoints: [Evision.KeyPoint]
  • descriptors: Evision.Mat.t().

Detects keypoints and computes the descriptors

Python prototype (for reference only):

detectAndCompute(image, mask[, descriptors[, useProvidedKeypoints]]) -> keypoints, descriptors
detectAndCompute(self, image, mask, opts)
@spec detectAndCompute(
  t(),
  Evision.Mat.maybe_mat_in(),
  Evision.Mat.maybe_mat_in(),
  [{:useProvidedKeypoints, term()}] | nil
) :: {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}

detectAndCompute

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • image: Evision.Mat.t()
  • mask: Evision.Mat.t()
Keyword Arguments
  • useProvidedKeypoints: bool.
Return
  • keypoints: [Evision.KeyPoint]
  • descriptors: Evision.Mat.t().

Detects keypoints and computes the descriptors

Python prototype (for reference only):

detectAndCompute(image, mask[, descriptors[, useProvidedKeypoints]]) -> keypoints, descriptors
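
Example: a minimal sketch; `img` is assumed to be an Evision.Mat and `mask` an 8-bit Evision.Mat with non-zero values wherever keypoints may be detected, both prepared elsewhere:

    orb = Evision.CUDA.ORB.create()
    # one call: keypoints plus their descriptors (one row per keypoint)
    {keypoints, descriptors} = Evision.CUDA.ORB.detectAndCompute(orb, img, mask)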
detectAndComputeAsync(named_args)
@spec detectAndComputeAsync(Keyword.t()) :: any() | {:error, String.t()}
detectAndComputeAsync(self, image, mask)
@spec detectAndComputeAsync(
  t(),
  Evision.Mat.maybe_mat_in(),
  Evision.Mat.maybe_mat_in()
) ::
  {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec detectAndComputeAsync(t(), Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()) ::
  {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}

Variant 1:

detectAndComputeAsync

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • image: Evision.Mat.t()
  • mask: Evision.Mat.t()
Keyword Arguments
  • useProvidedKeypoints: bool.
  • stream: Evision.CUDA.Stream.t().
Return
  • keypoints: Evision.Mat.t().
  • descriptors: Evision.Mat.t().

Detects keypoints and computes the descriptors.

Python prototype (for reference only):

detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors

Variant 2:

detectAndComputeAsync

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • image: Evision.CUDA.GpuMat.t()
  • mask: Evision.CUDA.GpuMat.t()
Keyword Arguments
  • useProvidedKeypoints: bool.
  • stream: Evision.CUDA.Stream.t().
Return
  • keypoints: Evision.CUDA.GpuMat.t().
  • descriptors: Evision.CUDA.GpuMat.t().

Detects keypoints and computes the descriptors.

Python prototype (for reference only):

detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors
detectAndComputeAsync(self, image, mask, opts)
@spec detectAndComputeAsync(
  t(),
  Evision.Mat.maybe_mat_in(),
  Evision.Mat.maybe_mat_in(),
  [stream: term(), useProvidedKeypoints: term()] | nil
) :: {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec detectAndComputeAsync(
  t(),
  Evision.CUDA.GpuMat.t(),
  Evision.CUDA.GpuMat.t(),
  [stream: term(), useProvidedKeypoints: term()] | nil
) :: {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}

Variant 1:

detectAndComputeAsync

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • image: Evision.Mat.t()
  • mask: Evision.Mat.t()
Keyword Arguments
  • useProvidedKeypoints: bool.
  • stream: Evision.CUDA.Stream.t().
Return
  • keypoints: Evision.Mat.t().
  • descriptors: Evision.Mat.t().

Detects keypoints and computes the descriptors.

Python prototype (for reference only):

detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors

Variant 2:

detectAndComputeAsync

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • image: Evision.CUDA.GpuMat.t()
  • mask: Evision.CUDA.GpuMat.t()
Keyword Arguments
  • useProvidedKeypoints: bool.
  • stream: Evision.CUDA.Stream.t().
Return
  • keypoints: Evision.CUDA.GpuMat.t().
  • descriptors: Evision.CUDA.GpuMat.t().

Detects keypoints and computes the descriptors.

Python prototype (for reference only):

detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors
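
Example: a minimal sketch of the GpuMat variant, which keeps inputs and outputs on the device; `gpu_img` and `gpu_mask` are assumed to be Evision.CUDA.GpuMat values uploaded elsewhere:

    orb = Evision.CUDA.ORB.create()
    {gpu_keypoints, gpu_descriptors} =
      Evision.CUDA.ORB.detectAndComputeAsync(orb, gpu_img, gpu_mask)
    # gpu_keypoints stays on the device; convert/2 turns it into [Evision.KeyPoint]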
@spec detectAsync(Keyword.t()) :: any() | {:error, String.t()}
detectAsync(self, image)
@spec detectAsync(t(), Evision.Mat.maybe_mat_in()) ::
  Evision.Mat.t() | {:error, String.t()}
@spec detectAsync(t(), Evision.CUDA.GpuMat.t()) ::
  Evision.CUDA.GpuMat.t() | {:error, String.t()}

Variant 1:

Detects keypoints in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Keyword Arguments
  • mask: Evision.Mat.

    Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.Mat.t().

    The detected keypoints.

Python prototype (for reference only):

detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints

Variant 2:

Detects keypoints in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.CUDA.GpuMat.t().

    Image.

Keyword Arguments
  • mask: Evision.CUDA.GpuMat.t().

    Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.CUDA.GpuMat.t().

    The detected keypoints.

Python prototype (for reference only):

detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints
detectAsync(self, image, opts)
@spec detectAsync(
  t(),
  Evision.Mat.maybe_mat_in(),
  [mask: term(), stream: term()] | nil
) ::
  Evision.Mat.t() | {:error, String.t()}
@spec detectAsync(t(), Evision.CUDA.GpuMat.t(), [mask: term(), stream: term()] | nil) ::
  Evision.CUDA.GpuMat.t() | {:error, String.t()}

Variant 1:

Detects keypoints in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.Mat.t().

    Image.

Keyword Arguments
  • mask: Evision.Mat.

    Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.Mat.t().

    The detected keypoints.

Python prototype (for reference only):

detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints

Variant 2:

Detects keypoints in an image.

Positional Arguments
  • self: Evision.CUDA.ORB.t()

  • image: Evision.CUDA.GpuMat.t().

    Image.

Keyword Arguments
  • mask: Evision.CUDA.GpuMat.t().

    Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

  • stream: Evision.CUDA.Stream.t().

    CUDA stream.

Return
  • keypoints: Evision.CUDA.GpuMat.t().

    The detected keypoints.

Python prototype (for reference only):

detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints
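
Example: a minimal sketch pairing detectAsync/2 with convert/2; `gpu_img` is assumed to be an Evision.CUDA.GpuMat uploaded elsewhere:

    orb = Evision.CUDA.ORB.create()
    # detection result stays on the GPU as a GpuMat
    gpu_keypoints = Evision.CUDA.ORB.detectAsync(orb, gpu_img)
    # convert it into a standard list of keypoints on the host
    keypoints = Evision.CUDA.ORB.convert(orb, gpu_keypoints)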
@spec empty(Keyword.t()) :: any() | {:error, String.t()}
@spec empty(t()) :: boolean() | {:error, String.t()}

empty

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: bool

Python prototype (for reference only):

empty() -> retval
getBlurForDescriptor(named_args)
@spec getBlurForDescriptor(Keyword.t()) :: any() | {:error, String.t()}
@spec getBlurForDescriptor(t()) :: boolean() | {:error, String.t()}

getBlurForDescriptor

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: bool

Python prototype (for reference only):

getBlurForDescriptor() -> retval
getDefaultName(named_args)
@spec getDefaultName(Keyword.t()) :: any() | {:error, String.t()}
@spec getDefaultName(t()) :: binary() | {:error, String.t()}

getDefaultName

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: String

Python prototype (for reference only):

getDefaultName() -> retval
getEdgeThreshold(named_args)
@spec getEdgeThreshold(Keyword.t()) :: any() | {:error, String.t()}
@spec getEdgeThreshold(t()) :: integer() | {:error, String.t()}

getEdgeThreshold

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getEdgeThreshold() -> retval
getFastThreshold(named_args)
@spec getFastThreshold(Keyword.t()) :: any() | {:error, String.t()}
@spec getFastThreshold(t()) :: integer() | {:error, String.t()}

getFastThreshold

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getFastThreshold() -> retval
getFirstLevel(named_args)
@spec getFirstLevel(Keyword.t()) :: any() | {:error, String.t()}
@spec getFirstLevel(t()) :: integer() | {:error, String.t()}

getFirstLevel

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getFirstLevel() -> retval
getMaxFeatures(named_args)
@spec getMaxFeatures(Keyword.t()) :: any() | {:error, String.t()}
@spec getMaxFeatures(t()) :: integer() | {:error, String.t()}

getMaxFeatures

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getMaxFeatures() -> retval
@spec getNLevels(Keyword.t()) :: any() | {:error, String.t()}
@spec getNLevels(t()) :: integer() | {:error, String.t()}

getNLevels

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getNLevels() -> retval
getPatchSize(named_args)
@spec getPatchSize(Keyword.t()) :: any() | {:error, String.t()}
@spec getPatchSize(t()) :: integer() | {:error, String.t()}

getPatchSize

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getPatchSize() -> retval
getScaleFactor(named_args)
@spec getScaleFactor(Keyword.t()) :: any() | {:error, String.t()}
@spec getScaleFactor(t()) :: number() | {:error, String.t()}

getScaleFactor

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: double

Python prototype (for reference only):

getScaleFactor() -> retval
getScoreType(named_args)
@spec getScoreType(Keyword.t()) :: any() | {:error, String.t()}
@spec getScoreType(t()) :: integer() | {:error, String.t()}

getScoreType

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getScoreType() -> retval
@spec getWTA_K(Keyword.t()) :: any() | {:error, String.t()}
@spec getWTA_K(t()) :: integer() | {:error, String.t()}

getWTA_K

Positional Arguments
  • self: Evision.CUDA.ORB.t()
Return
  • retval: integer()

Python prototype (for reference only):

getWTA_K() -> retval
@spec read(Keyword.t()) :: any() | {:error, String.t()}
@spec read(t(), Evision.FileNode.t()) :: t() | {:error, String.t()}
@spec read(t(), binary()) :: t() | {:error, String.t()}

Variant 1:

read

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • arg1: Evision.FileNode.t()

Python prototype (for reference only):

read(arg1) -> None

Variant 2:

read

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • fileName: String

Python prototype (for reference only):

read(fileName) -> None
setBlurForDescriptor(named_args)
@spec setBlurForDescriptor(Keyword.t()) :: any() | {:error, String.t()}
setBlurForDescriptor(self, blurForDescriptor)
@spec setBlurForDescriptor(t(), boolean()) :: t() | {:error, String.t()}

setBlurForDescriptor

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • blurForDescriptor: bool

Python prototype (for reference only):

setBlurForDescriptor(blurForDescriptor) -> None
setEdgeThreshold(named_args)
@spec setEdgeThreshold(Keyword.t()) :: any() | {:error, String.t()}
setEdgeThreshold(self, edgeThreshold)
@spec setEdgeThreshold(t(), integer()) :: t() | {:error, String.t()}

setEdgeThreshold

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • edgeThreshold: integer()

Python prototype (for reference only):

setEdgeThreshold(edgeThreshold) -> None
setFastThreshold(named_args)
@spec setFastThreshold(Keyword.t()) :: any() | {:error, String.t()}
setFastThreshold(self, fastThreshold)
@spec setFastThreshold(t(), integer()) :: t() | {:error, String.t()}

setFastThreshold

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • fastThreshold: integer()

Python prototype (for reference only):

setFastThreshold(fastThreshold) -> None
setFirstLevel(named_args)
@spec setFirstLevel(Keyword.t()) :: any() | {:error, String.t()}
setFirstLevel(self, firstLevel)
@spec setFirstLevel(t(), integer()) :: t() | {:error, String.t()}

setFirstLevel

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • firstLevel: integer()

Python prototype (for reference only):

setFirstLevel(firstLevel) -> None
setMaxFeatures(named_args)
@spec setMaxFeatures(Keyword.t()) :: any() | {:error, String.t()}
setMaxFeatures(self, maxFeatures)
@spec setMaxFeatures(t(), integer()) :: t() | {:error, String.t()}

setMaxFeatures

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • maxFeatures: integer()

Python prototype (for reference only):

setMaxFeatures(maxFeatures) -> None
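
Example: a minimal sketch; the setters return the updated Evision.CUDA.ORB.t(), so they can be piped, and the values shown are illustrative:

    orb =
      Evision.CUDA.ORB.create()
      |> Evision.CUDA.ORB.setMaxFeatures(1500)
      |> Evision.CUDA.ORB.setFastThreshold(10)

    Evision.CUDA.ORB.getMaxFeatures(orb)
    #=> 1500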
@spec setNLevels(Keyword.t()) :: any() | {:error, String.t()}
setNLevels(self, nlevels)
@spec setNLevels(t(), integer()) :: t() | {:error, String.t()}

setNLevels

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • nlevels: integer()

Python prototype (for reference only):

setNLevels(nlevels) -> None
setPatchSize(named_args)
@spec setPatchSize(Keyword.t()) :: any() | {:error, String.t()}
setPatchSize(self, patchSize)
@spec setPatchSize(t(), integer()) :: t() | {:error, String.t()}

setPatchSize

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • patchSize: integer()

Python prototype (for reference only):

setPatchSize(patchSize) -> None
setScaleFactor(named_args)
@spec setScaleFactor(Keyword.t()) :: any() | {:error, String.t()}
setScaleFactor(self, scaleFactor)
@spec setScaleFactor(t(), number()) :: t() | {:error, String.t()}

setScaleFactor

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • scaleFactor: double

Python prototype (for reference only):

setScaleFactor(scaleFactor) -> None
setScoreType(named_args)
@spec setScoreType(Keyword.t()) :: any() | {:error, String.t()}
setScoreType(self, scoreType)
@spec setScoreType(t(), integer()) :: t() | {:error, String.t()}

setScoreType

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • scoreType: integer()

Python prototype (for reference only):

setScoreType(scoreType) -> None
@spec setWTA_K(Keyword.t()) :: any() | {:error, String.t()}
@spec setWTA_K(t(), integer()) :: t() | {:error, String.t()}

setWTA_K

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • wta_k: integer()

Python prototype (for reference only):

setWTA_K(wta_k) -> None
@spec write(Keyword.t()) :: any() | {:error, String.t()}
@spec write(t(), binary()) :: t() | {:error, String.t()}

write

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • fileName: String

Python prototype (for reference only):

write(fileName) -> None
@spec write(t(), Evision.FileStorage.t(), binary()) :: t() | {:error, String.t()}

write

Positional Arguments
  • self: Evision.CUDA.ORB.t()
  • fs: Evision.FileStorage.t()
  • name: String

Python prototype (for reference only):

write(fs, name) -> None
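
Example: a minimal sketch of persisting detector parameters by file name (the path is hypothetical); read/2 with a file name reloads them into an existing detector:

    orb = Evision.CUDA.ORB.create(nfeatures: 800)
    # save the current parameters to a YAML file
    Evision.CUDA.ORB.write(orb, "cuda_orb_params.yml")
    # restore them into a freshly created detector
    orb2 = Evision.CUDA.ORB.read(Evision.CUDA.ORB.create(), "cuda_orb_params.yml")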