Evision.CUDA.ORB (Evision v0.2.9)
Summary

Functions

compute - Computes the descriptors for a set of keypoints detected in an image or image set.
computeAsync - Computes the descriptors for a set of keypoints detected in an image (asynchronous variant).
convert - Converts keypoints from the internal representation to a standard vector of Evision.KeyPoint.
create
defaultNorm
descriptorSize
descriptorType
detect - Detects keypoints in an image or image set.
detectAndCompute - Detects keypoints and computes the descriptors.
detectAndComputeAsync - Detects keypoints and computes the descriptors (asynchronous variant).
detectAsync - Detects keypoints in an image (asynchronous variant).
empty
getBlurForDescriptor
getDefaultName
getEdgeThreshold
getFastThreshold
getFirstLevel
getMaxFeatures
getNLevels
getPatchSize
getScaleFactor
getScoreType
getWTA_K
read
setBlurForDescriptor
setEdgeThreshold
setFastThreshold
setFirstLevel
setMaxFeatures
setNLevels
setPatchSize
setScaleFactor
setScoreType
setWTA_K
write
Types
@type t() :: %Evision.CUDA.ORB{ref: reference()}
Type that represents a CUDA.ORB struct.

- ref: reference(). The underlying Erlang resource variable.
Functions
@spec compute(t(), [Evision.Mat.maybe_mat_in()], [[Evision.KeyPoint.t()]]) :: {[[Evision.KeyPoint.t()]], [Evision.Mat.t()]} | {:error, String.t()}
@spec compute(t(), Evision.Mat.maybe_mat_in(), [Evision.KeyPoint.t()]) :: {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}
Variant 1:

compute

Positional Arguments
- self: Evision.CUDA.ORB.t()
- images: [Evision.Mat]. Image set.

Return
- keypoints: [[Evision.KeyPoint]]. Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added; for example, SIFT duplicates a keypoint with several dominant orientations (one for each orientation).
- descriptors: [Evision.Mat]. Computed descriptors. For the image-set variant, descriptors[i] are the descriptors computed for keypoints[i]; row j of descriptors (or descriptors[i]) is the descriptor for the j-th keypoint.

Has overloading in C++

Python prototype (for reference only):
compute(images, keypoints[, descriptors]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image (first variant) or image set (second variant).

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Return
- keypoints: [Evision.KeyPoint]. Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added; for example, SIFT duplicates a keypoint with several dominant orientations (one for each orientation).
- descriptors: Evision.Mat.t(). Computed descriptors. For the image-set variant, descriptors[i] are the descriptors computed for keypoints[i]; row j of descriptors (or descriptors[i]) is the descriptor for the j-th keypoint.

Python prototype (for reference only):
compute(image, keypoints[, descriptors]) -> keypoints, descriptors
@spec compute( t(), [Evision.Mat.maybe_mat_in()], [[Evision.KeyPoint.t()]], [{atom(), term()}, ...] | nil ) :: {[[Evision.KeyPoint.t()]], [Evision.Mat.t()]} | {:error, String.t()}
@spec compute( t(), Evision.Mat.maybe_mat_in(), [Evision.KeyPoint.t()], [{atom(), term()}, ...] | nil ) :: {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}
Variant 1:

compute

Positional Arguments
- self: Evision.CUDA.ORB.t()
- images: [Evision.Mat]. Image set.

Return
- keypoints: [[Evision.KeyPoint]]. Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added; for example, SIFT duplicates a keypoint with several dominant orientations (one for each orientation).
- descriptors: [Evision.Mat]. Computed descriptors. For the image-set variant, descriptors[i] are the descriptors computed for keypoints[i]; row j of descriptors (or descriptors[i]) is the descriptor for the j-th keypoint.

Has overloading in C++

Python prototype (for reference only):
compute(images, keypoints[, descriptors]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image (first variant) or image set (second variant).

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Return
- keypoints: [Evision.KeyPoint]. Input collection of keypoints. Keypoints for which a descriptor cannot be computed are removed. Sometimes new keypoints can be added; for example, SIFT duplicates a keypoint with several dominant orientations (one for each orientation).
- descriptors: Evision.Mat.t(). Computed descriptors. For the image-set variant, descriptors[i] are the descriptors computed for keypoints[i]; row j of descriptors (or descriptors[i]) is the descriptor for the j-th keypoint.

Python prototype (for reference only):
compute(image, keypoints[, descriptors]) -> keypoints, descriptors
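A minimal usage sketch of the single-image variant, assuming a CUDA-enabled Evision build; the image path is made up and error handling ({:error, reason} returns) is omitted:

    orb = Evision.CUDA.ORB.create()
    image = Evision.imread("test.jpg")

    # Detect keypoints first, then compute a descriptor for each of them.
    keypoints = Evision.CUDA.ORB.detect(orb, image)
    {keypoints, descriptors} = Evision.CUDA.ORB.compute(orb, image, keypoints)

    # `descriptors` is an Evision.Mat with one row per surviving keypoint.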
@spec computeAsync(t(), Evision.Mat.maybe_mat_in()) :: {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec computeAsync(t(), Evision.CUDA.GpuMat.t()) :: {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}
Variant 1:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Keyword Arguments
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.Mat.t(). Input collection of keypoints.
- descriptors: Evision.Mat.t(). Computed descriptors. Row j is the descriptor for the j-th keypoint.

Python prototype (for reference only):
computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.CUDA.GpuMat.t(). Image.

Keyword Arguments
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.CUDA.GpuMat.t(). Input collection of keypoints.
- descriptors: Evision.CUDA.GpuMat.t(). Computed descriptors. Row j is the descriptor for the j-th keypoint.

Python prototype (for reference only):
computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors
@spec computeAsync(t(), Evision.Mat.maybe_mat_in(), [{:stream, term()}] | nil) :: {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec computeAsync(t(), Evision.CUDA.GpuMat.t(), [{:stream, term()}] | nil) :: {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}
Variant 1:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Keyword Arguments
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.Mat.t(). Input collection of keypoints.
- descriptors: Evision.Mat.t(). Computed descriptors. Row j is the descriptor for the j-th keypoint.

Python prototype (for reference only):
computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors

Variant 2:

Computes the descriptors for a set of keypoints detected in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.CUDA.GpuMat.t(). Image.

Keyword Arguments
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.CUDA.GpuMat.t(). Input collection of keypoints.
- descriptors: Evision.CUDA.GpuMat.t(). Computed descriptors. Row j is the descriptor for the j-th keypoint.

Python prototype (for reference only):
computeAsync(image[, keypoints[, descriptors[, stream]]]) -> keypoints, descriptors
@spec convert(t(), Evision.Mat.maybe_mat_in()) :: [Evision.KeyPoint.t()] | {:error, String.t()}
@spec convert(t(), Evision.CUDA.GpuMat.t()) :: [Evision.KeyPoint.t()] | {:error, String.t()}
Variant 1:
convert
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- gpu_keypoints:
Evision.Mat
Return
- keypoints:
[Evision.KeyPoint]
Converts the keypoints array from its internal representation to a standard vector.
Python prototype (for reference only):
convert(gpu_keypoints) -> keypoints
Variant 2:
convert
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- gpu_keypoints:
Evision.CUDA.GpuMat.t()
Return
- keypoints:
[Evision.KeyPoint]
Converts the keypoints array from its internal representation to a standard vector.
Python prototype (for reference only):
convert(gpu_keypoints) -> keypoints
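A rough sketch of where convert/2 fits, assuming a CUDA-enabled build (detectAsync is documented further below; the file name is illustrative):

    orb = Evision.CUDA.ORB.create()
    image = Evision.imread("scene.jpg")

    # detectAsync returns keypoints in the detector's internal matrix form ...
    keypoints_mat = Evision.CUDA.ORB.detectAsync(orb, image)

    # ... and convert/2 turns that matrix into a list of Evision.KeyPoint structs.
    keypoints = Evision.CUDA.ORB.convert(orb, keypoints_mat)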
create
Keyword Arguments
- nfeatures: integer()
- scaleFactor: float
- nlevels: integer()
- edgeThreshold: integer()
- firstLevel: integer()
- wTA_K: integer()
- scoreType: integer()
- patchSize: integer()
- fastThreshold: integer()
- blurForDescriptor: bool
Return
- retval:
Evision.CUDA.ORB.t()
Python prototype (for reference only):
create([, nfeatures[, scaleFactor[, nlevels[, edgeThreshold[, firstLevel[, WTA_K[, scoreType[, patchSize[, fastThreshold[, blurForDescriptor]]]]]]]]]]) -> retval
@spec create(Keyword.t()) :: any() | {:error, String.t()}
@spec create( [ blurForDescriptor: term(), edgeThreshold: term(), fastThreshold: term(), firstLevel: term(), nfeatures: term(), nlevels: term(), patchSize: term(), scaleFactor: term(), scoreType: term(), wTA_K: term() ] | nil ) :: t() | {:error, String.t()}
create
Keyword Arguments
- nfeatures: integer()
- scaleFactor: float
- nlevels: integer()
- edgeThreshold: integer()
- firstLevel: integer()
- wTA_K: integer()
- scoreType: integer()
- patchSize: integer()
- fastThreshold: integer()
- blurForDescriptor: bool
Return
- retval:
Evision.CUDA.ORB.t()
Python prototype (for reference only):
create([, nfeatures[, scaleFactor[, nlevels[, edgeThreshold[, firstLevel[, WTA_K[, scoreType[, patchSize[, fastThreshold[, blurForDescriptor]]]]]]]]]]) -> retval
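As an illustrative sketch, any subset of the keyword arguments may be given; omitted ones keep OpenCV's ORB defaults (the particular values below are arbitrary):

    # Default construction
    orb = Evision.CUDA.ORB.create()

    # Or tune selected parameters at construction time
    orb =
      Evision.CUDA.ORB.create(
        nfeatures: 1000,
        scaleFactor: 1.2,
        nlevels: 8,
        fastThreshold: 20,
        blurForDescriptor: true
      )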
@spec defaultNorm(Keyword.t()) :: any() | {:error, String.t()}
@spec defaultNorm(t()) :: integer() | {:error, String.t()}
defaultNorm
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
defaultNorm() -> retval
@spec descriptorSize(Keyword.t()) :: any() | {:error, String.t()}
@spec descriptorSize(t()) :: integer() | {:error, String.t()}
descriptorSize
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
descriptorSize() -> retval
@spec descriptorType(Keyword.t()) :: any() | {:error, String.t()}
@spec descriptorType(t()) :: integer() | {:error, String.t()}
descriptorType
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
descriptorType() -> retval
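These three getters describe the descriptors the detector will produce; a small sketch (the return values are plain integers / OpenCV constants):

    orb = Evision.CUDA.ORB.create()

    Evision.CUDA.ORB.descriptorSize(orb)  # descriptor length per keypoint
    Evision.CUDA.ORB.descriptorType(orb)  # element type of the descriptor matrix
    Evision.CUDA.ORB.defaultNorm(orb)     # norm to use when matching these descriptors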
@spec detect(t(), [Evision.Mat.maybe_mat_in()]) :: [[Evision.KeyPoint.t()]] | {:error, String.t()}
@spec detect(t(), Evision.Mat.maybe_mat_in()) :: [Evision.KeyPoint.t()] | {:error, String.t()}
Variant 1:

detect

Positional Arguments
- self: Evision.CUDA.ORB.t()
- images: [Evision.Mat]. Image set.

Keyword Arguments
- masks: [Evision.Mat]. Masks for each input image specifying where to look for keypoints (optional). masks[i] is the mask for images[i].

Return
- keypoints: [[Evision.KeyPoint]]. The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Has overloading in C++

Python prototype (for reference only):
detect(images[, masks]) -> keypoints

Variant 2:

Detects keypoints in an image (first variant) or image set (second variant).

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Keyword Arguments
- mask: Evision.Mat. Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

Return
- keypoints: [Evision.KeyPoint]. The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Python prototype (for reference only):
detect(image[, mask]) -> keypoints
@spec detect(t(), [Evision.Mat.maybe_mat_in()], [{:masks, term()}] | nil) :: [[Evision.KeyPoint.t()]] | {:error, String.t()}
@spec detect(t(), Evision.Mat.maybe_mat_in(), [{:mask, term()}] | nil) :: [Evision.KeyPoint.t()] | {:error, String.t()}
Variant 1:

detect

Positional Arguments
- self: Evision.CUDA.ORB.t()
- images: [Evision.Mat]. Image set.

Keyword Arguments
- masks: [Evision.Mat]. Masks for each input image specifying where to look for keypoints (optional). masks[i] is the mask for images[i].

Return
- keypoints: [[Evision.KeyPoint]]. The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Has overloading in C++

Python prototype (for reference only):
detect(images[, masks]) -> keypoints

Variant 2:

Detects keypoints in an image (first variant) or image set (second variant).

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Keyword Arguments
- mask: Evision.Mat. Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.

Return
- keypoints: [Evision.KeyPoint]. The detected keypoints. For the image-set variant, keypoints[i] is the set of keypoints detected in images[i].

Python prototype (for reference only):
detect(image[, mask]) -> keypoints
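A short sketch of both call shapes, assuming the image paths exist; the single-image form returns a flat keypoint list, while the image-set form returns one list per image:

    orb = Evision.CUDA.ORB.create()
    image_a = Evision.imread("frame_a.png")
    image_b = Evision.imread("frame_b.png")

    # Single image -> [Evision.KeyPoint]
    keypoints = Evision.CUDA.ORB.detect(orb, image_a)

    # Image set -> [[Evision.KeyPoint]], one list per input image
    [keypoints_a, keypoints_b] = Evision.CUDA.ORB.detect(orb, [image_a, image_b])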
@spec detectAndCompute(t(), Evision.Mat.maybe_mat_in(), Evision.Mat.maybe_mat_in()) :: {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}
detectAndCompute
Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat
- mask: Evision.Mat

Keyword Arguments
- useProvidedKeypoints: bool

Return
- keypoints: [Evision.KeyPoint]
- descriptors: Evision.Mat.t()

Detects keypoints and computes the descriptors.

Python prototype (for reference only):
detectAndCompute(image, mask[, descriptors[, useProvidedKeypoints]]) -> keypoints, descriptors
@spec detectAndCompute( t(), Evision.Mat.maybe_mat_in(), Evision.Mat.maybe_mat_in(), [{:useProvidedKeypoints, term()}] | nil ) :: {[Evision.KeyPoint.t()], Evision.Mat.t()} | {:error, String.t()}
detectAndCompute
Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat
- mask: Evision.Mat

Keyword Arguments
- useProvidedKeypoints: bool

Return
- keypoints: [Evision.KeyPoint]
- descriptors: Evision.Mat.t()

Detects keypoints and computes the descriptors.

Python prototype (for reference only):
detectAndCompute(image, mask[, descriptors[, useProvidedKeypoints]]) -> keypoints, descriptors
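A minimal sketch of the one-shot detect-and-describe call. The mask here is built with Evision.Mat.ones/2, which is assumed to take a {rows, cols} shape and a type atom; its size must match the image, and a full-ones mask means the whole image is searched:

    orb = Evision.CUDA.ORB.create()
    image = Evision.imread("scene.jpg")

    # Assumed 640x480 input; a full-ones 8-bit mask searches everywhere.
    mask = Evision.Mat.ones({480, 640}, :u8)

    {keypoints, descriptors} = Evision.CUDA.ORB.detectAndCompute(orb, image, mask)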
@spec detectAndComputeAsync( t(), Evision.Mat.maybe_mat_in(), Evision.Mat.maybe_mat_in() ) :: {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec detectAndComputeAsync(t(), Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()) :: {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}
Variant 1:

detectAndComputeAsync

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat
- mask: Evision.Mat

Keyword Arguments
- useProvidedKeypoints: bool
- stream: Evision.CUDA.Stream.t()

Return
- keypoints: Evision.Mat.t()
- descriptors: Evision.Mat.t()

Detects keypoints and computes the descriptors.

Python prototype (for reference only):
detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors

Variant 2:

detectAndComputeAsync

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.CUDA.GpuMat.t()
- mask: Evision.CUDA.GpuMat.t()

Keyword Arguments
- useProvidedKeypoints: bool
- stream: Evision.CUDA.Stream.t()

Return
- keypoints: Evision.CUDA.GpuMat.t()
- descriptors: Evision.CUDA.GpuMat.t()

Detects keypoints and computes the descriptors.

Python prototype (for reference only):
detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors
@spec detectAndComputeAsync( t(), Evision.Mat.maybe_mat_in(), Evision.Mat.maybe_mat_in(), [stream: term(), useProvidedKeypoints: term()] | nil ) :: {Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}
@spec detectAndComputeAsync( t(), Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t(), [stream: term(), useProvidedKeypoints: term()] | nil ) :: {Evision.CUDA.GpuMat.t(), Evision.CUDA.GpuMat.t()} | {:error, String.t()}
Variant 1:

detectAndComputeAsync

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat
- mask: Evision.Mat

Keyword Arguments
- useProvidedKeypoints: bool
- stream: Evision.CUDA.Stream.t()

Return
- keypoints: Evision.Mat.t()
- descriptors: Evision.Mat.t()

Detects keypoints and computes the descriptors.

Python prototype (for reference only):
detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors

Variant 2:

detectAndComputeAsync

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.CUDA.GpuMat.t()
- mask: Evision.CUDA.GpuMat.t()

Keyword Arguments
- useProvidedKeypoints: bool
- stream: Evision.CUDA.Stream.t()

Return
- keypoints: Evision.CUDA.GpuMat.t()
- descriptors: Evision.CUDA.GpuMat.t()

Detects keypoints and computes the descriptors.

Python prototype (for reference only):
detectAndComputeAsync(image, mask[, keypoints[, descriptors[, useProvidedKeypoints[, stream]]]]) -> keypoints, descriptors
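A sketch of the asynchronous one-shot call; unlike detectAndCompute above, the keypoints come back in matrix form, so convert/2 is used to obtain Evision.KeyPoint structs (the mask construction and image size are assumptions, as before):

    orb = Evision.CUDA.ORB.create()
    image = Evision.imread("scene.jpg")
    mask = Evision.Mat.ones({480, 640}, :u8)   # assumed to match the image size

    {keypoints_mat, descriptors} =
      Evision.CUDA.ORB.detectAndComputeAsync(orb, image, mask)

    keypoints = Evision.CUDA.ORB.convert(orb, keypoints_mat)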
@spec detectAsync(t(), Evision.Mat.maybe_mat_in()) :: Evision.Mat.t() | {:error, String.t()}
@spec detectAsync(t(), Evision.CUDA.GpuMat.t()) :: Evision.CUDA.GpuMat.t() | {:error, String.t()}
Variant 1:

Detects keypoints in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Keyword Arguments
- mask: Evision.Mat. Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.Mat.t(). The detected keypoints.

Python prototype (for reference only):
detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints

Variant 2:

Detects keypoints in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.CUDA.GpuMat.t(). Image.

Keyword Arguments
- mask: Evision.CUDA.GpuMat.t(). Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.CUDA.GpuMat.t(). The detected keypoints.

Python prototype (for reference only):
detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints
@spec detectAsync( t(), Evision.Mat.maybe_mat_in(), [mask: term(), stream: term()] | nil ) :: Evision.Mat.t() | {:error, String.t()}
@spec detectAsync(t(), Evision.CUDA.GpuMat.t(), [mask: term(), stream: term()] | nil) :: Evision.CUDA.GpuMat.t() | {:error, String.t()}
Variant 1:

Detects keypoints in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.Mat. Image.

Keyword Arguments
- mask: Evision.Mat. Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.Mat.t(). The detected keypoints.

Python prototype (for reference only):
detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints

Variant 2:

Detects keypoints in an image.

Positional Arguments
- self: Evision.CUDA.ORB.t()
- image: Evision.CUDA.GpuMat.t(). Image.

Keyword Arguments
- mask: Evision.CUDA.GpuMat.t(). Mask specifying where to look for keypoints (optional). It must be an 8-bit integer matrix with non-zero values in the region of interest.
- stream: Evision.CUDA.Stream.t(). CUDA stream.

Return
- keypoints: Evision.CUDA.GpuMat.t(). The detected keypoints.

Python prototype (for reference only):
detectAsync(image[, keypoints[, mask[, stream]]]) -> keypoints
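A sketch showing the keyword arguments: the mask restricts detection to a region of interest, and an Evision.CUDA.Stream can be passed as stream: to queue the work on a specific stream (the mask construction relies on the same Evision.Mat.ones/2 assumption as above):

    orb = Evision.CUDA.ORB.create()
    image = Evision.imread("scene.jpg")

    # 8-bit mask, non-zero where keypoints may be detected (assumed image size).
    mask = Evision.Mat.ones({480, 640}, :u8)

    keypoints_mat = Evision.CUDA.ORB.detectAsync(orb, image, mask: mask)
    keypoints = Evision.CUDA.ORB.convert(orb, keypoints_mat)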
@spec empty(Keyword.t()) :: any() | {:error, String.t()}
@spec empty(t()) :: boolean() | {:error, String.t()}
empty
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
bool
Python prototype (for reference only):
empty() -> retval
@spec getBlurForDescriptor(Keyword.t()) :: any() | {:error, String.t()}
@spec getBlurForDescriptor(t()) :: boolean() | {:error, String.t()}
getBlurForDescriptor
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
bool
Python prototype (for reference only):
getBlurForDescriptor() -> retval
@spec getDefaultName(Keyword.t()) :: any() | {:error, String.t()}
@spec getDefaultName(t()) :: binary() | {:error, String.t()}
getDefaultName
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
String
Python prototype (for reference only):
getDefaultName() -> retval
@spec getEdgeThreshold(Keyword.t()) :: any() | {:error, String.t()}
@spec getEdgeThreshold(t()) :: integer() | {:error, String.t()}
getEdgeThreshold
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getEdgeThreshold() -> retval
@spec getFastThreshold(Keyword.t()) :: any() | {:error, String.t()}
@spec getFastThreshold(t()) :: integer() | {:error, String.t()}
getFastThreshold
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getFastThreshold() -> retval
@spec getFirstLevel(Keyword.t()) :: any() | {:error, String.t()}
@spec getFirstLevel(t()) :: integer() | {:error, String.t()}
getFirstLevel
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getFirstLevel() -> retval
@spec getMaxFeatures(Keyword.t()) :: any() | {:error, String.t()}
@spec getMaxFeatures(t()) :: integer() | {:error, String.t()}
getMaxFeatures
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getMaxFeatures() -> retval
@spec getNLevels(Keyword.t()) :: any() | {:error, String.t()}
@spec getNLevels(t()) :: integer() | {:error, String.t()}
getNLevels
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getNLevels() -> retval
@spec getPatchSize(Keyword.t()) :: any() | {:error, String.t()}
@spec getPatchSize(t()) :: integer() | {:error, String.t()}
getPatchSize
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getPatchSize() -> retval
@spec getScaleFactor(Keyword.t()) :: any() | {:error, String.t()}
@spec getScaleFactor(t()) :: number() | {:error, String.t()}
getScaleFactor
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
double
Python prototype (for reference only):
getScaleFactor() -> retval
@spec getScoreType(Keyword.t()) :: any() | {:error, String.t()}
@spec getScoreType(t()) :: integer() | {:error, String.t()}
getScoreType
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getScoreType() -> retval
@spec getWTA_K(Keyword.t()) :: any() | {:error, String.t()}
@spec getWTA_K(t()) :: integer() | {:error, String.t()}
getWTA_K
Positional Arguments
- self:
Evision.CUDA.ORB.t()
Return
- retval:
integer()
Python prototype (for reference only):
getWTA_K() -> retval
@spec read(t(), Evision.FileNode.t()) :: t() | {:error, String.t()}
@spec read(t(), binary()) :: t() | {:error, String.t()}
Variant 1:
read
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- arg1:
Evision.FileNode
Python prototype (for reference only):
read(arg1) -> None
Variant 2:
read
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- fileName:
String
Python prototype (for reference only):
read(fileName) -> None
setBlurForDescriptor
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- blurForDescriptor:
bool
Python prototype (for reference only):
setBlurForDescriptor(blurForDescriptor) -> None
setEdgeThreshold
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- edgeThreshold:
integer()
Python prototype (for reference only):
setEdgeThreshold(edgeThreshold) -> None
setFastThreshold
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- fastThreshold:
integer()
Python prototype (for reference only):
setFastThreshold(fastThreshold) -> None
setFirstLevel
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- firstLevel:
integer()
Python prototype (for reference only):
setFirstLevel(firstLevel) -> None
setMaxFeatures
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- maxFeatures:
integer()
Python prototype (for reference only):
setMaxFeatures(maxFeatures) -> None
setNLevels
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- nlevels:
integer()
Python prototype (for reference only):
setNLevels(nlevels) -> None
setPatchSize
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- patchSize:
integer()
Python prototype (for reference only):
setPatchSize(patchSize) -> None
setScaleFactor
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- scaleFactor:
double
Python prototype (for reference only):
setScaleFactor(scaleFactor) -> None
setScoreType
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- scoreType:
integer()
Python prototype (for reference only):
setScoreType(scoreType) -> None
setWTA_K
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- wta_k:
integer()
Python prototype (for reference only):
setWTA_K(wta_k) -> None
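All setters take the detector struct and the new value; a sketch of tuning an existing detector, assuming (as with other Evision set* functions) that each setter returns the updated struct:

    orb = Evision.CUDA.ORB.create()

    orb = Evision.CUDA.ORB.setMaxFeatures(orb, 1500)
    orb = Evision.CUDA.ORB.setFastThreshold(orb, 10)
    orb = Evision.CUDA.ORB.setBlurForDescriptor(orb, true)

    # Read a value back with the corresponding getter.
    Evision.CUDA.ORB.getMaxFeatures(orb)  #=> 1500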
write
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- fileName:
String
Python prototype (for reference only):
write(fileName) -> None
@spec write(t(), Evision.FileStorage.t(), binary()) :: t() | {:error, String.t()}
write
Positional Arguments
- self:
Evision.CUDA.ORB.t()
- fs:
Evision.FileStorage
- name:
String
Python prototype (for reference only):
write(fs, name) -> None
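A sketch of the intended save/load flow using the file-name variants of write and read; the file name is made up, and whether CUDA ORB serializes all of its parameters depends on the underlying OpenCV implementation:

    orb = Evision.CUDA.ORB.create(nfeatures: 2000)

    # Persist settings to a YAML file, then load them into a fresh instance.
    Evision.CUDA.ORB.write(orb, "cuda_orb_params.yml")

    orb2 =
      Evision.CUDA.ORB.create()
      |> Evision.CUDA.ORB.read("cuda_orb_params.yml")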