Evision.ML.NormalBayesClassifier (Evision v0.2.9)

Summary

Types

t()

Type that represents an ML.NormalBayesClassifier struct.

Functions

calcError(self, data, test) - Computes error on the training or test dataset

calcError(self, data, test, opts) - Computes error on the training or test dataset

clear(self) - Clears the algorithm state

create() - Creates an empty model

getVarCount(self) - Returns the number of variables in training samples

isClassifier(self) - Returns true if the model is a classifier

isTrained(self) - Returns true if the model is trained

load(filepath) - Loads and creates a serialized NormalBayesClassifier from a file

load(filepath, opts) - Loads and creates a serialized NormalBayesClassifier from a file

predict(self, samples) - Predicts response(s) for the provided sample(s)

predict(self, samples, opts) - Predicts response(s) for the provided sample(s)

predictProb(self, inputs) - Predicts the response for sample(s)

predictProb(self, inputs, opts) - Predicts the response for sample(s)

read(self, fn) - Reads algorithm parameters from a file storage

train(self, trainData) - Trains the statistical model

train(self, trainData, opts) - Trains the statistical model

train(self, samples, layout, responses) - Trains the statistical model

write(self, fs) - Stores algorithm parameters in a file storage

Types

@type t() :: %Evision.ML.NormalBayesClassifier{ref: reference()}

Type that represents an ML.NormalBayesClassifier struct.

  • ref: reference()

    The underlying erlang resource variable.

Functions

calcError(self, data, test)

@spec calcError(Keyword.t()) :: any() | {:error, String.t()}
@spec calcError(t(), Evision.ML.TrainData.t(), boolean()) ::
  {number(), Evision.Mat.t()} | {:error, String.t()}

Computes error on the training or test dataset

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

  • data: Evision.ML.TrainData.t().

    the training data

  • test: bool.

    if true, the error is computed over the test subset of the data; otherwise it is computed over the training subset. Note that if you load a completely different dataset to evaluate an already trained classifier, you will probably want to skip setting a test subset with TrainData::setTrainTestSplitRatio and pass test=false, so that the error is computed over the whole new set.

Return
  • retval: float

  • resp: Evision.Mat.t().

    the optional output responses.

The method uses StatModel::predict to compute the error. For regression models the error is computed as RMS; for classifiers, as a percentage of misclassified samples (0%-100%).

Python prototype (for reference only):

calcError(data, test[, resp]) -> retval, resp
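Example (Elixir): a minimal sketch of comparing the error on the training and test subsets. It assumes model is an already trained classifier, train_data was built with Evision.ML.TrainData.create/3, and Evision.ML.TrainData.setTrainTestSplitRatio/2 is available in your Evision version.

    # reserve 80% of the samples for training, the rest for testing
    Evision.ML.TrainData.setTrainTestSplitRatio(train_data, 0.8)

    # error over the training subset
    {train_error, _resp} = Evision.ML.NormalBayesClassifier.calcError(model, train_data, false)

    # error over the test subset
    {test_error, _resp} = Evision.ML.NormalBayesClassifier.calcError(model, train_data, true)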
calcError(self, data, test, opts)
@spec calcError(
  t(),
  Evision.ML.TrainData.t(),
  boolean(),
  [{atom(), term()}, ...] | nil
) ::
  {number(), Evision.Mat.t()} | {:error, String.t()}

Computes error on the training or test dataset

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

  • data: Evision.ML.TrainData.t().

    the training data

  • test: bool.

    if true, the error is computed over the test subset of the data; otherwise it is computed over the training subset. Note that if you load a completely different dataset to evaluate an already trained classifier, you will probably want to skip setting a test subset with TrainData::setTrainTestSplitRatio and pass test=false, so that the error is computed over the whole new set.

Return
  • retval: float

  • resp: Evision.Mat.t().

    the optional output responses.

The method uses StatModel::predict to compute the error. For regression models the error is computed as RMS; for classifiers, as a percentage of misclassified samples (0%-100%).

Python prototype (for reference only):

calcError(data, test[, resp]) -> retval, resp
@spec clear(Keyword.t()) :: any() | {:error, String.t()}
@spec clear(t()) :: t() | {:error, String.t()}

Clears the algorithm state

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

Python prototype (for reference only):

clear() -> None
@spec create(Keyword.t()) :: any() | {:error, String.t()}
@spec create() :: t() | {:error, String.t()}

create

Return
  • retval: Evision.ML.NormalBayesClassifier.t()

Creates an empty model. Use StatModel::train to train the model after creation.

Python prototype (for reference only):

create() -> retval
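Example (Elixir): a minimal sketch showing that create/0 returns an untrained model, which still needs a call to one of the train variants before it can predict.

    model = Evision.ML.NormalBayesClassifier.create()
    Evision.ML.NormalBayesClassifier.isTrained(model)
    #=> false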
@spec empty(Keyword.t()) :: any() | {:error, String.t()}
@spec empty(t()) :: boolean() | {:error, String.t()}

empty

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
Return
  • retval: bool

Python prototype (for reference only):

empty() -> retval
getDefaultName(named_args)
@spec getDefaultName(Keyword.t()) :: any() | {:error, String.t()}
@spec getDefaultName(t()) :: binary() | {:error, String.t()}

getDefaultName

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
Return
  • retval: String

Returns the algorithm string identifier. This string is used as the top-level xml/yml node tag when the object is saved to a file or string.

Python prototype (for reference only):

getDefaultName() -> retval
@spec getVarCount(Keyword.t()) :: any() | {:error, String.t()}
@spec getVarCount(t()) :: integer() | {:error, String.t()}

Returns the number of variables in training samples

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
Return
  • retval: integer()

Python prototype (for reference only):

getVarCount() -> retval
isClassifier(named_args)
@spec isClassifier(Keyword.t()) :: any() | {:error, String.t()}
@spec isClassifier(t()) :: boolean() | {:error, String.t()}

Returns true if the model is a classifier

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
Return
  • retval: bool

Python prototype (for reference only):

isClassifier() -> retval
@spec isTrained(Keyword.t()) :: any() | {:error, String.t()}
@spec isTrained(t()) :: boolean() | {:error, String.t()}

Returns true if the model is trained

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
Return
  • retval: bool

Python prototype (for reference only):

isTrained() -> retval
@spec load(Keyword.t()) :: any() | {:error, String.t()}
@spec load(binary()) :: t() | {:error, String.t()}

Loads and creates a serialized NormalBayesClassifier from a file

Positional Arguments
  • filepath: String.

    path to serialized NormalBayesClassifier

Keyword Arguments
  • nodeName: String.

    name of node containing the classifier

Return
  • retval: Evision.ML.NormalBayesClassifier.t()

Use NormalBayesClassifier::save to serialize and store a NormalBayesClassifier to disk. Load the NormalBayesClassifier from this file again by calling this function with the path to the file. Optionally, specify the name of the file node containing the classifier.

Python prototype (for reference only):

load(filepath[, nodeName]) -> retval
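Example (Elixir): a minimal sketch of reloading a classifier that was previously stored with save/2 (the file path is illustrative).

    model = Evision.ML.NormalBayesClassifier.load("normal_bayes.yml")
    Evision.ML.NormalBayesClassifier.isTrained(model)
    #=> true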
@spec load(binary(), [{:nodeName, term()}] | nil) :: t() | {:error, String.t()}

Loads and creates a serialized NormalBayesClassifier from a file

Positional Arguments
  • filepath: String.

    path to serialized NormalBayesClassifier

Keyword Arguments
  • nodeName: String.

    name of node containing the classifier

Return
  • retval: Evision.ML.NormalBayesClassifier.t()

Use NormalBayesClassifier::save to serialize and store a NormalBayesClassifier to disk. Load the NormalBayesClassifier from this file again by calling this function with the path to the file. Optionally, specify the name of the file node containing the classifier.

Python prototype (for reference only):

load(filepath[, nodeName]) -> retval
@spec predict(Keyword.t()) :: any() | {:error, String.t()}
@spec predict(t(), Evision.Mat.maybe_mat_in()) ::
  {number(), Evision.Mat.t()} | {:error, String.t()}

Predicts response(s) for the provided sample(s)

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

  • samples: Evision.Mat.

    The input samples, floating-point matrix

Keyword Arguments
  • flags: integer().

    The optional flags, model-dependent. See cv::ml::StatModel::Flags.

Return
  • retval: float

  • results: Evision.Mat.t().

    The optional output matrix of results.

Python prototype (for reference only):

predict(samples[, results[, flags]]) -> retval, results
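Example (Elixir): a minimal sketch of predicting labels for new samples. It assumes the Nx package is available, model is an already trained classifier, and the query rows have the same number of features as the training samples.

    query = Evision.Mat.from_nx(Nx.tensor([[7.5, 8.1]], type: :f32))

    {_retval, results} = Evision.ML.NormalBayesClassifier.predict(model, query)

    # results holds one predicted label per input row
    Evision.Mat.to_nx(results)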
predict(self, samples, opts)
@spec predict(t(), Evision.Mat.maybe_mat_in(), [{:flags, term()}] | nil) ::
  {number(), Evision.Mat.t()} | {:error, String.t()}

Predicts response(s) for the provided sample(s)

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

  • samples: Evision.Mat.

    The input samples, floating-point matrix

Keyword Arguments
  • flags: integer().

    The optional flags, model-dependent. See cv::ml::StatModel::Flags.

Return
  • retval: float

  • results: Evision.Mat.t().

    The optional output matrix of results.

Python prototype (for reference only):

predict(samples[, results[, flags]]) -> retval, results
predictProb(self, inputs)

@spec predictProb(Keyword.t()) :: any() | {:error, String.t()}
@spec predictProb(t(), Evision.Mat.maybe_mat_in()) ::
  {number(), Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}

Predicts the response for sample(s).

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
  • inputs: Evision.Mat
Keyword Arguments
  • flags: integer().
Return
  • retval: float
  • outputs: Evision.Mat.t().
  • outputProbs: Evision.Mat.t().

The method estimates the most probable classes for the input vectors. Input vectors (one or more) are stored as rows of the matrix inputs. With multiple input vectors, outputs contains one predicted class per input row. The predicted class for a single input vector is also returned by the method itself. The vector outputProbs contains the output probabilities corresponding to each element of the result.

Python prototype (for reference only):

predictProb(inputs[, outputs[, outputProbs[, flags]]]) -> retval, outputs, outputProbs
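Example (Elixir): a minimal sketch of retrieving both the predicted classes and the corresponding output probabilities. It assumes the Nx package is available and model is an already trained classifier; the data is illustrative.

    inputs = Evision.Mat.from_nx(Nx.tensor([[1.1, 0.8], [7.7, 8.3]], type: :f32))

    {_retval, outputs, output_probs} =
      Evision.ML.NormalBayesClassifier.predictProb(model, inputs)

    Evision.Mat.to_nx(outputs)       # predicted class per input row
    Evision.Mat.to_nx(output_probs)  # output probabilities for each prediction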
predictProb(self, inputs, opts)
@spec predictProb(t(), Evision.Mat.maybe_mat_in(), [{:flags, term()}] | nil) ::
  {number(), Evision.Mat.t(), Evision.Mat.t()} | {:error, String.t()}

Predicts the response for sample(s).

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
  • inputs: Evision.Mat
Keyword Arguments
  • flags: integer().
Return
  • retval: float
  • outputs: Evision.Mat.t().
  • outputProbs: Evision.Mat.t().

The method estimates the most probable classes for the input vectors. Input vectors (one or more) are stored as rows of the matrix inputs. With multiple input vectors, outputs contains one predicted class per input row. The predicted class for a single input vector is also returned by the method itself. The vector outputProbs contains the output probabilities corresponding to each element of the result.

Python prototype (for reference only):

predictProb(inputs[, outputs[, outputProbs[, flags]]]) -> retval, outputs, outputProbs
@spec read(Keyword.t()) :: any() | {:error, String.t()}
@spec read(t(), Evision.FileNode.t()) :: t() | {:error, String.t()}

Reads algorithm parameters from a file storage

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
  • fn: Evision.FileNode.t()

Python prototype (for reference only):

read(fn) -> None
@spec save(Keyword.t()) :: any() | {:error, String.t()}
@spec save(t(), binary()) :: t() | {:error, String.t()}

save

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
  • filename: String

Saves the algorithm to a file. In order to make this method work, the derived class must implement Algorithm::write(FileStorage& fs).

Python prototype (for reference only):

save(filename) -> None
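Example (Elixir): a minimal sketch of persisting a trained classifier to an OpenCV YAML/XML file (the file path is illustrative; model is assumed to be trained).

    Evision.ML.NormalBayesClassifier.save(model, "normal_bayes.yml")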
@spec train(Keyword.t()) :: any() | {:error, String.t()}
@spec train(t(), Evision.ML.TrainData.t()) :: boolean() | {:error, String.t()}

Trains the statistical model

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

  • trainData: Evision.ML.TrainData.t().

    training data that can be loaded from file using TrainData::loadFromCSV or created with TrainData::create.

Keyword Arguments
  • flags: integer().

    optional flags, depending on the model. Some models can be updated with new training samples rather than completely overwritten (such as NormalBayesClassifier or ANN_MLP).

Return
  • retval: bool

Python prototype (for reference only):

train(trainData[, flags]) -> retval
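Example (Elixir): a minimal sketch of training from an Evision.ML.TrainData container. It assumes the Nx package is available and that Evision.Constant.cv_ROW_SAMPLE/0 exists in your Evision version; the data is illustrative.

    samples =
      Evision.Mat.from_nx(
        Nx.tensor([[1.0, 1.0], [1.2, 0.9], [8.0, 8.0], [7.9, 8.2]], type: :f32)
      )

    responses = Evision.Mat.from_nx(Nx.tensor([[0], [0], [1], [1]], type: :s32))

    train_data =
      Evision.ML.TrainData.create(samples, Evision.Constant.cv_ROW_SAMPLE(), responses)

    model = Evision.ML.NormalBayesClassifier.create()

    # returns true on success; the underlying model is updated in place
    Evision.ML.NormalBayesClassifier.train(model, train_data)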
train(self, trainData, opts)
@spec train(t(), Evision.ML.TrainData.t(), [{:flags, term()}] | nil) ::
  boolean() | {:error, String.t()}

Trains the statistical model

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

  • trainData: Evision.ML.TrainData.t().

    training data that can be loaded from file using TrainData::loadFromCSV or created with TrainData::create.

Keyword Arguments
  • flags: integer().

    optional flags, depending on the model. Some models can be updated with new training samples rather than completely overwritten (such as NormalBayesClassifier or ANN_MLP).

Return
  • retval: bool

Python prototype (for reference only):

train(trainData[, flags]) -> retval
train(self, samples, layout, responses)
@spec train(t(), Evision.Mat.maybe_mat_in(), integer(), Evision.Mat.maybe_mat_in()) ::
  boolean() | {:error, String.t()}

Trains the statistical model

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()

  • samples: Evision.Mat.

    training samples

  • layout: integer().

    See ml::SampleTypes.

  • responses: Evision.Mat.

    vector of responses associated with the training samples.

Return
  • retval: bool

Python prototype (for reference only):

train(samples, layout, responses) -> retval
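Example (Elixir): a minimal sketch of training directly from a samples/responses pair, reusing the samples and responses mats from the train/2 example above. It assumes Evision.Constant.cv_ROW_SAMPLE/0 exists in your Evision version.

    model = Evision.ML.NormalBayesClassifier.create()

    # returns true on success; the underlying model is updated in place
    Evision.ML.NormalBayesClassifier.train(
      model,
      samples,
      Evision.Constant.cv_ROW_SAMPLE(),
      responses
    )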
@spec write(Keyword.t()) :: any() | {:error, String.t()}
@spec write(t(), Evision.FileStorage.t()) :: t() | {:error, String.t()}

Stores algorithm parameters in a file storage

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
  • fs: Evision.FileStorage.t()

Python prototype (for reference only):

write(fs) -> None
@spec write(t(), Evision.FileStorage.t(), binary()) :: t() | {:error, String.t()}

write

Positional Arguments
  • self: Evision.ML.NormalBayesClassifier.t()
  • fs: Evision.FileStorage.t()
  • name: String

Has overloading in C++

Python prototype (for reference only):

write(fs, name) -> None