Scholar.Neighbors.KNNClassifier (Scholar v0.3.0)
K-Nearest Neighbors Classifier.
Performs classification by computing the (weighted) majority vote among the k-nearest neighbors.
Summary

Functions

fit(x, y, opts)
  Fits a k-NN classifier model.

predict(model, x)
  Predicts classes using a k-NN classifier model.

predict_probability(model, x)
  Predicts class probabilities using a k-NN classifier model.

Functions
fit(x, y, opts)

Fits a k-NN classifier model.
Options
:algorithm (atom/0) - Algorithm used to compute the k-nearest neighbors. Possible values:

  :brute - Brute-force search. See Scholar.Neighbors.BruteKNN for more details.

  :kd_tree - k-d tree. See Scholar.Neighbors.KDTree for more details.

  :random_projection_forest - Random projection forest. See Scholar.Neighbors.RandomProjectionForest for more details.

  Module implementing fit(data, opts) and predict(model, query). predict/2 must return a tuple containing the indices of the k-nearest neighbors of the query points as well as the distances between the query points and their k-nearest neighbors. See the sketch below the options list.

  The default value is :brute.

:num_classes (pos_integer/0) - Required. The number of possible classes.

:weights - Weight function used in prediction. Possible values:

  :uniform - Uniform weights. All points in each neighborhood are weighted equally.

  :distance - Weight points by the inverse of their distance. In this case, closer neighbors of a query point have a greater influence than neighbors which are further away.

  The default value is :uniform.

Algorithm-specific options (e.g. :num_neighbors, :metric) should be provided together with the classifier options.
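For the module form of :algorithm, here is a minimal sketch. The module name CustomBrute is hypothetical; it simply wraps Scholar.Neighbors.BruteKNN, which already provides fit/2 and predict/2 with the required {indices, distances} contract, and it assumes the classifier forwards the algorithm-specific options (here :num_neighbors) to the module's fit/2 as described above.

# Hypothetical custom algorithm module (the name CustomBrute is illustrative only).
# Any module exposing fit/2 and predict/2 with this contract can be passed as :algorithm.
defmodule CustomBrute do
  # Build whatever search structure the algorithm needs from the training data.
  def fit(data, opts), do: Scholar.Neighbors.BruteKNN.fit(data, opts)

  # Must return {neighbor_indices, neighbor_distances} for the query points.
  def predict(model, query), do: Scholar.Neighbors.BruteKNN.predict(model, query)
end

x = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
y = Nx.tensor([0, 0, 0, 1, 1])
model = Scholar.Neighbors.KNNClassifier.fit(x, y, algorithm: CustomBrute, num_neighbors: 3, num_classes: 2)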
Examples
iex> x = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
iex> y = Nx.tensor([0, 0, 0, 1, 1])
iex> model = Scholar.Neighbors.KNNClassifier.fit(x, y, num_neighbors: 3, num_classes: 2)
iex> model.algorithm
Scholar.Neighbors.BruteKNN.fit(x, num_neighbors: 3)
iex> model.labels
Nx.tensor([0, 0, 0, 1, 1])

iex> x = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
iex> y = Nx.tensor([0, 0, 0, 1, 1])
iex> model = Scholar.Neighbors.KNNClassifier.fit(x, y, algorithm: :kd_tree, num_neighbors: 3, metric: {:minkowski, 1}, num_classes: 2)
iex> model.algorithm
Scholar.Neighbors.KDTree.fit(x, num_neighbors: 3, metric: {:minkowski, 1})
iex> model.labels
Nx.tensor([0, 0, 0, 1, 1])

iex> x = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
iex> y = Nx.tensor([0, 0, 0, 1, 1])
iex> key = Nx.Random.key(12)
iex> model = Scholar.Neighbors.KNNClassifier.fit(x, y, algorithm: :random_projection_forest, num_neighbors: 2, num_classes: 2, num_trees: 4, key: key)
iex> model.algorithm
Scholar.Neighbors.RandomProjectionForest.fit(x, num_neighbors: 2, num_trees: 4, key: key)
iex> model.labels
Nx.tensor([0, 0, 0, 1, 1])
predict(model, x)

Predicts classes using a k-NN classifier model.
Examples
iex> x_train = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
iex> y_train = Nx.tensor([0, 0, 0, 1, 1])
iex> model = Scholar.Neighbors.KNNClassifier.fit(x_train, y_train, num_neighbors: 3, num_classes: 2)
iex> x = Nx.tensor([[1, 3], [4, 2], [3, 6]])
iex> Scholar.Neighbors.KNNClassifier.predict(model, x)
Nx.tensor([0, 0, 1])
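To see which neighbors drive a particular vote, the fitted algorithm stored in model.algorithm can be queried directly. This is a minimal sketch, assuming the default :brute algorithm from the example above and that Scholar.Neighbors.BruteKNN.predict/2 follows the {indices, distances} contract described in the options:

x_train = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
y_train = Nx.tensor([0, 0, 0, 1, 1])
model = Scholar.Neighbors.KNNClassifier.fit(x_train, y_train, num_neighbors: 3, num_classes: 2)
# The underlying :brute algorithm returns {indices, distances} for the query points.
{indices, _distances} = Scholar.Neighbors.BruteKNN.predict(model.algorithm, Nx.tensor([[3, 6]]))
# Looking up those indices in model.labels gives the neighbor labels (two 1s and one 0),
# so the uniform-weight majority vote predicts class 1 for [3, 6].
Nx.take(model.labels, indices)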
predict_probability(model, x)

Predicts class probabilities using a k-NN classifier model.
Examples
iex> x_train = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
iex> y_train = Nx.tensor([0, 0, 0, 1, 1])
iex> model = Scholar.Neighbors.KNNClassifier.fit(x_train, y_train, num_neighbors: 3, num_classes: 2)
iex> x = Nx.tensor([[1, 3], [4, 2], [3, 6]])
iex> Scholar.Neighbors.KNNClassifier.predict_probability(model, x)
Nx.tensor(
[
[1.0, 0.0],
[1.0, 0.0],
[0.3333333432674408, 0.6666666865348816]
]
)
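The :weights option documented above also applies here. A minimal sketch using inverse-distance weighting (expected output omitted, since the exact probabilities depend on the computed distances):

x_train = Nx.tensor([[1, 2], [2, 3], [3, 4], [4, 5], [5, 6]])
y_train = Nx.tensor([0, 0, 0, 1, 1])
model = Scholar.Neighbors.KNNClassifier.fit(x_train, y_train, num_neighbors: 3, num_classes: 2, weights: :distance)
# Each neighbor's vote is scaled by the inverse of its distance to the query point,
# so the returned probabilities are no longer simple vote fractions.
Scholar.Neighbors.KNNClassifier.predict_probability(model, Nx.tensor([[1, 3], [4, 2], [3, 6]]))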