EXGBoost (EXGBoost v0.4.0)
Elixir bindings for the XGBoost library. EXGBoost
provides an implementation of XGBoost that works with
Nx tensors.
Extreme Gradient Boosting (XGBoost) is an optimized distributed gradient boosting library designed to be highly efficient, flexible, and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides parallel tree boosting (also known as GBDT or GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems beyond billions of examples.
Installation
def deps do
[
{:exgboost, "~> 0.4"}
]
end
API Data Structures
EXGBoost's top-level API works directly and only with Nx.Tensor for data representation and with EXGBoost.Booster structs as an internal representation. Direct manipulation of EXGBoost.Booster structs is discouraged.
Basic Usage
key = Nx.Random.key(42)
{x, key} = Nx.Random.normal(key, 0, 1, shape: {10, 5})
{y, key} = Nx.Random.normal(key, 0, 1, shape: {10})
model = EXGBoost.train(x, y)
EXGBoost.predict(model, x)
Training
EXGBoost is designed to feel familiar to users of the Python XGBoost library. EXGBoost.train/2 is the primary entry point for training a model. It accepts an Nx tensor for the features and an Nx tensor for the labels, and returns a trained EXGBoost.Booster struct that can be used for prediction. EXGBoost.train/2 also accepts a keyword list of options that can be used to configure the training process. See the XGBoost documentation for the full list of options.
EXGBoost.train/2 also lets the user provide a custom objective function that will be used to train the model. This is done by passing a function to the :obj option. See EXGBoost.Booster.update/4 for more information.
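As a sketch only (not the library's canonical example), a squared-error objective in the (preds, dtrain) shape described above might look like the following. For illustration we assume dtrain can be used directly as the label tensor; consult EXGBoost.Booster.update/4 for the concrete contract.

```elixir
# Hypothetical custom squared-error objective: receives the current
# predictions and the training data, returns {gradient, hessian}.
# Assumption: `dtrain` stands in for the label tensor here.
obj = fn preds, dtrain ->
  grad = Nx.subtract(preds, dtrain)          # first-order gradient
  hess = Nx.broadcast(1.0, Nx.shape(preds))  # second-order gradient (constant)
  {grad, hess}
end

model = EXGBoost.train(x, y, obj: obj)
```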
Another feature of EXGBoost.train/2 is the ability to provide a validation set for early stopping. This is done by passing a list of 3-tuples to the :evals option. Each 3-tuple should contain an Nx tensor for the features, an Nx tensor for the labels, and a string label for the validation set name. The validation set will be used to calculate the validation error at each iteration of the training process. If the validation error does not improve for :early_stopping_rounds iterations, the training process will stop. See the XGBoost documentation for a more detailed explanation of early stopping.
Early stopping is achieved through the use of callbacks. EXGBoost.train/2 accepts a list of callbacks that will be called at each iteration of the training process. Callbacks can be used to implement custom logic: for example, printing the validation error at each iteration of the training process, or providing a custom setup function for training. See EXGBoost.Training.Callback for more information on callbacks.
Please note that callbacks are called in the order that they are provided. If you provide multiple callbacks that modify the same parameter, the last callback takes precedence. For example, if you provide a callback that sets the :early_stopping_rounds parameter to 10 and then provide a callback that sets it to 20, the :early_stopping_rounds parameter will be set to 20.
You can also pass parameters to be applied to the Booster model using the :params option. These parameters will be applied to the Booster model before training begins, which allows you to set parameters that are not available as options to EXGBoost.train/2. See the XGBoost documentation for a full list of parameters.
EXGBoost.train(
x,
y,
obj: :multi_softprob,
evals: [{x_test, y_test, "test"}],
learning_rates: fn i -> i / 10 end,
num_boost_round: 10,
early_stopping_rounds: 3,
max_depth: 3,
eval_metric: [:rmse, :logloss]
)
Prediction
EXGBoost.predict/2 is the primary entry point for making predictions with a trained model. It accepts an EXGBoost.Booster struct (which is the output of EXGBoost.train/2). EXGBoost.predict/2 returns an Nx tensor containing the predictions and also accepts a keyword list of options that can be used to configure the prediction process.
preds = EXGBoost.train(x, y) |> EXGBoost.predict(x)
Serialization
A Booster can be serialized to a file using the EXGBoost.write_* functions and loaded from a file using the EXGBoost.read_* functions. The file format can be specified using the :format option, which can be either :json or :ubj; the default is :json. If the file already exists, it will NOT be overwritten by default. Boosters can be serialized either to a file or to a binary string.
Boosters can be serialized in three different ways: configuration only, configuration and model, or model only. The dump functions will serialize the Booster to a binary string. Functions named with weights will serialize the model's trained parameters only; this is best used when the model is already trained and only inferences/predictions are going to be performed. Functions named with config will serialize the configuration only. Functions that specify model will serialize both the model parameters and the configuration.
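For example, a minimal save/load round trip using the file-based functions might look like this (the file name and option values here are illustrative):

```elixir
model = EXGBoost.train(x, y)

# Write both the model parameters and the configuration to disk as JSON,
# explicitly allowing an existing file to be replaced.
EXGBoost.write_model(model, "model.json", format: :json, overwrite: true)

# Later: read the model back and use it for inference.
model = EXGBoost.read_model("model.json")
preds = EXGBoost.predict(model, x)
```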
Output Formats
read/write - File.
load/dump - Binary buffer.
Output Contents
config - Save the configuration only.
weights - Save the model parameters only. Use this when you want to save the model to a format that can be ingested by other XGBoost APIs.
model - Save both the model parameters and the configuration.
Summary
Functions
Dump a model config to a buffer as a JSON-encoded string.
Dump a model to a binary encoded in the desired format.
Dump a model's trained parameters to a buffer as a JSON-encoded binary.
Get current values of the global configuration.
Run prediction in-place. Unlike EXGBoost.predict/2, in-place prediction does not cache the prediction result.
Create a new Booster from a config buffer. The config buffer must be from the output of dump_config/2.
Read a model from a buffer and return the Booster.
Read a model's trained parameters from a buffer and return the Booster.
Predict with a booster model against a tensor.
Create a new Booster from a config file. The config file must be from the output of write_config/2.
Read a model from a file and return the Booster.
Read a model's trained parameters from a file and return the Booster.
Set global configuration.
Train a new booster model given a data tensor and a label tensor.
Write a model config to a file as a JSON-encoded string.
Write a model to a file.
Write a model's trained parameters to a file.
Check the build information of the xgboost library.
Check the version of the xgboost library.
Functions
dump_config(booster, opts \\ [])
Dump a model config to a buffer as a JSON-encoded string.
Options
:format - The format to serialize to. Can be either :json or :ubj. The default value is :json.
dump_model(booster, opts \\ [])
Dump a model to a binary encoded in the desired format.
Options
:format - The format to serialize to. Can be either :json or :ubj. The default value is :json.
dump_weights(booster, opts \\ [])
Dump a model's trained parameters to a buffer as a JSON-encoded binary.
Options
:format - The format to serialize to. Can be either :json or :ubj. The default value is :json.
get_config()
@spec get_config() :: map()
Get current values of the global configuration.
Global configuration consists of a collection of parameters that can be
applied in the global scope. See Global Parameters
in EXGBoost.Parameters
for the full list of parameters supported in the global configuration.
inplace_predict(booster, data, opts \\ [])
Run prediction in-place. Unlike EXGBoost.predict/2, in-place prediction does not cache the prediction result.
Options
:base_margin - Base margin used for boosting from an existing model.
:missing - Value used for missing values. If nil, defaults to Nx.Constants.nan().
:predict_type - One of:
  "value" - Output model prediction values.
  "margin" - Output the raw untransformed margin value.
:output_margin - Whether to output the raw untransformed margin value.
:iteration_range - See EXGBoost.predict/2 for details.
:strict_shape - See EXGBoost.predict/2 for details.
Returns an Nx.Tensor containing the predictions.
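A minimal call, assuming a trained booster `model` and a feature tensor `x`:

```elixir
# In-place prediction over a raw tensor; the result is not cached.
preds = EXGBoost.inplace_predict(model, x, predict_type: "value")

# Raw untransformed margins instead of transformed prediction values:
margins = EXGBoost.inplace_predict(model, x, predict_type: "margin")
```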
load_config(buffer, opts \\ [])
Create a new Booster from a config buffer. The config buffer must be from the output of dump_config/2.
Options
:booster (struct of type EXGBoost.Booster) - The Booster to load the model into. If a Booster is provided, the model will be loaded into that Booster. Otherwise, a new Booster will be created. If a Booster is provided, model parameters will be merged with the existing Booster's parameters using Map.merge/2, where the parameters of the provided Booster take precedence.
load_model(buffer)
@spec load_model(binary()) :: EXGBoost.Booster.t()
Read a model from a buffer and return the Booster.
load_weights(buffer)
@spec load_weights(binary()) :: EXGBoost.Booster.t()
Read a model's trained parameters from a buffer and return the Booster.
predict(bst, x, opts \\ [])
Predict with a booster model against a tensor.
The full model will be used unless iteration_range is specified, meaning the user has to either slice the model or use the best_iteration attribute to get predictions from the best model returned by early stopping.
Options
:output_margin - Whether to output the raw untransformed margin value.
:pred_leaf - When this option is on, the output will be an Nx.Tensor of shape {nsamples, ntrees}, where each row indicates the predicted leaf index of each sample in each tree. Note that the leaf index of a tree is unique per tree, but not globally, so you may find leaf 1 in both tree 1 and tree 0.
:pred_contribs - When this is true, the output will be a matrix of size {nsample, nfeats + 1} with each record indicating the feature contributions (SHAP values) for that prediction. The sum of all feature contributions is equal to the raw untransformed margin value of the prediction. Note the final column is the bias term.
:approx_contribs - Approximate the contributions of each feature. Used when pred_contribs or pred_interactions is set to true. Changing the default of this parameter (false) is not recommended.
:pred_interactions - When this is true, the output will be an Nx.Tensor of shape {nsamples, nfeats + 1} indicating the SHAP interaction values for each pair of features. The sum of each row (or column) of the interaction values equals the corresponding SHAP value (from pred_contribs), and the sum of the entire matrix equals the raw untransformed margin value of the prediction. Note the last row and column correspond to the bias term.
:validate_features - When this is true, validate that the Booster's and data's feature_names are identical. Otherwise, it is assumed that the feature_names are the same.
:training - Determines whether the prediction value is used for training. This can affect the dart booster, which performs dropouts during training iterations but uses all trees for inference. If you want to obtain results with dropouts, set this option to true. The option is also set to true when obtaining predictions for a custom objective function.
:iteration_range - Specifies which layer of trees is used in prediction. For example, if a random forest is trained with 100 rounds, specifying an iteration range of (10, 20) means only the forests built during rounds [10, 20) (a half-open interval) are used in this prediction.
:strict_shape - When set to true, the output shape is invariant to whether classification is used. For both value and margin prediction, the output shape is (n_samples, n_groups), with n_groups == 1 when multi-class is not used. Defaults to false, in which case the output shape can be (n_samples, ) if multi-class is not used.
Returns an Nx.Tensor containing the predictions.
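For instance, assuming a trained booster `model` and a feature tensor `x`, the SHAP-related options above can be exercised as follows (a sketch only; the tuple form for :iteration_range is an assumption here):

```elixir
# Per-feature SHAP contributions: shape {nsamples, nfeats + 1};
# the final column is the bias term.
contribs = EXGBoost.predict(model, x, pred_contribs: true)

# Use only the trees built during rounds [0, 10) (half-open interval).
preds = EXGBoost.predict(model, x, iteration_range: {0, 10})
```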
read_config(path, opts \\ [])
Create a new Booster from a config file. The config file must be from the output of write_config/2.
Options
:booster (struct of type EXGBoost.Booster) - The Booster to load the model into. If a Booster is provided, the model will be loaded into that Booster. Otherwise, a new Booster will be created. If a Booster is provided, model parameters will be merged with the existing Booster's parameters using Map.merge/2, where the parameters of the provided Booster take precedence.
read_model(path)
@spec read_model(String.t()) :: EXGBoost.Booster.t()
Read a model from a file and return the Booster.
read_weights(path)
@spec read_weights(String.t()) :: EXGBoost.Booster.t()
Read a model's trained parameters from a file and return the Booster.
set_config(config)
Set global configuration.
Global configuration consists of a collection of parameters that can be
applied in the global scope. See Global Parameters
in EXGBoost.Parameters
for the full list of parameters supported in the global configuration.
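As a sketch (assuming :verbosity is among the supported global parameters and that set_config/1 accepts a keyword list; see EXGBoost.Parameters for the authoritative list):

```elixir
# Raise the global logging verbosity, then inspect the active configuration.
EXGBoost.set_config(verbosity: 2)
config = EXGBoost.get_config()
```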
train(x, y, opts \\ [])
@spec train(Nx.Tensor.t(), Nx.Tensor.t(), Keyword.t()) :: EXGBoost.Booster.t()
Train a new booster model given a data tensor and a label tensor.
Options
:obj - Specify the learning task and the corresponding learning objective. This function must accept two arguments: preds and dtrain. preds is an array of predicted real-valued scores and dtrain is the training data set. The function returns the gradient and the second-order gradient.
:num_boost_rounds - Number of boosting iterations.
:evals - A list of 3-tuples {x, y, label} to use as a validation set for early stopping.
:early_stopping_rounds - Activates early stopping. The target metric needs to increase/decrease (depending on the metric) at least every early_stopping_rounds round(s) to continue training. Requires at least one item in :evals. If there is more than one eval set, the last one will be used. If there is more than one metric in the eval_metric parameter given in the booster's params, the last metric will be used for early stopping. If early stopping occurs, the model will have two additional fields: bst.best_score and bst.best_iteration. If these values are nil, then no early stopping occurred.
:verbose_eval - Requires at least one item in :evals. If verbose_eval is true, the evaluation metric on the validation set is printed at each boosting stage. If verbose_eval is an integer, the evaluation metric on the validation set is printed at every verbose_eval boosting stages. The last boosting stage, or the boosting stage found by using early_stopping_rounds, is also printed. For example, with verbose_eval: 4 and at least one item in :evals, an evaluation metric is printed every 4 boosting stages instead of every boosting stage.
:learning_rates - Either an arity-1 function that accepts an integer epoch and returns the corresponding learning rate, or a list with the same length as num_boost_rounds.
:callbacks - List of EXGBoost.Training.Callback that are called during a given event. It is possible to use predefined callbacks from the EXGBoost.Training.Callback module. Callbacks should be in the form of a keyword list where the only valid keys are :before_training, :after_training, :before_iteration, and :after_iteration. The value of each key should be a list of functions that accept a booster and an iteration and return a booster. Each function will be called at the appropriate time with the booster and the iteration as arguments and should return the booster. If the function returns a booster with a different memory address, the original booster will be replaced with the new booster. If the function returns the original booster, the original booster will be used. If the function returns a booster with the same memory address but different contents, the behavior is undefined.
opts - Refer to EXGBoost.Parameters for the full list of options.
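A minimal logging callback in the keyword-list form described above might look like this (sketch only; each function receives the booster and the iteration and must return the booster):

```elixir
callbacks = [
  after_iteration: [
    fn booster, iteration ->
      # Log progress after every boosting round.
      IO.puts("finished boosting round #{iteration}")
      booster
    end
  ]
]

model = EXGBoost.train(x, y, num_boost_rounds: 5, callbacks: callbacks)
```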
write_config(booster, path, opts \\ [])
Write a model config to a file as a JSON-encoded string.
Options
:format - The format to serialize to. Can be either :json or :ubj. The default value is :json.
:overwrite (boolean/0) - Whether or not to overwrite the file if it already exists. The default value is false.
write_model(booster, path, opts \\ [])
Write a model to a file.
Options
:format - The format to serialize to. Can be either :json or :ubj. The default value is :json.
:overwrite (boolean/0) - Whether or not to overwrite the file if it already exists. The default value is false.
write_weights(booster, path, opts \\ [])
Write a model's trained parameters to a file.
Options
:format - The format to serialize to. Can be either :json or :ubj. The default value is :json.
:overwrite (boolean/0) - Whether or not to overwrite the file if it already exists. The default value is false.
xgboost_build_info()
@spec xgboost_build_info() :: map()
Check the build information of the xgboost library.
Returns a map containing information about the build.
xgboost_version()
Check the version of the xgboost library.
Returns a 3-tuple in the form of {major, minor, patch}.
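For example:

```elixir
{major, minor, patch} = EXGBoost.xgboost_version()
IO.puts("Linked against XGBoost #{major}.#{minor}.#{patch}")
```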