Last updated: Feb 11, 2025
XGBoost Linear is an advanced implementation of a gradient boosting algorithm with a linear model as the base model. Boosting algorithms iteratively learn weak classifiers and add them to a final strong classifier. The XGBoost Linear node in watsonx.ai Studio is implemented in Python.
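To illustrate the boosting idea described above, the following is a minimal, self-contained sketch of gradient boosting with a linear base learner under squared loss: each round fits a linear model to the current residuals and adds it (scaled by a learning rate) to the ensemble. This is a toy illustration only, not the watsonx.ai Studio implementation; the function name `boost_linear` and all parameters are invented for the example.

```python
import numpy as np

def boost_linear(X, y, n_rounds=20, learning_rate=0.5):
    """Toy gradient boosting with a linear base learner (illustrative only;
    not the watsonx.ai Studio implementation)."""
    # Augment the features with an intercept column.
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    pred = np.zeros(len(y))
    total_w = np.zeros(Xb.shape[1])
    for _ in range(n_rounds):
        residual = y - pred                # negative gradient of squared loss
        # Weak learner: least-squares linear fit to the residuals.
        w, *_ = np.linalg.lstsq(Xb, residual, rcond=None)
        total_w += learning_rate * w       # linear models sum to a linear model
        pred += learning_rate * (Xb @ w)
    return total_w, pred

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 3.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5
w, pred = boost_linear(X, y)
```

Because each base learner is linear, the final ensemble is itself a single linear model (the sum of the per-round coefficient vectors), which is what distinguishes XGBoost Linear from tree-based boosting.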
For more information about boosting algorithms, see the XGBoost Tutorials.¹
Note that the XGBoost cross-validation function is not supported in watsonx.ai Studio. You can use the Partition node for this functionality. Also note that XGBoost in watsonx.ai Studio performs one-hot encoding automatically for categorical variables.
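For readers unfamiliar with one-hot encoding, the sketch below shows what the automatic encoding of a categorical variable amounts to, using `pandas.get_dummies` as a stand-in; the node performs an equivalent transformation internally, and the column names here are just what pandas produces, not what the node generates.

```python
import pandas as pd

# A small frame with one categorical and one continuous predictor.
df = pd.DataFrame({"color": ["red", "green", "blue", "green"],
                   "size": [1.0, 2.5, 3.0, 2.0]})

# One-hot encode the categorical column: each category becomes a 0/1 column.
encoded = pd.get_dummies(df, columns=["color"])
```

After encoding, `color` is replaced by the indicator columns `color_blue`, `color_green`, and `color_red`, while `size` is left unchanged.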
¹ "XGBoost Tutorials." Scalable and Flexible Gradient Boosting. Web. © 2015–2016 DMLC.