Forecasting with XGBoost
XGBoost (which stands for eXtreme Gradient Boosting) is a powerful and widely used machine learning algorithm, popular for its efficiency, scalability, and high performance in both regression and classification tasks.

In the previous recipe, you were introduced to the EnsembleForecaster and AutoEnsembleForecaster classes in sktime, which aggregate the predictions (for example, a weighted or unweighted average) from different forecasters. This technique is considered a type of voting ensemble. You were also introduced to several regressors from scikit-learn, such as RandomForestRegressor, an ensemble algorithm (random forest) that builds multiple small models (decision trees), each trained on a different bootstrap sample (random sampling with replacement from the dataset). This sampling is applied at both the row and column (feature) levels. This technique is considered a type of bagging ensemble (or bootstrap aggregation).
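Before diving into XGBoost itself, it may help to see the core idea behind using any gradient boosting regressor for forecasting: reduce the time series to a tabular regression problem by turning past lags into features, then forecast recursively. The following is a minimal sketch of that reduction. The synthetic series, the window length of 12, and the choice of scikit-learn's GradientBoostingRegressor (used here to keep the sketch dependency-light; xgboost's XGBRegressor could be dropped in with the same fit/predict interface) are all illustrative assumptions, not the recipe's exact code.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Illustrative synthetic series: trend + yearly seasonality + noise
rng = np.random.default_rng(0)
t = np.arange(120)
y = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, 120)

# Reduce forecasting to regression: each row holds the previous
# `window` lagged values; the target is the next observation.
window = 12
X = np.array([y[i:i + window] for i in range(len(y) - window)])
target = y[window:]

# Any gradient boosting regressor works here; XGBRegressor from the
# xgboost package is a common drop-in replacement.
model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(X, target)

# Recursive multi-step forecast: feed each prediction back in
# as the newest lag for the next step.
history = list(y[-window:])
forecasts = []
for _ in range(6):
    pred = model.predict(np.array(history[-window:]).reshape(1, -1))[0]
    forecasts.append(pred)
    history.append(pred)
```

In sktime, this same reduction is wrapped by make_reduction, which lets you plug a tabular regressor into the library's forecasting API without hand-building the lag matrix.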