Exponential smoothing state space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (TBATS) model.
Calls forecast::tbats() from package forecast.
Dictionary
This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

lrn("fcst.tbats")
Meta Information
Task type: “fcst”
Predict Types: “response”, “quantiles”
Feature Types: “logical”, “integer”, “numeric”, “character”, “factor”, “ordered”, “POSIXct”, “Date”
Required Packages: mlr3, mlr3forecast, forecast
Parameters
Id | Type | Default | Levels | Range
---- | ---- | ---- | ---- | ----
use.box.cox | logical | NULL | TRUE, FALSE | -
use.trend | logical | NULL | TRUE, FALSE | -
use.damped.trend | logical | NULL | TRUE, FALSE | -
seasonal.periods | untyped | NULL | - | -
use.arma.errors | logical | NULL | TRUE, FALSE | -
use.parallel | untyped | - | - | -
num.cores | integer | 2 | - | \([1, \infty)\)
bc.lower | numeric | 0 | - | \((-\infty, \infty)\)
bc.upper | numeric | 1 | - | \((-\infty, \infty)\)
biasadj | logical | FALSE | TRUE, FALSE | -
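Hyperparameters can be set at construction or changed afterwards via the parameter set. A minimal sketch (the specific values chosen here are illustrative, not defaults):

```r
library(mlr3)
library(mlr3forecast)

# construct the learner with chosen hyperparameters;
# ids match the parameter table above
learner = lrn("fcst.tbats",
  use.box.cox = TRUE,
  use.trend = TRUE,
  use.damped.trend = FALSE
)

# parameters can also be updated after construction
learner$param_set$set_values(use.arma.errors = TRUE)
```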
References
De Livera AM, Hyndman RJ, Snyder RD (2011). “Forecasting time series with complex seasonal patterns using exponential smoothing.” Journal of the American Statistical Association, 106(496), 1513–1527.
See also
Chapter in the mlr3book: https://mlr3book.mlr-org.com/chapters/chapter2/data_and_basic_modeling.html#sec-learners
Package mlr3learners for a solid collection of essential learners.
Package mlr3extralearners for more learners.
as.data.table(mlr_learners) for a table of available Learners in the running session (depending on the loaded packages).
Package mlr3pipelines to combine learners with pre- and postprocessing steps.
Package mlr3viz for some generic visualizations.
Extension packages for additional task types:
mlr3proba for probabilistic supervised regression and survival analysis.
mlr3cluster for unsupervised clustering.
mlr3tuning for tuning of hyperparameters, mlr3tuningspaces for established default tuning spaces.
Other Learner:
LearnerFcst, mlr_learners_fcst.adam, mlr_learners_fcst.arfima, mlr_learners_fcst.arima, mlr_learners_fcst.auto_adam, mlr_learners_fcst.auto_arima, mlr_learners_fcst.auto_ces, mlr_learners_fcst.bats, mlr_learners_fcst.ces, mlr_learners_fcst.ets, mlr_learners_fcst.nnetar
Super classes
mlr3::Learner
-> mlr3::LearnerRegr
-> mlr3forecast::LearnerFcst
-> mlr3forecast::LearnerFcstForecast
-> LearnerFcstTbats
Methods
Inherited methods
mlr3::Learner$base_learner()
mlr3::Learner$configure()
mlr3::Learner$encapsulate()
mlr3::Learner$format()
mlr3::Learner$help()
mlr3::Learner$predict()
mlr3::Learner$predict_newdata()
mlr3::Learner$print()
mlr3::Learner$reset()
mlr3::Learner$selected_features()
mlr3::Learner$train()
mlr3::LearnerRegr$predict_newdata_fast()
Examples
# Define the Learner and set parameter values
learner = lrn("fcst.tbats")
print(learner)
#>
#> ── <LearnerFcstTbats> (fcst.tbats): TBATS ──────────────────────────────────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3, mlr3forecast, and forecast
#> • Predict Types: [response] and quantiles
#> • Feature Types: logical, integer, numeric, character, factor, ordered,
#> POSIXct, and Date
#> • Encapsulation: none (fallback: -)
#> • Properties: featureless and missings
#> • Other settings: use_weights = 'error'
# Define a Task
task = tsk("airpassengers")
# Create train and test set
ids = partition(task)
# Train the learner on the training ids
learner$train(task, row_ids = ids$train)
# Print the model
print(learner$model)
#> TBATS(0, {0,0}, 1, {<12,5>})
#>
#> Call: forecast::tbats(y = as.ts(task))
#>
#> Parameters
#> Lambda: 0
#> Alpha: 0.6705538
#> Beta: 0.04675065
#> Damping Parameter: 1
#> Gamma-1 Values: 0.004508484
#> Gamma-2 Values: 0.01178641
#>
#> Seed States:
#> [,1]
#> [1,] 4.808973955
#> [2,] -0.006979784
#> [3,] -0.132575268
#> [4,] 0.049822202
#> [5,] -0.009852127
#> [6,] 0.007714254
#> [7,] 0.001576272
#> [8,] 0.035970508
#> [9,] 0.062976543
#> [10,] -0.025967892
#> [11,] -0.036114544
#> [12,] -0.020072721
#> attr(,"lambda")
#> [1] 2.747722e-08
#>
#> Sigma: 0.03474871
#> AIC: 846.5215
# Importance method
if ("importance" %in% learner$properties) print(learner$importance())
# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)
# Score the predictions
predictions$score()
#> regr.mse
#> 1761.171
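The score above uses the default regression measure (regr.mse). Other standard mlr3 measures can be passed explicitly; a short sketch:

```r
# score the same predictions with explicitly chosen measures
# instead of the default regr.mse
predictions$score(msrs(c("regr.rmse", "regr.mae")))
```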