Exponential smoothing state space model with Box-Cox transformation, ARMA errors, Trend and Seasonal components (TBATS) model. Calls forecast::tbats() from package forecast.

Dictionary

This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("fcst.tbats")
lrn("fcst.tbats")

Meta Information

  • Task type: “fcst”

  • Predict Types: “response”, “quantiles”

  • Feature Types: “logical”, “integer”, “numeric”, “character”, “factor”, “ordered”, “POSIXct”, “Date”

  • Required Packages: mlr3, mlr3forecast, forecast

Parameters

Id                Type     Default  Levels       Range
use.box.cox       logical  NULL     TRUE, FALSE  -
use.trend         logical  NULL     TRUE, FALSE  -
use.damped.trend  logical  NULL     TRUE, FALSE  -
seasonal.periods  untyped  NULL     -            -
use.arma.errors   logical  NULL     TRUE, FALSE  -
use.parallel      untyped  -        -            -
num.cores         integer  2        -            \([1, \infty)\)
bc.lower          integer  0        -            \((-\infty, \infty)\)
bc.upper          integer  1        -            \((-\infty, \infty)\)
biasadj           logical  FALSE    TRUE, FALSE  -
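As a sketch of how these hyperparameters are used, they can be set at construction time via lrn() or afterwards through the learner's param_set; the specific values below are illustrative only, not recommended defaults:

```r
library(mlr3)
library(mlr3forecast)

# Construct the learner with illustrative hyperparameter values
learner = lrn("fcst.tbats",
  use.box.cox = TRUE,       # force a Box-Cox transformation
  use.trend = TRUE,         # include a trend component
  use.damped.trend = FALSE, # but do not damp it
  use.arma.errors = TRUE    # model residuals with ARMA errors
)

# Values can also be changed after construction
learner$param_set$values$biasadj = FALSE
```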

References

De Livera AM, Hyndman RJ, Snyder RD (2011). "Forecasting time series with complex seasonal patterns using exponential smoothing." Journal of the American Statistical Association, 106(496), 1513–1527.

See also

Other Learner: LearnerFcst, mlr_learners_fcst.adam, mlr_learners_fcst.arfima, mlr_learners_fcst.arima, mlr_learners_fcst.auto_adam, mlr_learners_fcst.auto_arima, mlr_learners_fcst.auto_ces, mlr_learners_fcst.bats, mlr_learners_fcst.ces, mlr_learners_fcst.ets, mlr_learners_fcst.nnetar

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerFcstTbats$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerFcstTbats$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner and set parameter values
learner = lrn("fcst.tbats")
print(learner)
#> 
#> ── <LearnerFcstTbats> (fcst.tbats): TBATS ──────────────────────────────────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3, mlr3forecast, and forecast
#> • Predict Types: [response] and quantiles
#> • Feature Types: logical, integer, numeric, character, factor, ordered,
#> POSIXct, and Date
#> • Encapsulation: none (fallback: -)
#> • Properties: featureless and missings
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("airpassengers")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# Print the model
print(learner$model)
#> TBATS(0, {0,0}, 1, {<12,5>})
#> 
#> Call: forecast::tbats(y = as.ts(task))
#> 
#> Parameters
#>   Lambda: 0
#>   Alpha: 0.6705538
#>   Beta: 0.04675065
#>   Damping Parameter: 1
#>   Gamma-1 Values: 0.004508484
#>   Gamma-2 Values: 0.01178641
#> 
#> Seed States:
#>               [,1]
#>  [1,]  4.808973955
#>  [2,] -0.006979784
#>  [3,] -0.132575268
#>  [4,]  0.049822202
#>  [5,] -0.009852127
#>  [6,]  0.007714254
#>  [7,]  0.001576272
#>  [8,]  0.035970508
#>  [9,]  0.062976543
#> [10,] -0.025967892
#> [11,] -0.036114544
#> [12,] -0.020072721
#> attr(,"lambda")
#> [1] 2.747722e-08
#> 
#> Sigma: 0.03474871
#> AIC: 846.5215

# Importance method
if ("importance" %in% learner$properties) print(learner$importance())

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 1761.171
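As an illustrative extension of the example above (the choice of measure is an assumption, not part of the original output), the same prediction object can be scored against any compatible mlr3 regression measure retrieved with msr():

```r
# Continuing the example session above: score the held-out
# predictions with mean absolute error instead of the default MSE
predictions$score(msr("regr.mae"))
```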