Complex Exponential Smoothing (CES) model. Calls smooth::ces() from package smooth.

Dictionary

This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("fcst.ces")
lrn("fcst.ces")

Meta Information

  • Task type: “fcst”

  • Predict Types: “response”

  • Feature Types: “logical”, “integer”, “numeric”, “character”, “factor”, “ordered”, “POSIXct”, “Date”

  • Required Packages: mlr3, mlr3forecast, smooth
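
This metadata can also be read directly from a constructed learner; a minimal sketch using standard mlr3::Learner fields (assumes mlr3 and mlr3forecast are loaded):

learner = lrn("fcst.ces")
learner$task_type      # "fcst"
learner$predict_types  # "response"
learner$feature_types  # logical, integer, numeric, character, factor, ordered, POSIXct, Date
learner$packages       # "mlr3", "mlr3forecast", "smooth"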

Parameters

Id           Type       Default      Levels
seasonality  character  none         none, simple, partial, full
lags         untyped    -
initial      character  backcasting  backcasting, optimal, complete
a            untyped    NULL
b            untyped    NULL
loss         character  likelihood   likelihood, MSE, MAE, HAM, MSEh, TMSE, GTMSE, MSCE
holdout      logical    FALSE        TRUE, FALSE
bounds       character  admissible   admissible, none
silent       logical    TRUE         TRUE, FALSE
regressors   character  use          use, select, adapt
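
The parameters above can be set at construction time via mlr3::lrn() or later through the learner's ParamSet. A short sketch; the chosen values are purely illustrative:

# set hyperparameters when constructing the learner
learner = lrn("fcst.ces", seasonality = "simple", loss = "MSE")

# or modify an existing learner through its ParamSet
learner$param_set$values$bounds = "none"
learner$param_set$values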

References

Svetunkov I (2023). “Smooth forecasting with the smooth package in R.” arXiv:2301.01790, https://arxiv.org/abs/2301.01790.

Svetunkov I (2023). Forecasting and Analytics with the Augmented Dynamic Adaptive Model (ADAM), 1st edition. Chapman and Hall/CRC. doi:10.1201/9781003452652, https://openforecast.org/adam/.

See also

Other Learner: LearnerFcst, mlr_learners_fcst.adam, mlr_learners_fcst.arfima, mlr_learners_fcst.arima, mlr_learners_fcst.auto_adam, mlr_learners_fcst.auto_arima, mlr_learners_fcst.auto_ces, mlr_learners_fcst.bats, mlr_learners_fcst.ets, mlr_learners_fcst.nnetar, mlr_learners_fcst.tbats

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> mlr3forecast::LearnerFcst -> LearnerFcstCes
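
Because LearnerFcstCes inherits from these classes, generic mlr3 tooling that expects a mlr3::Learner or mlr3::LearnerRegr also accepts it. A quick check of the hierarchy:

learner = lrn("fcst.ces")
inherits(learner, "Learner")      # TRUE
inherits(learner, "LearnerRegr")  # TRUE
inherits(learner, "LearnerFcst")  # TRUE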

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerFcstCes$new()

Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerFcstCes$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.
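
A deep clone is useful for tweaking a copy of a configured learner without touching the original; a brief sketch (the parameter values are illustrative):

learner = lrn("fcst.ces", seasonality = "full")
learner2 = learner$clone(deep = TRUE)
learner2$param_set$values$seasonality = "simple"
learner$param_set$values$seasonality  # still "full"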

Examples

# Define the Learner and set parameter values
learner = lrn("fcst.ces")
print(learner)
#> 
#> ── <LearnerFcstCes> (fcst.ces): CES ────────────────────────────────────────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3, mlr3forecast, and smooth
#> • Predict Types: [response]
#> • Feature Types: logical, integer, numeric, character, factor, ordered,
#> POSIXct, and Date
#> • Encapsulation: none (fallback: -)
#> • Properties: featureless and missings
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("airpassengers")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# Print the model
print(learner$model)
#> Time elapsed: 0.03 seconds
#> Model estimated using ces() function: CES(none)
#> With backcasting initialisation
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 438.0686
#> a0 + ia1: 1.9916+0.9958i 
#> 
#> Sample size: 96
#> Number of estimated parameters: 3
#> Number of degrees of freedom: 93
#> Information criteria:
#>      AIC     AICc      BIC     BICc 
#> 882.1373 882.3981 889.8303 890.4257 

# Importance method
if ("importance" %in% learner$properties) print(learner$importance)

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 21918.49
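
# The default measure used above is the mean squared error; other mlr3
# regression measures can be passed explicitly, e.g. (values depend on the split):
predictions$score(msr("regr.mae"))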