Auto Augmented Dynamic Adaptive Model (ADAM). Calls smooth::auto.adam() from package smooth.

Dictionary

This mlr3::Learner can be instantiated via the dictionary mlr3::mlr_learners or with the associated sugar function mlr3::lrn():

mlr_learners$get("fcst.auto_adam")
lrn("fcst.auto_adam")

Meta Information

  • Task type: “fcst”

  • Predict Types: “response”

  • Feature Types: “logical”, “integer”, “numeric”, “character”, “factor”, “ordered”, “POSIXct”, “Date”

  • Required Packages: mlr3, mlr3forecast, smooth

Parameters

Id            Type       Default        Levels
model         untyped    "ZXZ"          -
lags          untyped    -              -
orders        untyped    -              -
regressors    character  use            use, select, adapt
occurrence    character  none           none, auto, fixed, general, odds-ratio, inverse-odds-ratio, direct
distribution  character  dnorm          dnorm, dlaplace, ds, dgnorm, dlnorm, dinvgauss, dgamma
outliers      character  ignore         ignore, use, select
holdout       logical    FALSE          TRUE, FALSE
persistence   untyped    NULL           -
phi           untyped    NULL           -
initial       character  optimal        optimal, backcasting, complete
arma          untyped    NULL           -
ic            character  AICc           AICc, AIC, BIC, BICc
bounds        character  usual          usual, admissible, none
silent        logical    TRUE           TRUE, FALSE
parallel      logical    FALSE          TRUE, FALSE
ets           character  conventional   conventional, adam
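Hyperparameters can be set at construction time by passing them to mlr3::lrn(). A minimal sketch using ids and levels from the table above; the chosen values (ic = "BIC", distribution = "dlaplace") are illustrative, not recommendations:

```r
library(mlr3)
library(mlr3forecast)

# Construct the learner with illustrative hyperparameter values;
# the ids match the parameter table above
learner <- lrn("fcst.auto_adam", ic = "BIC", distribution = "dlaplace")

# Inspect the stored parameter values
print(learner$param_set$values)
```

Parameters can also be changed after construction via `learner$param_set$values`.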

References

Svetunkov I (2023). “Smooth forecasting with the smooth package in R.” arXiv:2301.01790, https://arxiv.org/abs/2301.01790.

Svetunkov I (2023). Forecasting and Analytics with the Augmented Dynamic Adaptive Model (ADAM), 1st edition. Chapman and Hall/CRC. doi:10.1201/9781003452652, https://openforecast.org/adam/.

See also

Other Learner: LearnerFcst, mlr_learners_fcst.adam, mlr_learners_fcst.arfima, mlr_learners_fcst.arima, mlr_learners_fcst.auto_arima, mlr_learners_fcst.auto_ces, mlr_learners_fcst.bats, mlr_learners_fcst.ces, mlr_learners_fcst.ets, mlr_learners_fcst.nnetar, mlr_learners_fcst.tbats

Super classes

mlr3::Learner -> mlr3::LearnerRegr -> mlr3forecast::LearnerFcst -> LearnerFcstAutoAdam

Methods

Inherited methods


Method new()

Creates a new instance of this R6 class.

Usage

LearnerFcstAutoAdam$new()


Method clone()

The objects of this class are cloneable with this method.

Usage

LearnerFcstAutoAdam$clone(deep = FALSE)

Arguments

deep

Whether to make a deep clone.

Examples

# Define the Learner and set parameter values
learner = lrn("fcst.auto_adam")
print(learner)
#> 
#> ── <LearnerFcstAutoAdam> (fcst.auto_adam): Auto ADAM ───────────────────────────
#> • Model: -
#> • Parameters: list()
#> • Packages: mlr3, mlr3forecast, and smooth
#> • Predict Types: [response]
#> • Feature Types: logical, integer, numeric, character, factor, ordered,
#> POSIXct, and Date
#> • Encapsulation: none (fallback: -)
#> • Properties: featureless and missings
#> • Other settings: use_weights = 'error'

# Define a Task
task = tsk("airpassengers")

# Create train and test set
ids = partition(task)

# Train the learner on the training ids
learner$train(task, row_ids = ids$train)

# Print the model
print(learner$model)
#> Time elapsed: 3.67 seconds
#> Model estimated using auto.adam() function: ETS(MAM)
#> With backcasting initialisation
#> Distribution assumed in the model: Normal
#> Loss function type: likelihood; Loss function value: 322.1466
#> Persistence vector g:
#>  alpha   beta  gamma 
#> 0.7152 0.0000 0.0000 
#> 
#> Sample size: 96
#> Number of estimated parameters: 4
#> Number of degrees of freedom: 92
#> Information criteria:
#>      AIC     AICc      BIC     BICc 
#> 652.2932 652.7327 662.5506 663.5537 

# Importance method
if ("importance" %in% learner$properties) print(learner$importance)

# Make predictions for the test rows
predictions = learner$predict(task, row_ids = ids$test)

# Score the predictions
predictions$score()
#> regr.mse 
#> 3837.401
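
The default score shown above is regr.mse; any other mlr3 regression measure can be passed to $score() via msr(). A self-contained sketch, assuming mlr3forecast and smooth are installed (exact scores will vary because partition() samples randomly):

```r
library(mlr3)
library(mlr3forecast)

# Rebuild the example pipeline end to end
task <- tsk("airpassengers")
ids <- partition(task)
learner <- lrn("fcst.auto_adam")
learner$train(task, row_ids = ids$train)
predictions <- learner$predict(task, row_ids = ids$test)

# Score with mean absolute error instead of the default regr.mse;
# msr() retrieves a measure from mlr3's measure dictionary
mae <- predictions$score(msr("regr.mae"))
print(mae)
```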