Refits a SLOPE model using the optimal parameters found through cross-validation. This is a convenience function to avoid having to manually extract optimal parameters and refit.
Arguments
- object: an object of class 'TrainedSLOPE', typically from a call to cvSLOPE() or trainSLOPE()
- x: the design matrix. If NULL (default), uses the training data stored in object.
- y: the response vector. If NULL (default), uses the training data stored in object.
- measure: which performance measure to use for selecting optimal parameters. If NULL (default), uses the first measure in the TrainedSLOPE object.
- ...: additional arguments passed to SLOPE()
See also
Other model-tuning:
cvSLOPE(),
plot.TrainedSLOPE(),
summary.TrainedSLOPE(),
trainSLOPE()
Examples
# Cross-validation
tune <- trainSLOPE(
bodyfat$x,
bodyfat$y,
q = c(0.1, 0.2),
measure = "mse"
)
# Refit with optimal parameters
fit <- refit(tune)
# Use the fitted model
coef(fit)
#> 14 x 1 sparse Matrix of class "dgCMatrix"
#>
#> [1,] -17.90196837
#> [2,] 0.06259865
#> [3,] -0.08747577
#> [4,] -0.06916852
#> [5,] -0.47293893
#> [6,] -0.02449479
#> [7,] 0.95438181
#> [8,] -0.20735421
#> [9,] 0.23860527
#> [10,] .
#> [11,] 0.17623295
#> [12,] 0.18071372
#> [13,] 0.45314128
#> [14,] -1.61827678
predict(fit, bodyfat$x)
#> [1] 16.139590 8.854464 18.526405 11.943010 27.248018 16.928320 16.732282
#> [8] 13.898653 9.698268 10.152639 9.007396 12.906688 17.828670 25.035700
#> [15] 24.095781 22.956362 23.522443 19.457441 16.742204 23.172979 21.113975
#> [22] 19.874287 9.570607 11.123790 8.159657 8.514984 9.100399 17.945091
#> [29] 6.354052 11.654804 14.617692 11.004365 6.164772 23.780454 32.277172
#> [36] 37.581033 24.123687 21.888016 44.069213 32.501524 36.752286 32.507342
#> [43] 34.249902 26.381131 10.902965 9.959660 7.791920 9.524858 17.799068
#> [50] 5.534465 14.738358 9.501606 14.239489 10.778114 7.683228 23.570228
#> [57] 25.850581 27.681105 28.648363 26.284994 25.359952 23.538884 27.488145
#> [64] 27.220171 30.092649 26.201025 15.447751 15.711613 8.725268 13.831437
#> [71] 19.518899 12.790915 11.061519 11.594194 16.812274 12.111756 8.941683
#> [78] 19.365905 23.181852 24.796230 22.166963 17.891166 22.975633 21.324753
#> [85] 27.193629 20.935841 17.570345 21.799653 12.693308 14.170937 21.707618
#> [92] 17.619569 11.033014 21.678433 14.757003 16.273033 16.831995 16.071501
#> [99] 17.823416 19.143161 18.496462 19.918173 17.298970 17.175923 23.906672
#> [106] 17.795902 25.864728 22.249684 12.448984 21.056207 19.244989 32.430654
#> [113] 20.879773 19.873511 20.128118 16.435154 16.879762 14.379364 18.169114
#> [120] 13.461510 20.376294 22.503313 14.069162 17.116056 15.483720 20.994237
#> [127] 21.503762 8.995321 18.656908 15.002013 18.321505 20.168561 24.806551
#> [134] 21.075206 16.057769 26.842456 16.672340 26.058387 17.897384 28.753927
#> [141] 21.165762 21.288245 18.908271 5.403915 9.316465 12.525017 22.463493
#> [148] 22.965131 6.071641 26.661136 8.507706 21.861410 4.080357 16.591720
#> [155] 21.538821 11.082969 28.605697 15.593212 11.755283 18.859669 11.800963
#> [162] 17.699667 15.509489 15.292169 26.996477 17.939001 17.781455 19.012221
#> [169] 37.496039 20.159635 11.661598 8.213530 16.667960 16.579749 21.344762
#> [176] 11.160395 15.937969 28.018418 20.680453 23.591262 25.033639 4.487411
#> [183] 16.279264 16.870866 17.903457 11.143301 27.501783 21.971991 22.635190
#> [190] 26.500991 10.372318 30.197544 17.952644 26.634702 16.791305 22.748901
#> [197] 17.079022 19.844715 5.764547 17.972280 15.563015 16.025705 27.503684
#> [204] 15.095763 35.731029 15.061097 22.883405 26.795604 13.811974 14.614901
#> [211] 12.823306 25.306008 15.383158 21.421590 14.243322 41.180302 11.505665
#> [218] 9.045443 23.815598 17.900731 20.845356 30.611012 17.285725 16.386625
#> [225] 20.317912 12.941471 19.095903 21.745386 17.251905 20.606000 20.032969
#> [232] 21.355706 16.458459 23.420025 21.154863 22.439775 21.499304 34.473420
#> [239] 14.272600 25.949076 15.003207 36.746990 27.974722 32.883287 29.422928
#> [246] 13.937621 30.703630 14.742499 25.841769 36.940574 24.546903 27.377912
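The x, y, and measure arguments documented above can also be supplied explicitly. A brief sketch (not run; output depends on the data and the measures recorded in the TrainedSLOPE object):

```r
# Override the performance measure used to select the optimal
# parameters (the measure must be one recorded during training)
fit2 <- refit(tune, measure = "mse")

# Refit on explicitly supplied data instead of the training data
# stored in the TrainedSLOPE object
fit3 <- refit(tune, x = bodyfat$x, y = bodyfat$y)
```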
