Refits a SLOPE model using the optimal parameters found through cross-validation. This is a convenience function that avoids having to manually extract the optimal parameters and refit the model.
Arguments
- object
an object of class 'TrainedSLOPE', typically from a call to cvSLOPE() or trainSLOPE()
- x
the design matrix
- y
the response vector
- measure
which performance measure to use for selecting optimal parameters. If NULL (default), uses the first measure in the TrainedSLOPE object.
- ...
additional arguments passed to SLOPE()
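For example, a specific measure can be requested and extra arguments forwarded to SLOPE() in a single call. A minimal sketch, reusing the tune object from the Examples below:

# Select parameters by an explicitly named measure ("mse" must be one
# of the measures computed during tuning); anything else in ... is
# passed on to SLOPE()
fit <- refit(tune, bodyfat$x, bodyfat$y, measure = "mse")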
See also
Other model-tuning:
cvSLOPE(),
plot.TrainedSLOPE(),
summary.TrainedSLOPE(),
trainSLOPE()
Examples
# Cross-validation
tune <- trainSLOPE(
bodyfat$x,
bodyfat$y,
q = c(0.1, 0.2),
measure = "mse"
)
# Refit with optimal parameters
fit <- refit(tune, bodyfat$x, bodyfat$y)
# Use the fitted model
coef(fit)
#> 14 x 1 sparse Matrix of class "dgCMatrix"
#>
#> [1,] -19.28461388
#> [2,] 0.06708922
#> [3,] -0.09058704
#> [4,] -0.06255449
#> [5,] -0.47348606
#> [6,] .
#> [7,] 0.93731359
#> [8,] -0.20162704
#> [9,] 0.25316204
#> [10,] .
#> [11,] 0.18045990
#> [12,] 0.18434177
#> [13,] 0.41840188
#> [14,] -1.63205744
predict(fit, bodyfat$x)
#> [1] 16.056086 8.738975 18.566758 11.974815 27.044500 16.914593 16.798387
#> [8] 13.840382 9.753032 10.089792 9.073714 12.954474 17.891100 24.871167
#> [15] 24.055588 23.025851 23.448776 19.450162 16.865911 23.151627 21.052272
#> [22] 19.807806 9.568734 10.996264 8.161681 8.379190 9.055239 17.953986
#> [29] 6.345920 11.756449 14.680824 11.016217 6.041836 23.858234 32.240421
#> [36] 37.694935 24.050434 21.905353 44.124252 32.478270 36.762291 32.350264
#> [43] 34.316493 26.231141 10.917690 10.108578 7.834600 9.467766 17.824409
#> [50] 5.528327 14.645791 9.469559 14.174228 10.852621 7.560541 23.513966
#> [57] 25.806261 27.741143 28.727664 26.311434 25.340592 23.577019 27.452153
#> [64] 27.265305 30.011204 26.188435 15.403085 15.722396 8.741738 13.825343
#> [71] 19.568260 12.791750 11.103615 11.635469 16.905629 12.158969 8.960145
#> [78] 19.464569 23.207286 24.790751 22.114729 18.046562 22.963636 21.402150
#> [85] 27.238478 21.046589 17.695167 21.971795 12.731411 14.175095 21.662340
#> [92] 17.618180 11.099640 21.670617 14.824203 16.381455 16.740131 16.229300
#> [99] 17.745036 19.245068 18.486817 20.001274 17.239481 17.109821 23.916661
#> [106] 17.604272 25.794089 22.220341 12.639908 21.072810 19.225128 32.499840
#> [113] 20.762663 19.850776 20.198644 16.470136 16.847243 14.310972 18.084707
#> [120] 13.558696 20.238892 22.296538 14.256807 17.124878 15.582612 21.047602
#> [127] 21.411110 9.081427 18.670826 15.000158 18.348861 20.088610 24.778855
#> [134] 21.080339 16.012050 26.763065 16.597714 25.962660 17.945118 28.672522
#> [141] 21.135332 21.257444 19.034050 5.391281 9.230621 12.445779 22.400845
#> [148] 23.035719 6.130661 26.677558 8.468103 21.727570 4.099159 16.413423
#> [155] 21.338622 11.000910 28.450914 15.552440 11.371401 18.833282 11.796192
#> [162] 17.665339 15.395541 15.161942 26.895297 17.857982 17.668502 18.827338
#> [169] 37.426754 20.091931 11.586487 8.244465 16.632931 16.536697 21.723251
#> [176] 11.228186 16.002217 28.157996 20.710534 23.510013 25.067446 4.382665
#> [183] 16.258538 16.836298 17.830836 10.976317 27.527502 21.949033 22.616915
#> [190] 26.397300 10.322520 30.129669 17.958701 26.553588 16.811681 22.657546
#> [197] 17.180864 19.796101 5.794936 18.042550 15.673107 15.975623 27.569291
#> [204] 15.177829 35.775456 15.302422 22.833179 26.607801 13.905559 14.631924
#> [211] 12.749561 25.443569 15.393304 21.370995 14.334441 41.113500 11.599029
#> [218] 9.124350 23.845944 17.889809 20.838405 30.672912 17.313969 16.446033
#> [225] 20.315962 13.109515 19.119923 21.856839 17.367775 20.497351 20.138486
#> [232] 21.376196 16.519851 23.366029 21.045522 22.537465 21.532095 34.569323
#> [239] 14.444930 25.917664 15.188497 36.762930 28.134806 32.967696 29.612647
#> [246] 14.090850 30.744269 14.705182 25.882858 37.005425 24.550591 27.396597
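As the description notes, refit() is a shortcut for extracting the optimal parameters and refitting yourself. A rough sketch of that manual workflow follows; the optima component and its q and alpha columns are assumptions about the TrainedSLOPE object's internals, not documented API:

# Manual equivalent (sketch): pull the tuned parameters out of the
# TrainedSLOPE object and pass them to SLOPE() directly. The `optima`
# component and its `q`/`alpha` columns are assumed here, not documented.
best <- tune[["optima"]]
fit_manual <- SLOPE(bodyfat$x, bodyfat$y, q = best$q, alpha = best$alpha)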
