The issue: I have already run several xgboost tuning processes, but I only saved the results in text format — more precisely, the metadata where the model parameters and performance metrics are stored.
It has the following structure:
str(p)
'data.frame': 130 obs. of 10 variables:
$ mtry : int 922 1046 512 1317 675 1303 518 1029 1345 1180 ...
$ min_n : int 34 36 73 89 91 32 73 52 75 93 ...
$ tree_depth : int 44 33 43 37 34 48 25 19 38 41 ...
$ learn_rate : num 0.0236 0.0257 0.0292 0.0254 0.0271 0.023 0.025 0.0226 0.0281 0.0641 ...
$ loss_reduction: num 0.0268 0.745 0.148 0.171 0.0275 ...
$ sample_size : num 0.967 0.947 0.789 0.825 0.973 0.521 0.798 0.813 0.993 0.959 ...
$ .metric : chr "mn_log_loss" "mn_log_loss" "mn_log_loss" "mn_log_loss" ...
$ .estimator : chr "binary" "binary" "binary" "binary" ...
$ mean : num 0.423 0.424 0.424 0.424 0.424 0.425 0.425 0.426 0.427 0.427 ...
$ std_err : num 0.000382 0.000439 0.000408 0.000344 0.000368 0.000407 0.000386 0.000398 0.000392 0.000441 ...
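A minimal stand-in for that data frame (the values are copied from the str() output above, truncated to three rows) would be:

```r
# Rebuild a few rows of the saved tuning metadata as a plain data frame;
# values are taken from the str() output above.
p <- data.frame(
  mtry           = c(922L, 1046L, 512L),
  min_n          = c(34L, 36L, 73L),
  tree_depth     = c(44L, 33L, 43L),
  learn_rate     = c(0.0236, 0.0257, 0.0292),
  loss_reduction = c(0.0268, 0.745, 0.148),
  sample_size    = c(0.967, 0.947, 0.789),
  .metric        = "mn_log_loss",
  .estimator     = "binary",
  mean           = c(0.423, 0.424, 0.424),
  std_err        = c(0.000382, 0.000439, 0.000408)
)
```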
Now I want to use this metadata as the `initial` value for a tune_bayes() run:
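The failing call looks roughly like this (xgb_wf, folds, and xgb_params are placeholder names for my workflow, resamples, and parameter set):

```r
library(tune)
library(yardstick)

# Passing the plain metadata data frame as `initial`
# triggers the error below
bayes_res <- tune_bayes(
  xgb_wf,
  resamples  = folds,
  param_info = xgb_params,
  initial    = p,   # the saved metadata data frame
  iter       = 25,
  metrics    = metric_set(mn_log_loss)
)
```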
Error in check_initial():
! initial should be a positive integer or the results of [tune_grid()]
Run rlang::last_trace() to see where the error occurred.
How can I bring the metadata into a matching format without rerunning the time-costly computations?
This is how tune_grid results look:
# Tuning results
# 5-fold cross-validation using stratification
# A tibble: 5 × 4
  splits                  id    .metrics          .notes
1 <split [843580/210897]> Fold1 <tibble [1 × 10]> <tibble [0 × 3]>
2 <split [843582/210895]> Fold2 <tibble [1 × 10]> <tibble [0 × 3]>
3 <split [843582/210895]> Fold3 <tibble [1 × 10]> <tibble [0 × 3]>
4 <split [843582/210895]> Fold4 <tibble [1 × 10]> <tibble [0 × 3]>
5 <split [843582/210895]> Fold5 <tibble [1 × 10]> <tibble [0 × 3]>
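For comparison, my flat metadata is close to what collect_metrics() returns for such a tune_grid result (assuming grid_res holds one), apparently just without the n and .config columns:

```r
library(tune)

# collect_metrics() flattens a tune_grid result into one row per
# parameter combination, much like the saved metadata above
collect_metrics(grid_res)
# expected columns: mtry, min_n, tree_depth, learn_rate, loss_reduction,
# sample_size, .metric, .estimator, mean, n, std_err, .config
```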
I also looked through the tune_grid source, but it did not get me far:
https://github.com/tidymodels/tune/blob/main/R/tune_grid.R
Thanks!