
Why does tune_race_anova() fail at a task that tune_grid() is able to handle?

ctnzwj (registered member)
2023-01-26 11:58

I think the problem in this particular case may be that your metric is nearly identical for every candidate model. You can see this with a simpler example, and it looks like the same thing happens when you add in more preprocessing steps:

library(tidymodels)
library(finetune)
#> Registered S3 method overwritten by 'finetune':
#>   method            from
#>   obj_sum.tune_race tune
data(cells, package = "modeldata")

set.seed(31)
split <- cells %>% 
  select(-case) %>%
  initial_split(prop = 0.8)

set.seed(234)
folds <- training(split) %>% vfold_cv(v = 3)
folds
#> #  3-fold cross-validation 
#> # A tibble: 3 × 2
#>   splits             id   
#>   <list>             <chr>
#> 1 <split [1076/539]> Fold1
#> 2 <split [1077/538]> Fold2
#> 3 <split [1077/538]> Fold3

xgb_spec <- boost_tree(mode = "classification", trees = tune()) 

set.seed(234)
workflow(class ~ ., xgb_spec) %>% 
  tune_grid(
    resamples = folds,
    grid = 5
  ) %>%
  collect_metrics()
#> # A tibble: 10 × 7
#>    trees .metric  .estimator  mean     n std_err .config             
#>    <dbl> <chr>    <chr>      <dbl> <int>   <dbl> <chr>
#>  1   354 accuracy binary     0.834     3 0.00619 Preprocessor1_Model1
#>  2   354 roc_auc  binary     0.907     3 0.00340 Preprocessor1_Model1
#>  3   736 accuracy binary     0.836     3 0.00619 Preprocessor1_Model2
#>  4   736 roc_auc  binary     0.907     3 0.00314 Preprocessor1_Model2
#>  5   972 accuracy binary     0.835     3 0.00638 Preprocessor1_Model3
#>  6   972 roc_auc  binary     0.907     3 0.00316 Preprocessor1_Model3
#>  7  1396 accuracy binary     0.836     3 0.00619 Preprocessor1_Model4
#>  8  1396 roc_auc  binary     0.907     3 0.00302 Preprocessor1_Model4
#>  9  1949 accuracy binary     0.835     3 0.00577 Preprocessor1_Model5
#> 10  1949 roc_auc  binary     0.907     3 0.00292 Preprocessor1_Model5

Created on 2022-02-22 by the reprex package (v2.0.1)

I think finetune may not handle the case where the metrics are essentially the same for all candidates, and it errors in a confusing way.
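For reference, the racing call that triggers the confusing error would look something like the sketch below, assuming the same `folds` and `xgb_spec` as in the reprex above. `tune_race_anova()` evaluates all candidates on an initial set of resamples, then repeatedly fits an ANOVA model on the interim metrics and eliminates candidates that are statistically worse; when every candidate scores nearly the same, that interim model is where things can break down:

```r
library(tidymodels)
library(finetune)

# Sketch: same resamples and model spec as the reprex above.
# control_race(verbose_elim = TRUE) prints the elimination steps,
# which can help show how far the race gets before it errors.
set.seed(234)
race_res <- workflow(class ~ ., xgb_spec) %>%
  tune_race_anova(
    resamples = folds,
    grid = 5,
    control = control_race(verbose_elim = TRUE)
  )
```

With metrics this flat, racing has nothing to gain anyway: no candidate can be eliminated with confidence, so plain `tune_grid()` is the more robust choice here.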
