Passing metric argument value "ROC" to the caretSBF function
Our objective is to use the ROC summary metric for model selection while running sbf(), the selection-by-filtering function, for feature selection.

The BreastCancer dataset from the mlbench package is used as a reproducible example for running train() and sbf() with metric = "Accuracy" and metric = "ROC".

We want to make sure that sbf() takes the metric argument to optimize the model, in the way train() and rfe() do. To this end, we planned to use the train() function together with sbf(): the caretSBF$fit function makes a call to train(), and caretSBF is passed to sbfControl.
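As a side note (our own inspection, after loading caret; not part of the original output), printing the stock caretSBF object shows that its fit element is a thin wrapper that forwards its arguments to train(), which is why we expected metric to reach the inner resampling:

names(caretSBF)   # elements of the stock caretSBF list
caretSBF$fit      # the fit element forwards x, y and ... to train()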
From the output, it seems the metric argument is used only for the inner resampling and not for the sbf part, i.e. for the outer resampling of the output, the metric argument was not applied as it is by train() and rfe().

Since we used caretSBF, which uses train(), it appears that the scope of the metric argument is limited to train() and that it is therefore not passed on to sbf.
We would like to clarify whether sbf() uses the metric argument for optimizing the model, i.e. for the outer resampling.

Here is our work on the reproducible example, running train() and sbf() with metric set to Accuracy and to ROC.
I. DATA SECTION
## Loading required packages
library(mlbench)
library(caret)
## Loading `BreastCancer` Dataset from *mlbench* package
data("BreastCancer")
## Data cleaning for missing values
# Remove rows/observation with NA Values in any of the columns
BrC1 <- BreastCancer[complete.cases(BreastCancer),]
# Removing Class and Id Column and keeping just Numeric Predictors
Num_Pred <- BrC1[,2:10]
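A quick sanity check of the cleaned data (our own addition; the counts should match the 683 samples and 9 predictors reported by train() below):

dim(Num_Pred)       # expected: 683 rows, 9 predictor columns
table(BrC1$Class)   # class counts for benign vs malignant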
II. CUSTOMIZED SUMMARY FUNCTION
Defining the fiveStats summary function
fiveStats <- function(...) c(twoClassSummary(...),
defaultSummary(...))
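To illustrate what fiveStats() returns, here is a toy example we made up (the observations and probabilities below are invented and only meant to show the statistic names; caret's twoClassSummary() computes the ROC column from the probability column named after the first class level):

toy <- data.frame(obs  = factor(c("benign","malignant","benign","malignant"),
                                levels = c("benign","malignant")),
                  pred = factor(c("benign","malignant","malignant","malignant"),
                                levels = c("benign","malignant")),
                  benign    = c(0.9, 0.2, 0.4, 0.1),
                  malignant = c(0.1, 0.8, 0.6, 0.9))
fiveStats(toy, lev = levels(toy$obs))
# returns named statistics: ROC, Sens, Spec, Accuracy, Kappa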
III. TRAIN SECTION
Defining trControl
trCtrl <- trainControl(method="repeatedcv", number=10,
repeats=1, classProbs = TRUE, summaryFunction = fiveStats)
TRAIN + METRIC = "Accuracy"
set.seed(1)
TR_acc <- train(Num_Pred,BrC1$Class, method="rf",metric="Accuracy",
trControl = trCtrl,tuneGrid=expand.grid(.mtry=c(2,3,4,5)))
TR_acc
# Random Forest
#
# 683 samples
# 9 predictor
# 2 classes: 'benign', 'malignant'
#
# No pre-processing
# Resampling: Cross-Validated (10 fold, repeated 1 times)
# Summary of sample sizes: 615, 615, 614, 614, 614, 615, ...
# Resampling results across tuning parameters:
#
# mtry ROC Sens Spec Accuracy Kappa
# 2 0.9936532 0.9729798 0.9833333 0.9765772 0.9490311
# 3 0.9936544 0.9729293 0.9791667 0.9750853 0.9457534
# 4 0.9929957 0.9684343 0.9750000 0.9706948 0.9361373
# 5 0.9922907 0.9684343 0.9666667 0.9677536 0.9295782
#
# Accuracy was used to select the optimal model using the largest value.
# The final value used for the model was mtry = 2.
TRAIN + METRIC = "ROC"
set.seed(1)
TR_roc <- train(Num_Pred,BrC1$Class, method="rf",metric="ROC",
trControl = trCtrl,tuneGrid=expand.grid(.mtry=c(2,3,4,5)))
TR_roc
# Random Forest
#
# 683 samples
# 9 predictor
# 2 classes: 'benign', 'malignant'
#
# No pre-processing
# Resampling: Cross-Validated (10 fold, repeated 1 times)
# Summary of sample sizes: 615, 615, 614, 614, 614, 615, ...
# Resampling results across tuning parameters:
#
# mtry ROC Sens Spec Accuracy Kappa
# 2 0.9936532 0.9729798 0.9833333 0.9765772 0.9490311
# 3 0.9936544 0.9729293 0.9791667 0.9750853 0.9457534
# 4 0.9929957 0.9684343 0.9750000 0.9706948 0.9361373
# 5 0.9922907 0.9684343 0.9666667 0.9677536 0.9295782
#
# ROC was used to select the optimal model using the largest value.
# The final value used for the model was mtry = 3.
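As an extra check of our own (both runs used set.seed(1), so they share the same folds), the two fits can be compared on their resampled statistics with resamples():

rs <- resamples(list(ACC = TR_acc, ROC = TR_roc))
summary(rs)   # distribution of resampled ROC, Sens, Spec, Accuracy and Kappa for both fits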
IV. EDIT caretSBF
Editing the caretSBF summary function
caretSBF$summary <- fiveStats
V. SBF SECTION
Defining sbfControl
sbfCtrl <- sbfControl(functions=caretSBF,
method="repeatedcv", number=10, repeats=1,
verbose=T, saveDetails = T)
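A small check we added to confirm that the edited summary function is the one stored in the control object:

identical(sbfCtrl$functions$summary, fiveStats)   # should be TRUE if the edit above took effect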
SBF + METRIC = "Accuracy"
set.seed(1)
sbf_acc <- sbf(Num_Pred, BrC1$Class,
sbfControl = sbfCtrl,
trControl = trCtrl, method="rf", metric="Accuracy")
## sbf_acc
sbf_acc
# Selection By Filter
#
# Outer resampling method: Cross-Validated (10 fold, repeated 1 times)
#
# Resampling performance:
#
# ROC Sens Spec Accuracy Kappa ROCSD SensSD SpecSD AccuracySD KappaSD
# 0.9931 0.973 0.9833 0.9766 0.949 0.006272 0.0231 0.02913 0.01226 0.02646
#
# Using the training set, 9 variables were selected:
# Cl.thickness, Cell.size, Cell.shape, Marg.adhesion, Epith.c.size...
#
# During resampling, the top 5 selected variables (out of a possible 9):
# Bare.nuclei (100%), Bl.cromatin (100%), Cell.shape (100%), Cell.size (100%), Cl.thickness (100%)
#
# On average, 9 variables were selected (min = 9, max = 9)
## Class of sbf_acc
class(sbf_acc)
# [1] "sbf"
## Names of elements of sbf_acc
names(sbf_acc)
# [1] "pred" "variables" "results" "fit" "optVariables"
# [6] "call" "control" "resample" "metrics" "times"
# [11] "resampledCM" "obsLevels" "dots"
## sbf_acc fit element
sbf_acc$fit
# Random Forest
#
# 683 samples
# 9 predictor
# 2 classes: 'benign', 'malignant'
#
# No pre-processing
# Resampling: Cross-Validated (10 fold, repeated 1 times)
# Summary of sample sizes: 615, 614, 614, 615, 615, 615, ...
# Resampling results across tuning parameters:
#
# mtry ROC Sens Spec Accuracy Kappa
# 2 0.9933176 0.9706566 0.9833333 0.9751492 0.9460717
# 5 0.9920034 0.9662121 0.9791667 0.9707801 0.9363708
# 9 0.9914825 0.9684343 0.9708333 0.9693308 0.9327662
#
# Accuracy was used to select the optimal model using the largest value.
# The final value used for the model was mtry = 2.
## Elements of sbf_acc fit
names(sbf_acc$fit)
# [1] "method" "modelInfo" "modelType" "results" "pred"
# [6] "bestTune" "call" "dots" "metric" "control"
# [11] "finalModel" "preProcess" "trainingData" "resample" "resampledCM"
# [16] "perfNames" "maximize" "yLimits" "times" "levels"
## sbf_acc fit final Model
sbf_acc$fit$finalModel
# Call:
# randomForest(x = x, y = y, mtry = param$mtry)
# Type of random forest: classification
# Number of trees: 500
# No. of variables tried at each split: 2
#
# OOB estimate of error rate: 2.34%
# Confusion matrix:
# benign malignant class.error
# benign 431 13 0.02927928
# malignant 3 236 0.01255230
## sbf_acc metric
sbf_acc$fit$metric
# [1] "Accuracy"
## sbf_acc fit best Tune
sbf_acc$fit$bestTune
# mtry
# 1 2
SBF + METRIC = "ROC"
set.seed(1)
sbf_roc <- sbf(Num_Pred, BrC1$Class,
sbfControl = sbfCtrl,
trControl = trCtrl, method="rf", metric="ROC")
## sbf_roc
sbf_roc
# Selection By Filter
#
# Outer resampling method: Cross-Validated (10 fold, repeated 1 times)
#
# Resampling performance:
#
# ROC Sens Spec Accuracy Kappa ROCSD SensSD SpecSD AccuracySD KappaSD
# 0.9931 0.973 0.9833 0.9766 0.949 0.006272 0.0231 0.02913 0.01226 0.02646
#
# Using the training set, 9 variables were selected:
# Cl.thickness, Cell.size, Cell.shape, Marg.adhesion, Epith.c.size...
#
# During resampling, the top 5 selected variables (out of a possible 9):
# Bare.nuclei (100%), Bl.cromatin (100%), Cell.shape (100%), Cell.size (100%), Cl.thickness (100%)
#
# On average, 9 variables were selected (min = 9, max = 9)
## Class of sbf_roc
class(sbf_roc)
# [1] "sbf"
## Names of elements of sbf_roc
names(sbf_roc)
# [1] "pred" "variables" "results" "fit" "optVariables"
# [6] "call" "control" "resample" "metrics" "times"
# [11] "resampledCM" "obsLevels" "dots"
## sbf_roc fit element
sbf_roc$fit
# Random Forest
#
# 683 samples
# 9 predictor
# 2 classes: 'benign', 'malignant'
#
# No pre-processing
# Resampling: Cross-Validated (10 fold, repeated 1 times)
# Summary of sample sizes: 615, 614, 614, 615, 615, 615, ...
# Resampling results across tuning parameters:
#
# mtry ROC Sens Spec Accuracy Kappa
# 2 0.9933176 0.9706566 0.9833333 0.9751492 0.9460717
# 5 0.9920034 0.9662121 0.9791667 0.9707801 0.9363708
# 9 0.9914825 0.9684343 0.9708333 0.9693308 0.9327662
#
# ROC was used to select the optimal model using the largest value.
# The final value used for the model was mtry = 2.
## Elements of sbf_roc fit
names(sbf_roc$fit)
# [1] "method" "modelInfo" "modelType" "results" "pred"
# [6] "bestTune" "call" "dots" "metric" "control"
# [11] "finalModel" "preProcess" "trainingData" "resample" "resampledCM"
# [16] "perfNames" "maximize" "yLimits" "times" "levels"
## sbf_roc fit final Model
sbf_roc$fit$finalModel
# Call:
# randomForest(x = x, y = y, mtry = param$mtry)
# Type of random forest: classification
# Number of trees: 500
# No. of variables tried at each split: 2
#
# OOB estimate of error rate: 2.34%
# Confusion matrix:
# benign malignant class.error
# benign 431 13 0.02927928
# malignant 3 236 0.01255230
## sbf_roc metric
sbf_roc$fit$metric
# [1] "ROC"
## sbf_roc fit best Tune
sbf_roc$fit$bestTune
# mtry
# 1 2
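For completeness (our own comparison, not shown above), the outer resampling summaries of the two sbf objects can be put side by side; as the printed summaries above suggest, they appear identical, which is what made us suspect metric is not applied at the sbf() level:

sbf_acc$results
sbf_roc$results
all.equal(sbf_acc$results, sbf_roc$results)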
Does sbf() use the metric argument for optimizing the model? If yes, what metric does sbf() use as the default? If sbf() does use the metric argument, how can it be set to ROC?
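For reference, here is a sketch of one idea we considered but are not sure about (an assumption on our part, not something we found documented, and it only affects the inner train() call, not the outer resampling): copy caretSBF and hard-code metric = "ROC" inside its fit element. With this variant, metric should not also be passed through sbf(), otherwise train() would receive the argument twice.

caretSBF_ROC <- caretSBF              # hypothetical copy of the (edited) caretSBF
caretSBF_ROC$fit <- function(x, y, ...) {
  # override: always optimize the inner train() call on ROC
  train(x, y, metric = "ROC", ...)
}
sbfCtrl_ROC <- sbfControl(functions = caretSBF_ROC,
                          method = "repeatedcv", number = 10, repeats = 1)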
Thanks.