br-test.arff
data. Train your algorithm on
the entire training set using your chosen parameter values, and
evaluate on the BR test set. Report your results in a table of the form:

C      kernel    kernel-params    validation-error
ccc    kkk       ppp              eee

where kkk is "polynomial" or "rbf" and ppp is the parameter value of the
kernel (exponent for polynomial and gamma for rbf). Include one line for
each combination of C, kernel, and kernel parameters that you tried.
You can run

java -Xmx200m -jar weka.jar

to request 200 megabytes of memory for the Java VM.

For J48, set the "unpruned" option to True. You can use the default
settings for all other parameters of J48, NaiveBayesSimple, and Logistic
Regression. Optional: rerun the experiments with pruning turned on and see
if it makes any difference.
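The parameter sweep above can be scripted from the command line. A dry-run sketch follows: it prints one command per (C, kernel, kernel-parameter) combination rather than executing them. The class and flag names assume a Weka 3.6+ release with weka.jar in the current directory, and the particular C, exponent, and gamma values are placeholders for whatever grid you choose; the sketch uses the BR train/test pair, so substitute your validation split as appropriate.

```shell
# Dry-run sketch: print one Weka SMO command per (C, kernel, kernel-param)
# combination. Remove the echo to actually run them. Class and flag names
# assume Weka 3.6+; the parameter grids below are illustrative only.
smo_grid() {
  for C in 0.1 1.0 10.0; do
    for E in 1 2 3; do            # polynomial exponents
      echo "java -Xmx200m -cp weka.jar weka.classifiers.functions.SMO -C $C" \
           "-K \"weka.classifiers.functions.supportVector.PolyKernel -E $E\"" \
           "-t br-train.arff -T br-test.arff"
    done
    for G in 0.01 0.1 1.0; do     # RBF gamma values
      echo "java -Xmx200m -cp weka.jar weka.classifiers.functions.SMO -C $C" \
           "-K \"weka.classifiers.functions.supportVector.RBFKernel -G $G\"" \
           "-t br-train.arff -T br-test.arff"
    done
  done
}

smo_grid
```

Running each printed command and recording the resulting validation error gives one row of the table per combination.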
In addition to running Bagging and AdaBoostM1, you should rerun a single decision tree, a single Naive Bayes, and a single logistic regression.
Run these experiments on the datasets hw2-1, hw2-2, and br.
However, we will not construct learning curves this time. Instead, you
should just train
on the following three files:
Domain    Training Data File    Test Data File
BR        br-train.arff         br-test.arff
hw2-1     hw2-1-200.arff        hw2-1-test.arff
hw2-2     hw2-2-200.arff        hw2-2-test.arff
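The nine runs per domain (each base learner alone, under Bagging, and under AdaBoostM1) can be sketched as a dry-run loop that prints the commands rather than executing them. Weka 3.x class names and the -I/-W meta-learner flags are assumed; drop the echo to actually run each line.

```shell
# Dry-run sketch: for one domain, print a command for each base learner run
# singly, under Bagging (-I 30 iterations), and under AdaBoostM1 (-I 30
# maximum iterations). Weka 3.x class names assumed; per the assignment,
# add -U to the J48 runs to keep the trees unpruned (for the meta-learners,
# options placed after "--" are passed to the base learner).
ensemble_grid() {
  train=$1
  test=$2
  for base in weka.classifiers.trees.J48 \
              weka.classifiers.functions.Logistic \
              weka.classifiers.bayes.NaiveBayesSimple; do
    echo "java -Xmx200m -cp weka.jar $base -t $train -T $test"
    for meta in weka.classifiers.meta.Bagging weka.classifiers.meta.AdaBoostM1; do
      echo "java -Xmx200m -cp weka.jar $meta -I 30 -W $base -t $train -T $test"
    done
  done
}

ensemble_grid hw2-1-200.arff hw2-1-test.arff
```

Calling ensemble_grid once per row of the table above covers all three domains.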
hw2-1:

Base learner    Single    Bagging    Boosting
J48             xxx       yyy        zzz
Logistic        xxx       yyy        zzz
NaiveBayes      xxx       yyy        zzz

hw2-2:

Base learner    Single    Bagging    Boosting
J48             xxx       yyy        zzz
Logistic        xxx       yyy        zzz
NaiveBayes      xxx       yyy        zzz

br:

Base learner    Single    Bagging    Boosting
J48             xxx       yyy        zzz
Logistic        xxx       yyy        zzz
NaiveBayes      xxx       yyy        zzz

where xxx gives the error rate of a single classifier of the indicated
base learner, yyy gives the error rate of Bagging (30 iterations), and
zzz gives the error rate of AdaBoostM1 (maximum 30 iterations).
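To fill in the xxx/yyy/zzz cells, the error rate can be scraped from Weka's saved text output, which reports a line of the form "Incorrectly Classified Instances  <count>  <percent> %" for each evaluation section. A small helper, assuming each run's output was redirected to a file:

```shell
# Print the error rate (in percent) from a saved Weka output file.
# Weka prints "Incorrectly Classified Instances  <count>  <percent> %"
# once per evaluation section; the last match is the test-set evaluation.
extract_error() {
  grep 'Incorrectly Classified Instances' "$1" | tail -n 1 | awk '{print $(NF-1)}'
}
```

For example, extract_error j48-hw2-1.txt would print the test-set error percentage for that run (the file name here is hypothetical).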