Topic 15 Support Vector Machines (Part 2)
15.1 Exercises
You can download a template RMarkdown file to start from here.
NOTE: completing these exercises (both Part 1 and Part 2) is your Homework 6, due Wednesday, April 3 at midnight.
SVMs work well in high-dimensional settings but can be computationally intensive to fit. We'll look at a small illustrative dataset to get familiar with how to train SVMs in caret. Read in the data as below; you'll also need to install the kernlab package before starting.
```r
dat <- read.csv("https://www.dropbox.com/s/abft0f9brohsmvd/svm_example.csv?dl=1")
```

Data exploration
Make a visualization of both predictor variables and the class label by coloring points according to their class.
```r
ggplot(dat, aes(x = ???, y = ???, ???)) + geom_point()
```

Would you expect a support vector classifier or a support vector machine to perform better for this data? Why?
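As a rough sketch of how the template might be completed: the actual column names in `svm_example.csv` aren't shown here, so `x1`, `x2`, and `y` below are assumptions (check `names(dat)` first).

```r
library(ggplot2)

# Hypothetical column names: x1 and x2 are the two predictors,
# y is the class label (mapped to point color)
ggplot(dat, aes(x = x1, y = x2, color = y)) +
    geom_point()
```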
Support vector classifier
We can fit an ordinary support vector classifier using `method = "svmLinear"` in `caret`. The `C` parameter in the tuning grid is the cost associated with a violation of the margin.

```r
svm_mod_linear <- train(
    y ~ ., data = dat,
    method = "svmLinear",
    metric = "Accuracy",
    tuneGrid = data.frame(C = c(0.25, 0.5, 1, 2, 4, 8, 16, 32, 64, 128)),
    trControl = trainControl(method = "cv", number = 10)
)
```

- Write a few sentences describing the modeling process that this code performs, as if for the Methods section of a scientific paper. You don't need to describe how SVMs work, but do describe what models you fit, how they were evaluated, and how you chose a final model. (This is practice for writing up your final project reports.)
- Print the `svm_mod_linear` object to display a summary of model performance. What do you notice about performance, and why might this be the case given your visualization?
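A brief sketch of caret's standard ways to inspect a fitted `train` object (the printed values will depend on your CV folds):

```r
svm_mod_linear            # prints resampled accuracy for each candidate C
svm_mod_linear$bestTune   # the value of C selected by cross-validation
svm_mod_linear$results    # full table of accuracy and kappa across the grid
plot(svm_mod_linear)      # accuracy as a function of the cost parameter
```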
Support vector machine
A common choice for the kernel function in SVMs is the radial basis function. We can fit such an SVM with `method = "svmRadialCost"` in `caret`. There is a \(\gamma\) parameter within the radial kernel that we could tune, but we would also want to tune the cost `C` as well. It can be hard to tune two parameters at once, so `method = "svmRadialCost"` uses a default rule to first pick a good \(\gamma\) and leaves us to tune the cost `C`.

```r
svm_mod_radial <- train(
    y ~ ., data = dat,
    method = "svmRadialCost",
    metric = "Accuracy",
    tuneGrid = data.frame(C = c(0.25, 0.5, 1, 2, 4, 8, 16, 32, 64, 128)),
    trControl = trainControl(method = "cv", number = 10)
)
```

- Like in the last exercise, write a few sentences describing the modeling process that this code performs, as if for the Methods section of a scientific paper.
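If you did want to tune \(\gamma\) (called `sigma` in kernlab) jointly with `C` rather than rely on the default rule, caret's `"svmRadial"` method exposes both parameters. A sketch, where the `sigma` values are purely illustrative (kernlab's `sigest()` can suggest a sensible range):

```r
library(caret)

# Jointly tune sigma (the RBF gamma) and cost C over a 2-D grid;
# these grid values are illustrative assumptions, not recommendations
svm_mod_both <- train(
    y ~ ., data = dat,
    method = "svmRadial",
    metric = "Accuracy",
    tuneGrid = expand.grid(sigma = c(0.25, 0.5, 1, 2),
                           C     = c(0.25, 1, 4, 16, 64)),
    trControl = trainControl(method = "cv", number = 10)
)
```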
- Print the `svm_mod_radial` object to display a summary of model performance. How does performance compare to the support vector classifier?
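One way to put the two models side by side is caret's `resamples()` utility. This is a sketch; for a strictly fair comparison, both models should be evaluated on the same CV folds (e.g., by calling `set.seed()` with the same value before each `train()`, or passing a shared `index` to `trainControl()`).

```r
library(caret)

# Collect the cross-validation results from both fitted models
comparison <- resamples(list(linear = svm_mod_linear,
                             radial = svm_mod_radial))
summary(comparison)   # distribution of accuracy and kappa per model
```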