Comparison of Boosting and SVM


Abstract

We compare two popular algorithms from current machine learning and statistical learning research: the boosting method, represented by AdaBoost, and the kernel-based support vector machine (SVM), using 13 real data sets. The comparative study shows that boosting attains smaller prediction error on data with heavy noise, whereas SVM attains smaller prediction error on data with little noise.
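As a rough illustration of the kind of comparison described above (not the paper's actual experimental setup, data sets, or tuning), the sketch below contrasts the test error of AdaBoost and an RBF-kernel SVM on a synthetic classification problem with label noise, using scikit-learn; the data generator, noise level, and hyperparameters are assumptions chosen for illustration only.

```python
# Illustrative sketch (assumed setup, not the paper's experiment):
# compare prediction (test) error of AdaBoost and an RBF-kernel SVM
# on a synthetic binary classification problem with label noise.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import AdaBoostClassifier
from sklearn.svm import SVC

# Synthetic data; flip_y sets the fraction of randomly flipped labels,
# i.e. the amount of label noise injected into the problem.
X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=10, flip_y=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0)

models = {
    "AdaBoost": AdaBoostClassifier(n_estimators=200, random_state=0),
    "SVM (RBF)": SVC(kernel="rbf", C=1.0, gamma="scale"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    error = 1.0 - model.score(X_test, y_test)  # misclassification rate on the test set
    print(f"{name}: test error = {error:.3f}")
```

Rerunning the sketch with a smaller or larger `flip_y` gives a crude sense of how the relative ranking of the two methods can change with the noise level, which is the kind of contrast the study reports on real data sets.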

Keywords
