Description

The Support Vector Machine (SVM) is an important technique for classification and regression. Although highly accurate, SVM classification becomes slower as the number of support vectors grows. This paper describes a method for reducing the number of support vectors through the application of Kernel PCA. The method differs from previously proposed approaches in that we show the exact choice of the reduced support vectors is not important, as long as the vectors span a fixed subspace. The method reduces the number of support vectors by up to 90% without any significant degradation in performance. We also propose a heuristic to determine the reducibility of an SVM.
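
The sketch below is a rough illustration of the general idea, not the paper's algorithm: it trains an RBF-kernel SVM with scikit-learn, uses Kernel PCA to project the data onto a low-dimensional subspace of the kernel feature space spanned by a small subset of the support vectors, and fits a linear classifier in that subspace. The dataset, kernel width, subset size, and component count are arbitrary choices made only for illustration.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import KernelPCA
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC, LinearSVC

    # Toy data and a full RBF-kernel SVM as the baseline.
    X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    gamma = 0.05  # assumed kernel width, shared by the SVM and the Kernel PCA
    svm = SVC(kernel="rbf", gamma=gamma).fit(X_tr, y_tr)
    print("full SVM: %d support vectors, test accuracy %.3f"
          % (svm.n_support_.sum(), svm.score(X_te, y_te)))

    # Keep roughly 10% of the support vectors; per the paper's claim, which ones
    # are kept matters little as long as they span a similar subspace in feature space.
    sv = X_tr[svm.support_]
    rng = np.random.default_rng(0)
    keep = rng.choice(len(sv), size=max(1, len(sv) // 10), replace=False)

    # Kernel PCA on the reduced set: the model below only needs kernel evaluations
    # against these few vectors at prediction time, not against every support vector.
    kpca = KernelPCA(n_components=min(20, len(keep)), kernel="rbf", gamma=gamma)
    Z_tr = kpca.fit(sv[keep]).transform(X_tr)
    clf = LinearSVC(max_iter=5000).fit(Z_tr, y_tr)
    print("reduced model: %d basis vectors, test accuracy %.3f"
          % (len(keep), clf.score(kpca.transform(X_te), y_te)))

How well the reduced model tracks the full SVM depends on the size of the retained subspace; the paper's heuristic for judging the reducibility of a given SVM is not reproduced in this sketch.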
