Pruning Error Minimization in Least Squares Support Vector Machines


de Kruif, Bas J. and de Vries, Theo J.A. (2003) Pruning Error Minimization in Least Squares Support Vector Machines. IEEE Transactions on Neural Networks, 14 (3). pp. 696-702. ISSN 1045-9227

Full text: PDF (340 kB)
Abstract: The support vector machine (SVM) is a method for classification and function approximation. It commonly uses an ε-insensitive cost function, meaning that errors smaller than ε are not penalized. As an alternative, the least squares support vector machine (LS-SVM) uses a quadratic cost function. When the LS-SVM is used for function approximation, the solution obtained is not sparse. Sparseness is imposed by pruning, i.e., recursively solving the approximation problem and then omitting data points that had a small error in the previous pass. However, a small approximation error in the previous pass does not reliably predict how large the error will be once the sample has been omitted. This paper introduces a procedure that selects from the data set the training sample whose omission will introduce the smallest approximation error. It is shown that this pruning scheme outperforms the standard one.
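
To make the pruning loop described in the abstract concrete, the sketch below (Python with NumPy) trains an LS-SVM for function approximation by solving its linear system and then removes training samples one at a time. The RBF kernel, the hyperparameters gamma and sigma, and the estimate used for the error that omitting a sample would introduce (the support value divided by the corresponding diagonal entry of the inverse system matrix) are illustrative assumptions, not the paper's exact derivation; see the full text for the precise selection criterion.

import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X1 and X2.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM linear system
    #   [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:], A          # bias b, support values alpha, system matrix

def prune(X, y, n_keep, gamma=10.0, sigma=1.0, scheme="introduced_error"):
    # Recursively retrain and drop one sample per pass until n_keep remain.
    idx = np.arange(len(y))
    while len(idx) > n_keep:
        b, alpha, A = train_lssvm(X[idx], y[idx], gamma, sigma)
        if scheme == "standard":
            # Standard pruning: drop the sample with the smallest support
            # value, i.e. the smallest error in the current pass
            # (for the LS-SVM, e_i = alpha_i / gamma).
            cost = np.abs(alpha)
        else:
            # Sketch of the alternative idea: estimate the error that
            # omitting sample i would *introduce*. The expression
            # |alpha_i| / |[A^-1]_ii| is an assumption for illustration.
            Ainv = np.linalg.inv(A)
            cost = np.abs(alpha) / np.abs(np.diag(Ainv)[1:])
        idx = np.delete(idx, np.argmin(cost))
    return idx

Example use, on a noisy sine wave: X = np.linspace(0, 1, 50).reshape(-1, 1); y = np.sin(2 * np.pi * X).ravel() + 0.05 * np.random.randn(50); kept = prune(X, y, n_keep=15). The returned indices are the samples retained by the pruning loop; retraining on X[kept], y[kept] gives the sparse approximator.
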
Item Type: Article
Copyright: © 2003 IEEE
Faculty: Electrical Engineering, Mathematics and Computer Science (EEMCS)
Link to this item: http://purl.utwente.nl/publications/45431
Official URL: http://dx.doi.org/10.1109/TNN.2003.810597

 


Metis ID: 212279