ε-SSVR: A smooth support vector machine for ε-insensitive regression

Yuh-Jye Lee*, Wen Feng Hsieh, Chien Ming Huang

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

137 Scopus citations

Abstract

A new smoothing strategy for solving ε-support vector regression (ε-SVR), which tolerates a small error in fitting a given data set linearly or nonlinearly, is proposed in this paper. Conventionally, ε-SVR is formulated as a constrained minimization problem, namely, a convex quadratic programming problem. We apply the smoothing techniques that have been used for solving the support vector machine for classification to replace the ε-insensitive loss function by an accurate smooth approximation. This allows us to solve ε-SVR directly as an unconstrained minimization problem. We term this reformulated problem ε-smooth support vector regression (ε-SSVR). We also prescribe a Newton-Armijo algorithm, which has been shown to converge globally and quadratically, to solve ε-SSVR. To handle nonlinear regression with a massive data set, we also introduce the reduced kernel technique to avoid the computational difficulty of dealing with a huge, fully dense kernel matrix. Numerical results and comparisons are given to demonstrate the effectiveness and speed of the algorithm.
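The smoothing idea described in the abstract can be illustrated with a short sketch. This is not the authors' code: it assumes the standard smooth approximation of the plus function, p(x, α) = x + (1/α)·log(1 + exp(−αx)), which tends to max(x, 0) as α grows, and applies it to each plus function inside the ε-insensitive loss |r|_ε = max(|r| − ε, 0). (The paper itself works with a squared, twice-differentiable variant so that Newton's method applies; the sketch below shows only the basic smoothing step.)

```python
import numpy as np

def plus(x):
    """The plus function (x)_+ = max(x, 0)."""
    return np.maximum(x, 0.0)

def smooth_plus(x, alpha):
    """Smooth approximation of (x)_+: x + log(1 + exp(-alpha*x)) / alpha.
    Accuracy improves as the smoothing parameter alpha increases."""
    return x + np.log1p(np.exp(-alpha * x)) / alpha

def eps_insensitive(r, eps):
    """Exact epsilon-insensitive loss, written via two plus functions:
    |r|_eps = (r - eps)_+ + (-r - eps)_+ = max(|r| - eps, 0)."""
    return plus(r - eps) + plus(-r - eps)

def smooth_eps_insensitive(r, eps, alpha):
    """Smooth surrogate: replace each plus function by its smooth version,
    giving a differentiable, unconstrained objective term."""
    return smooth_plus(r - eps, alpha) + smooth_plus(-r - eps, alpha)

# Compare the exact loss and its smooth approximation on a few residuals.
r = np.linspace(-2.0, 2.0, 5)
print(eps_insensitive(r, eps=0.5))
print(smooth_eps_insensitive(r, eps=0.5, alpha=10.0))
```

The smooth surrogate always lies slightly above the exact loss and approaches it as α grows, which is what makes the unconstrained reformulation an accurate stand-in for the original problem.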

Original language: English
Pages (from-to): 678-685
Number of pages: 8
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 17
Issue number: 5
DOIs
State: Published - 1 May 2005

Keywords

  • Kernel method
  • Newton-Armijo algorithm
  • Support vector machine
  • ε-insensitive loss function
  • ε-smooth support vector regression
