This article puts forward a novel smooth rotated hyperbola model for the support vector machine (RHSSVM) for classification. As is well known, the support vector machine (SVM) is based on statistical learning theory (SLT) and achieves high precision in data classification. However, its objective function is non-differentiable at the zero point, so fast algorithms cannot be used to train and test the SVM. To deal with this, the proposed method exploits the approximation property of a hyperbola to its asymptotic lines. Firstly, we describe the development of RHSSVM from the basic linear SVM optimization program. Then we extend the linear model to a non-linear model. We prove that the solution of RHSSVM is convergent, unique, and globally optimal, and we show how RHSSVM can be practically implemented. Finally, theoretical analysis shows that, compared with three other typical models, the rotated hyperbola model has the smallest error in approximating the plus function. Meanwhile, computer simulations show that RHSSVM can reduce computation time by up to 54.6% and can efficiently handle large-scale, high-dimensional problems.
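The idea of approximating the plus function x₊ = max(x, 0) by a hyperbola that hugs its two asymptotes (y = 0 and y = x) can be sketched as follows. This is an illustrative parameterisation, not the paper's exact formulation: the branch y = (x + √(x² + 4ε²))/2 solves y(y − x) = ε², a rotated hyperbola whose asymptotes are precisely the two half-lines of the plus function, with ε a hypothetical smoothing parameter.

```python
import math

def plus(x):
    """The plus function x_+ = max(x, 0); non-differentiable at x = 0."""
    return max(x, 0.0)

def rotated_hyperbola(x, eps):
    """Smooth surrogate for the plus function (illustrative sketch).

    The branch y = (x + sqrt(x^2 + 4*eps^2)) / 2 satisfies
    y * (y - x) = eps^2, a hyperbola whose asymptotes are y = 0 and
    y = x, i.e. the two pieces of the plus function. It is infinitely
    differentiable everywhere, and the gap to max(x, 0) is largest at
    x = 0, where it equals eps, shrinking to 0 as |x| grows.
    """
    return (x + math.sqrt(x * x + 4.0 * eps * eps)) / 2.0
```

Because the surrogate is smooth, gradient- and Newton-type fast solvers can be applied to the resulting unconstrained SVM objective; shrinking ε tightens the approximation.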
Support vector machines (SVMs) have shown remarkable success in many applications. However, the non-smoothness of the objective function limits the practical application of SVMs. To overcome this disadvantage, a twice continuously differentiable piecewise-smooth function is constructed to smooth the objective function of the unconstrained SVM, yielding a piecewise-smooth support vector machine (PWESSVM). Compared with other smooth approximation functions, the approximation accuracy is markedly improved. Theoretical analysis shows that PWESSVM is globally convergent. Numerical results and comparisons demonstrate that the classification performance of our algorithm is better than that of other competitive baselines.
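A twice continuously differentiable piecewise smoothing of the plus function can be illustrated with the following sketch. This is not the paper's exact construction, only a representative one under assumed boundary conditions: outside a band [−ε, ε] the surrogate coincides with max(x, 0); inside, a quartic polynomial is chosen so that the value, first derivative, and second derivative all match at both ends of the band, making the pieced-together function C².

```python
def pw_smooth_plus(x, eps):
    """C^2 piecewise-polynomial smoothing of max(x, 0) (illustrative).

    Outside [-eps, eps] it equals the plus function exactly. Inside,
    with u = (x + eps) / (2*eps) in [0, 1], the quartic
        p(x) = eps * u^3 * (2 - u)
    satisfies p(-eps) = p'(-eps) = p''(-eps) = 0 and
    p(eps) = eps, p'(eps) = 1, p''(eps) = 0, so value and the first
    two derivatives are continuous at both joints.
    """
    if x <= -eps:
        return 0.0
    if x >= eps:
        return x
    u = (x + eps) / (2.0 * eps)      # map [-eps, eps] onto [0, 1]
    return eps * u**3 * (2.0 - u)    # quartic transition piece
```

For this construction the worst-case gap to max(x, 0) occurs at x = 0 and equals 3ε/16, smaller than the gap ε of the rotated-hyperbola surrogate at the same ε, which illustrates how a piecewise polynomial can tighten approximation accuracy.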