An Online Algorithm for Support Vector Machine Based on Subgradient Projection

Hua Liu, Zonghai Sun

Abstract


This paper presents an online algorithm for support vector regression based on subgradient projection in reproducing kernel Hilbert spaces. Firstly, the paper chooses the distance between f and the intersection of the sets S_k to characterize the empirical risk of support vector machine learning, and formulates a new sequential expression of support vector regression. Based on optimization theory, the paper obtains the optimal solution of this new formulation by using the feasibility gap, i.e. the difference between the primal and dual objective function values, to characterize the optimality of the support vector machine solution. Secondly, the paper explains, in terms of set theory and projection theory, the selection of the subgradient, to which the convergence of the algorithm is sensitive. Finally, in comparison with the projection adaptive natural gradient algorithm (PANG), simulations of the Mackey-Glass (MG) system and a class of nonlinear control systems verify that the accuracies of the two algorithms are similar, while the adaptive projected subgradient algorithm (APSA) trains much faster than PANG.
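To give a concrete picture of the adaptive projected subgradient idea described above, the sketch below shows an online kernel regression update that projects the current estimate toward the constraint set defined by an epsilon-insensitive tube around each new sample. It is a minimal illustration under assumed choices (Gaussian kernel, fixed step size, unbounded dictionary growth); all names and parameters are illustrative and it is not the paper's exact algorithm.

```python
import numpy as np

# Minimal sketch of an adaptive projected subgradient step for online
# kernel regression with an epsilon-insensitive loss. The kernel choice,
# step size, and class design are assumptions for illustration only.

def gaussian_kernel(x, y, gamma=1.0):
    return np.exp(-gamma * np.sum((x - y) ** 2))

class OnlineKernelAPSM:
    def __init__(self, epsilon=0.1, gamma=1.0, step=1.0):
        self.epsilon = epsilon   # half-width of the epsilon-insensitive tube
        self.gamma = gamma       # Gaussian kernel width
        self.step = step         # relaxation / step-size parameter in (0, 2)
        self.centers = []        # stored input vectors (kernel dictionary)
        self.coeffs = []         # kernel expansion coefficients

    def predict(self, x):
        # f(x) = sum_i alpha_i * k(c_i, x)
        return sum(a * gaussian_kernel(c, x, self.gamma)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, y):
        # Projection-style subgradient step toward the set
        # S_t = { f : |f(x_t) - y_t| <= epsilon }.
        residual = self.predict(x) - y
        violation = abs(residual) - self.epsilon
        if violation <= 0.0:
            return  # f already lies in S_t, no update needed
        # A subgradient of max(|f(x_t) - y_t| - eps, 0) in the RKHS is
        # sign(residual) * k(x_t, .), whose squared norm is k(x_t, x_t).
        k_xx = gaussian_kernel(x, x, self.gamma)
        alpha = -self.step * np.sign(residual) * violation / k_xx
        self.centers.append(np.asarray(x, dtype=float))
        self.coeffs.append(alpha)

# Toy usage on a noisy sine wave (purely illustrative data).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    model = OnlineKernelAPSM(epsilon=0.05, gamma=5.0, step=1.0)
    for _ in range(200):
        x = rng.uniform(0.0, 2.0 * np.pi, size=1)
        y = np.sin(x[0]) + 0.01 * rng.standard_normal()
        model.update(x, y)
    print("prediction at pi/2:", model.predict(np.array([np.pi / 2])))
```

With step = 1, each update is the exact projection of the current estimate onto the tube constraint for the newest sample, which mirrors the role projections onto the sets S_k play in the formulation above.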
