Mini-batch Quasi-Newton optimization for Large Scale Linear Support Vector Regression
- DOI
- 10.2991/icmmcce-15.2015.503
- Keywords
- stochastic, optimization, linear support vector regression, quasi-Newton.
- Abstract
Linear support vector regression (SVR) is a popular machine learning algorithm. However, as the amount of data increases, the training procedure of SVR becomes time-consuming. In this paper, we propose a mini-batch quasi-Newton optimization algorithm to speed up the training process of linear SVR. The main idea of the proposed method is to use a small subset of the training data to estimate first- and second-order gradient information and to incorporate these estimates into the framework of the popular limited-memory BFGS (L-BFGS) quasi-Newton algorithm. We modify the generation of the correction pairs in the L-BFGS update so that sampling noise is not introduced into the curvature estimates. Experimental results show that the proposed method outperforms several state-of-the-art methods in both training time and generalization ability.
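The abstract does not give the algorithm's details, so the following is only an illustrative sketch of the general idea it describes: an L-BFGS-style update driven by mini-batch gradients, where each correction pair is formed from two gradients evaluated on the *same* mini-batch so that the gradient difference is not contaminated by cross-batch sampling noise (a common technique in stochastic quasi-Newton methods, not necessarily the authors' exact modification). The loss is a differentiable L2-regularized squared epsilon-insensitive SVR objective; all function names and hyperparameters are our own choices for illustration.

```python
import numpy as np

def svr_loss(w, X, y, C=1.0, eps=0.1):
    # L2-regularized squared epsilon-insensitive loss (an assumed SVR variant)
    r = np.maximum(np.abs(X @ w - y) - eps, 0.0)
    return 0.5 * w @ w + C * np.mean(r ** 2)

def svr_grad(w, X, y, C=1.0, eps=0.1):
    # Gradient of svr_loss on the mini-batch (X, y)
    r = X @ w - y
    excess = np.sign(r) * np.maximum(np.abs(r) - eps, 0.0)
    return w + 2.0 * C * (X.T @ excess) / len(y)

def two_loop(g, s_list, y_list):
    # Standard L-BFGS two-loop recursion: approximates H_k @ g
    q = g.copy()
    alphas = []
    for s, yv in zip(reversed(s_list), reversed(y_list)):
        a = (s @ q) / (yv @ s)
        alphas.append(a)
        q -= a * yv
    if s_list:  # scale by gamma_k = s'y / y'y (initial Hessian guess)
        s, yv = s_list[-1], y_list[-1]
        q *= (s @ yv) / (yv @ yv)
    for (s, yv), a in zip(zip(s_list, y_list), reversed(alphas)):
        b = (yv @ q) / (yv @ s)
        q += (a - b) * s
    return q

def minibatch_lbfgs_svr(X, y, batch=64, m=10, lr=0.1, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    s_list, y_list = [], []
    for _ in range(iters):
        idx = rng.choice(n, size=batch, replace=False)
        Xb, yb = X[idx], y[idx]
        g = svr_grad(w, Xb, yb)
        w_new = w - lr * two_loop(g, s_list, y_list)
        # Correction pair: both gradients use the SAME mini-batch, so the
        # difference y_k reflects curvature rather than sampling noise.
        y_k = svr_grad(w_new, Xb, yb) - g
        s_k = w_new - w
        if s_k @ y_k > 1e-10:  # curvature condition keeps H_k positive definite
            s_list.append(s_k)
            y_list.append(y_k)
            if len(s_list) > m:  # limited memory: keep only the last m pairs
                s_list.pop(0)
                y_list.pop(0)
        w = w_new
    return w
```

Each iteration costs O(batch * d + m * d), so memory stays at m vector pairs regardless of dataset size, which is the point of using the limited-memory variant at scale.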
- Copyright
- © 2015, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY  - CONF
AU  - Xin Xie
AU  - Chao Chen
AU  - Zhijian Chen
PY  - 2015/12
DA  - 2015/12
TI  - Mini-batch Quasi-Newton optimization for Large Scale Linear Support Vector Regression
BT  - Proceedings of the 4th International Conference on Mechatronics, Materials, Chemistry and Computer Engineering 2015
PB  - Atlantis Press
SN  - 2352-538X
UR  - https://doi.org/10.2991/icmmcce-15.2015.503
DO  - 10.2991/icmmcce-15.2015.503
ID  - Xie2015/12
ER  -