hi-RF: Incremental Learning Random Forest for Large-Scale Multi-class Data Classification
- DOI
- 10.2991/aiie-16.2016.72
- Keywords
- large-scale multi-class classification; incremental learning; random forest; heterogeneous incremental Nearest Class Mean Random Forest
- Abstract
In recent years, dynamically growing data has made large-scale multi-class classification an active research topic. Most traditional methods struggle to balance precision and computational burden as the data and its number of classes grow: some methods suffer from weak precision, while others are time-consuming. In this paper, we propose an incremental learning method, namely heterogeneous incremental Nearest Class Mean Random Forest (hi-RF), to handle this issue. It is a heterogeneous method that adaptively either replaces trees or updates tree leaves in the random forest when data of new classes arrive, reducing computational time while maintaining comparable performance. Specifically, to preserve accuracy, a proportion of trees are replaced by new NCM decision trees; to reduce the computational load, only the leaf probabilities of the remaining trees are updated. Moreover, out-of-bag estimation and out-of-bag boosting are proposed to balance accuracy and computational efficiency. Fair experiments were conducted and demonstrate comparable precision with much less computational time.
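The abstract's core idea (replace a fraction of trees, refresh the leaf class probabilities of the rest when new classes arrive) can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: plain CART trees from scikit-learn stand in for the paper's NCM decision trees, the class `HiRFSketch` and the parameter `replace_fraction` are hypothetical names, and the out-of-bag estimation/boosting steps are omitted.

```python
# Hypothetical sketch of the hi-RF update scheme described in the abstract.
# Assumptions: CART trees stand in for NCM decision trees; replace_fraction
# is an illustrative parameter, not a value from the paper.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


class HiRFSketch:
    def __init__(self, n_trees=20, replace_fraction=0.3, random_state=0):
        self.n_trees = n_trees
        self.replace_fraction = replace_fraction
        self.rng = np.random.RandomState(random_state)
        self.trees = []        # fitted trees
        self.leaf_counts = []  # per tree: {leaf_id: class-count vector}
        self.n_classes = 0

    def _bootstrap(self, X, y):
        idx = self.rng.randint(0, len(X), len(X))
        return X[idx], y[idx]

    def _count_leaves(self, tree, X, y):
        # Class-count vector per leaf, used as the leaf's class distribution.
        counts = {}
        for leaf, label in zip(tree.apply(X), y):
            vec = counts.setdefault(leaf, np.zeros(self.n_classes))
            vec[int(label)] += 1
        return counts

    def fit(self, X, y):
        self.n_classes = int(y.max()) + 1
        self.trees, self.leaf_counts = [], []
        for _ in range(self.n_trees):
            Xb, yb = self._bootstrap(X, y)
            tree = DecisionTreeClassifier(max_features="sqrt").fit(Xb, yb)
            self.trees.append(tree)
            self.leaf_counts.append(self._count_leaves(tree, X, y))
        return self

    def add_classes(self, X_new, y_new, X_old, y_old):
        # Absorb data of new classes: rebuild a fraction of the trees,
        # only refresh leaf probabilities for the remaining trees.
        self.n_classes = max(self.n_classes, int(y_new.max()) + 1)
        X_all = np.vstack([X_old, X_new])
        y_all = np.concatenate([y_old, y_new])
        n_replace = int(self.replace_fraction * self.n_trees)
        replace_ids = set(self.rng.choice(self.n_trees, n_replace, replace=False))
        for i in range(self.n_trees):
            if i in replace_ids:
                Xb, yb = self._bootstrap(X_all, y_all)
                self.trees[i] = DecisionTreeClassifier(max_features="sqrt").fit(Xb, yb)
            self.leaf_counts[i] = self._count_leaves(self.trees[i], X_all, y_all)
        return self

    def predict(self, X):
        # Average the normalized leaf class distributions across trees.
        votes = np.zeros((len(X), self.n_classes))
        for tree, counts in zip(self.trees, self.leaf_counts):
            for row, leaf in enumerate(tree.apply(X)):
                vec = counts.get(leaf)
                if vec is not None and vec.sum() > 0:
                    votes[row] += vec / vec.sum()
        return votes.argmax(axis=1)
```

Because untouched trees only get new leaf statistics rather than being regrown, the incremental step costs roughly `replace_fraction` of a full retrain while still letting every tree vote for the newly added classes, which is the trade-off the abstract describes.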
- Copyright
- © 2016, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
- Cite this article
TY - CONF
AU - Tingting Xie
AU - Changjian Wang
AU - Yuxing Peng
PY - 2016/11
DA - 2016/11
TI - hi-RF: Incremental Learning Random Forest for Large-Scale Multi-class Data Classification
BT - Proceedings of the 2016 2nd International Conference on Artificial Intelligence and Industrial Engineering (AIIE 2016)
PB - Atlantis Press
SP - 312
EP - 321
SN - 1951-6851
UR - https://doi.org/10.2991/aiie-16.2016.72
DO - 10.2991/aiie-16.2016.72
ID - Xie2016/11
ER -