Fuzzy Mutual Information Based min-Redundancy and Max-Relevance Heterogeneous Feature Selection
- DOI
- 10.2991/ijcis.2011.4.4.18
- Keywords
- Feature selection; fuzzy mutual information; redundancy; relevance; stability
- Abstract
Feature selection is an important preprocessing step in pattern classification and machine learning, and mutual information is widely used to measure the relevance between features and the decision. However, it is difficult to directly compute this relevance for continuous or fuzzy features with mutual information. In this paper we introduce fuzzy information entropy and fuzzy mutual information for computing the relevance between numerical or fuzzy features and the decision. The relationship between fuzzy information entropy and differential entropy is also discussed. Moreover, we combine fuzzy mutual information with the "min-Redundancy-Max-Relevance", "Max-Dependency" and "min-Redundancy-Max-Dependency" algorithms. The performance and stability of the proposed algorithms are tested on benchmark data sets. Experimental results show that the proposed algorithms are effective and stable.
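To make the idea in the abstract concrete, the following is a minimal sketch of fuzzy-mutual-information-based min-Redundancy-Max-Relevance selection. It assumes a Gaussian fuzzy similarity relation per feature and the fuzzy-relation entropy form H = -(1/n) Σ log2(|[x_i]|/n); the paper's exact definitions of the similarity relation, the joint relation, and the selection criterion may differ, so treat this only as an illustration of the general scheme, not the authors' implementation.

```python
import numpy as np

def fuzzy_similarity(x, sigma=0.2):
    # Gaussian fuzzy similarity relation between samples on one numerical
    # feature (an assumed kernel; the paper's relation may differ).
    d = np.abs(x[:, None] - x[None, :])
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

def fuzzy_entropy(R):
    # Fuzzy information entropy of a fuzzy relation R:
    # H = -(1/n) * sum_i log2(|[x_i]| / n), with |[x_i]| the fuzzy
    # cardinality of sample i's fuzzy neighborhood (row sum of R).
    n = R.shape[0]
    card = R.sum(axis=1)
    return -np.mean(np.log2(card / n))

def fuzzy_mutual_information(Ra, Rb):
    # MI(A;B) = H(A) + H(B) - H(A,B); the joint relation is taken as the
    # element-wise minimum of the two fuzzy relations.
    Rab = np.minimum(Ra, Rb)
    return fuzzy_entropy(Ra) + fuzzy_entropy(Rb) - fuzzy_entropy(Rab)

def mrmr_select(X, y, k):
    # Greedy min-Redundancy-Max-Relevance: at each step pick the feature
    # maximizing (relevance to decision) - (mean redundancy to selected).
    n_feat = X.shape[1]
    Rs = [fuzzy_similarity(X[:, j]) for j in range(n_feat)]
    Ry = (y[:, None] == y[None, :]).astype(float)  # crisp decision relation
    relevance = [fuzzy_mutual_information(R, Ry) for R in Rs]
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_feat):
            if j in selected:
                continue
            red = np.mean([fuzzy_mutual_information(Rs[j], Rs[s])
                           for s in selected])
            score = relevance[j] - red
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```

Because the decision is crisp, its relation Ry is a 0/1 equivalence matrix, so the same entropy formula covers both numerical features and the class labels; replacing `fuzzy_similarity` with such an indicator relation recovers ordinary discrete selection.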
- Copyright
- © 2011, the Authors. Published by Atlantis Press.
- Open Access
- This is an open access article distributed under the CC BY-NC license (http://creativecommons.org/licenses/by-nc/4.0/).
Cite this article
TY - JOUR
AU - Daren Yu
AU - Shuang An
AU - Qinghua Hu
PY - 2011
DA - 2011/06/01
TI - Fuzzy Mutual Information Based min-Redundancy and Max-Relevance Heterogeneous Feature Selection
JO - International Journal of Computational Intelligence Systems
SP - 619
EP - 633
VL - 4
IS - 4
SN - 1875-6883
UR - https://doi.org/10.2991/ijcis.2011.4.4.18
DO - 10.2991/ijcis.2011.4.4.18
ID - Yu2011
ER -