This paper presents a novel multiclass feature selection algorithm based on weighted conditional entropy, also referred to as uncertainty. The goal of the proposed algorithm is to select a feature subset such that, for each feature sample, there exists a feature in the selected subset that has a low uncertainty score. Features are first quantized into bins. The proposed feature selection method then computes an uncertainty vector from weighted conditional entropy; the lower the uncertainty score for a class, the better the separability of the samples in that class. Next, an iterative feature selection method selects one feature per iteration by (1) computing the minimum uncertainty score for each feature sample over all candidate feature subsets, (2) averaging these per-sample minimum uncertainty scores across all feature samples, and (3) selecting the feature that minimizes this mean of the minimum uncertainty scores. The experimental results show that the proposed algorithm outperforms mRMR, achieving lower misclassification rates on various publicly available datasets. In most cases, the number of features needed to reach a specified misclassification error is smaller than that required by traditional methods. Across all datasets, the misclassification error is reduced by 5∼25% on average compared to a traditional method.
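The greedy selection loop described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exact weighted-conditional-entropy formula is not given in the abstract, so here the per-sample uncertainty of a feature is assumed to be the class entropy of the quantization bin the sample falls into, and all function names (`quantize`, `sample_uncertainty`, `select_features`) are hypothetical.

```python
import numpy as np

def quantize(X, n_bins=5):
    """Quantize each feature column into equal-width bins."""
    Xq = np.empty_like(X, dtype=int)
    for j in range(X.shape[1]):
        edges = np.linspace(X[:, j].min(), X[:, j].max(), n_bins + 1)
        Xq[:, j] = np.clip(np.digitize(X[:, j], edges[1:-1]), 0, n_bins - 1)
    return Xq

def sample_uncertainty(Xq, y):
    """U[i, j] = entropy of the class distribution inside the bin of
    feature j containing sample i (one plausible per-sample score;
    the paper's weighted conditional entropy may differ)."""
    n, d = Xq.shape
    classes = np.unique(y)
    U = np.empty((n, d))
    for j in range(d):
        for b in np.unique(Xq[:, j]):
            mask = Xq[:, j] == b
            p = np.array([(y[mask] == c).mean() for c in classes])
            p = p[p > 0]
            U[mask, j] = -(p * np.log2(p)).sum()
    return U

def select_features(X, y, k, n_bins=5):
    """Greedy loop from the abstract: each iteration adds the feature
    that minimizes the mean (over samples) of the per-sample minimum
    uncertainty over the selected subset."""
    U = sample_uncertainty(quantize(X, n_bins), y)
    selected, remaining = [], list(range(X.shape[1]))
    best_min = np.full(X.shape[0], np.inf)  # running per-sample minimum
    for _ in range(k):
        scores = [np.minimum(best_min, U[:, f]).mean() for f in remaining]
        f = remaining[int(np.argmin(scores))]
        best_min = np.minimum(best_min, U[:, f])
        selected.append(f)
        remaining.remove(f)
    return selected
```

On a toy two-feature problem where one feature separates the classes and the other is noise, the separating feature yields near-zero bin entropy and is picked first.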
Original language: English (US)
Number of pages: 14
Journal: Journal of Signal Processing Systems
State: Published - Jan 1 2020
Bibliographical note: Publisher Copyright:
© 2019, Springer Science+Business Media, LLC, part of Springer Nature.
Keywords:
- Feature selection
- Multi-class classification
- Mutual information
- Uncertainty score
- Weighted conditional entropy