TY - GEN
T1 - On Dynamic Soft Dimension Reduction in Evolving Fuzzy Classifiers
AU - Lughofer, Edwin
PY - 2010/7
Y1 - 2010/7
AB - This paper deals with the problem of dynamic dimension reduction during the on-line update and evolution of fuzzy classifiers. By 'dynamic' we mean that the importance of features for discriminating between the classes changes over time as new data is fed into the classifiers' update mechanisms. To avoid discontinuities in the incremental learning process, i.e. the permanent exchange of features in the input structure of the fuzzy classifiers, we include feature weights (lying in [0,1]) in the training and update of the fuzzy classifiers; these weights measure the importance levels of the various features and can be updated smoothly with new incoming samples. When some weights become (approximately) 0, the corresponding features are automatically switched off, achieving a (soft) dimension reduction. The approaches for incrementally updating the feature weights are based on a leave-one-feature-out criterion and on a feature-wise separability criterion. We describe how the feature weights are integrated into evolving fuzzy classifiers using single- and multi-model architectures. The whole approach is evaluated on high-dimensional on-line real-world classification scenarios.
UR - https://www.scopus.com/pages/publications/77954874430
DO - 10.1007/978-3-642-14049-5_9
M3 - Conference proceedings
SN - 3642140483
SN - 9783642140488
VL - 6178
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 79
EP - 88
BT - Computational Intelligence for Knowledge-Based Systems Design - 13th International Conference on Information Processing and Management of Uncertainty, IPMU 2010, Proceedings
A2 - Hüllermeier
A2 - Kruse
A2 - Hoffmann
PB - Springer Verlag
ER -