On Dynamic Soft Dimension Reduction in Evolving Fuzzy Classifiers

  • Edwin Lughofer (Speaker)

Activity: Talk or presentation (Contributed talk)

Description

This paper deals with the problem of dynamic dimension reduction during the on-line update and evolution of fuzzy classifiers. 'Dynamic' means that the importance of features for discriminating between the classes changes over time as new data is fed into the classifiers' update mechanisms. To avoid discontinuities in the incremental learning process, i.e. permanently exchanging features in the input structure of the fuzzy classifiers, we include feature weights (lying in [0,1]) in the training and update of the fuzzy classifiers; these weights measure the importance of the individual features and can be smoothly updated with new incoming samples. When a weight becomes (approximately) 0, the corresponding feature is automatically switched off, so a (soft) dimension reduction is achieved. The approaches for incrementally updating the feature weights are based on a leave-one-feature-out criterion and on a feature-wise separability criterion. We describe how the feature weights are integrated into evolving fuzzy classifiers with single-model and multi-model architectures. The whole approach is evaluated on high-dimensional on-line real-world classification scenarios.
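
As an illustration of the feature-wise separability idea, the sketch below maintains a per-feature, Fisher-like separability score (between-class scatter over within-class scatter) from streaming samples and normalizes it to weights in [0,1]. The class name IncrementalFeatureWeights, its update/weights methods, and the max-normalization are illustrative assumptions for this sketch, not the exact formulation or weighting scheme used in the paper.

```python
import numpy as np

class IncrementalFeatureWeights:
    """Sketch: per-feature separability tracked incrementally from a data stream.

    For each feature j, a Fisher-like score
        sep_j = between-class scatter_j / (within-class scatter_j + eps)
    is maintained from running per-class means and variances, then normalized
    to feature weights in [0, 1] by dividing by the current maximum score.
    """

    def __init__(self, n_features, eps=1e-12):
        self.n_features = n_features
        self.eps = eps
        self.counts = {}   # class label -> number of samples seen
        self.means = {}    # class label -> running per-feature mean
        self.m2 = {}       # class label -> running per-feature sum of squared deviations

    def update(self, x, y):
        """Incorporate one sample x (length n_features) with class label y."""
        x = np.asarray(x, dtype=float)
        if y not in self.counts:
            self.counts[y] = 0
            self.means[y] = np.zeros(self.n_features)
            self.m2[y] = np.zeros(self.n_features)
        # Welford-style online update of the per-class mean and scatter
        self.counts[y] += 1
        delta = x - self.means[y]
        self.means[y] += delta / self.counts[y]
        self.m2[y] += delta * (x - self.means[y])

    def weights(self):
        """Return feature weights in [0, 1]; a weight near 0 means the feature is (softly) switched off."""
        n_total = sum(self.counts.values())
        overall_mean = sum(c * self.means[k] for k, c in self.counts.items()) / n_total
        between = np.zeros(self.n_features)
        within = np.zeros(self.n_features)
        for k, c in self.counts.items():
            between += c * (self.means[k] - overall_mean) ** 2
            within += self.m2[k]
        sep = between / (within + self.eps)
        return sep / (sep.max() + self.eps)
```

In an evolving fuzzy classifier, such weights would typically scale each feature's contribution to the distance or membership computations, so that features whose weight drifts towards 0 are effectively switched off without abruptly restructuring the rule base.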
Period: 30 Jun 2010
Event title: 13th International Conference on Information Processing and Management of Uncertainty, IPMU 2010
Event type: Conference
Location: Germany

Fields of science

  • 101013 Mathematical logic
  • 202027 Mechatronics
  • 101029 Mathematical statistics
  • 102001 Artificial intelligence
  • 102003 Image processing