Abstract
In this paper, we present a novel data-driven learning method (SparseFIS) for Takagi-Sugeno fuzzy systems, extended to include rule weights.
Our learning method consists of three phases:
the first phase conducts a clustering process in the input/output feature space using
iterative vector quantization and projects the obtained clusters onto one-dimensional axes
to form the fuzzy sets (centers and widths) in the antecedent parts of the rules.
Hereby, the number of clusters (= rules) is predefined and serves as an upper bound on a reasonable granularity. The second phase optimizes the rule weights of the fuzzy system with respect to a least-squares error measure by applying a sparsity-constrained steepest-descent optimization procedure. Depending on the sparsity threshold, the weights of many or few rules can be forced toward 0, thereby switching off (eliminating) some rules (rule selection).
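The sparsity-constrained steepest-descent step of the second phase can be illustrated with a minimal ISTA-style sketch in NumPy. The function name `sparse_rule_weights`, the soft-thresholding projection, and the clipping of rule weights to [0, 1] are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

def sparse_rule_weights(Phi, y, lam=0.1, lr=None, n_iter=500):
    """Sparsity-constrained steepest descent on rule weights (illustrative sketch).

    Phi : (N, C) matrix of per-rule basis-function activations
    y   : (N,) target vector
    lam : sparsity threshold; larger values force more rule weights to 0
    """
    N, C = Phi.shape
    if lr is None:
        # step size from the Lipschitz constant of the least-squares gradient
        lr = 1.0 / np.linalg.norm(Phi, 2) ** 2
    w = np.full(C, 0.5)  # rule weights, initialized mid-range
    for _ in range(n_iter):
        grad = Phi.T @ (Phi @ w - y)        # gradient of the LS error measure
        w = w - lr * grad                   # steepest-descent step
        # soft threshold: small weights are driven to exactly 0 (rule elimination)
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)
        w = np.clip(w, 0.0, 1.0)            # keep rule weights in [0, 1]
    return w
```

Rules whose weights end up at exactly 0 are switched off, which is the rule-selection effect described above.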
The third phase estimates the linear consequent parameters by a regularized, sparsity-constrained optimization procedure for each rule separately (local learning approach). Sparsity constraints are applied in order to force linear parameters to 0, triggering a feature-selection mechanism in each rule. Global feature selection is achieved whenever the linear parameters of some features are (near) 0 in every rule.
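The local learning idea of the third phase — a separate, regularized least-squares fit per rule, weighted by rule activation — can be sketched as follows. The function name `local_consequents`, the Tikhonov regularizer, and the simple coefficient thresholding used to stand in for the sparsity constraint are assumptions for illustration:

```python
import numpy as np

def local_consequents(X, y, memberships, alpha=1e-3, thresh=0.0):
    """Per-rule linear consequent estimation (local learning sketch).

    X           : (N, p) regressors, first column all ones for the intercept
    y           : (N,) targets
    memberships : (N, C) normalized rule activation levels
    alpha       : Tikhonov (ridge) regularization strength
    thresh      : coefficients with |c| <= thresh are zeroed (feature selection)
    """
    N, p = X.shape
    C = memberships.shape[1]
    coeffs = np.zeros((C, p))
    for i in range(C):
        D = memberships[:, i]                            # activation of rule i
        A = X.T @ (D[:, None] * X) + alpha * np.eye(p)   # weighted normal matrix
        b = X.T @ (D * y)
        c = np.linalg.solve(A, b)                        # regularized local LS fit
        c[np.abs(c) <= thresh] = 0.0                     # sparsify: drop weak features
        coeffs[i] = c
    return coeffs
```

Fitting each rule against its own activation-weighted data is what makes the approach "local"; features whose coefficients are zeroed in every rule are effectively deselected globally.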
The method is evaluated on high-dimensional data from industrial processes and on publicly available benchmark data sets,
and is compared to well-known batch training methods in terms of accuracy and complexity of the resulting fuzzy
systems.
| Original language | English |
| --- | --- |
| Article number | 5411778 |
| Pages (from-to) | 396-411 |
| Number of pages | 16 |
| Journal | IEEE Transactions on Fuzzy Systems |
| Volume | 18 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - Apr 2010 |
Fields of science
- 101 Mathematics
- 101004 Biomathematics
- 101027 Dynamical systems
- 101013 Mathematical logic
- 101028 Mathematical modelling
- 101014 Numerical mathematics
- 101020 Technical mathematics
- 101024 Probability theory
- 102001 Artificial intelligence
- 102003 Image processing
- 102009 Computer simulation
- 102019 Machine learning
- 102023 Supercomputing
- 202027 Mechatronics
- 206001 Biomedical engineering
- 206003 Medical physics
- 102035 Data science
JKU Focus areas
- Engineering and Natural Sciences (in general)
- Digital Transformation