Improving the Robustness of Recursive Consequent Parameters Learning in Evolving Neuro-Fuzzy Systems

Edwin Lughofer

Research output: Contribution to journal › Article › peer-review

Abstract

During the last 15 to 20 years, evolving (neuro-)fuzzy systems (E(N)FS) have attracted more and more attention in the context of data stream mining and modeling. This is because they can be updated on the fly in a single-pass, sample-wise manner and are able to perform autonomous changes of the model structure in order to react to process drifts. A wide variety of E(N)FS approaches have been proposed to handle data stream mining and modeling by dynamically updating the rule structure and antecedents. The common denominator in the update of the consequent (output weight) parameters is the recursive (fuzzily weighted) least squares estimator (R(FW)LS), which is applied in almost all E(N)FS approaches. In this paper, we propose and examine alternative variants for the consequent parameter update, namely multi-innovation RFWLS, recursive correntropy and especially recursive weighted total least squares (RWTLS). Multi-innovation RFWLS guarantees more stability in the update whenever structural changes (i.e. changes in the antecedents) of the E(N)FS are performed, because the rule membership degrees are re-evaluated on (a portion of) past samples and properly integrated in each update step. Recursive correntropy addresses the problem of outliers by down-weighting the influence of higher errors in the parameter updates. Recursive weighted total least squares additionally takes into account a possible noise level in the input variables (and not solely in the target variable, as done in RFWLS). The approaches are compared with standard RFWLS (i) on three data stream regression problems from practical applications, which are affected by noise and one of which embeds a known drift, and (ii) on a time-series-based forecasting problem. The results, based on accumulated prediction error trends over time, indicate that RFWLS can be largely outperformed by the proposed alternative variants, with even lower sensitivity to various data noise levels. The proposed variants are therefore worth further consideration as promising and serious alternatives.
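For orientation, the baseline against which the proposed variants are compared is the standard per-rule RFWLS recursion. The sketch below is a minimal, generic Python formulation of that step, not the paper's exact implementation; the function name, the intercept handling and the forgetting factor `lam` are illustrative assumptions.

```python
import numpy as np

def rfwls_update(w, P, x, y, psi, lam=1.0):
    """One recursive fuzzily weighted least squares (RFWLS) step for a single rule.

    w   : current consequent parameter vector of the rule, shape (d+1,)
    P   : current inverse-Hessian (covariance-like) matrix, shape (d+1, d+1)
    x   : new input sample, shape (d,)
    y   : new target value (scalar)
    psi : membership degree of the sample in this rule, in (0, 1]
    lam : forgetting factor (1.0 = no forgetting)
    Returns the updated (w, P).
    """
    r = np.append(x, 1.0)                       # regressor vector with intercept term
    denom = lam / max(psi, 1e-12) + r @ P @ r   # fuzzy weighting via the membership degree
    gamma = (P @ r) / denom                     # Kalman-like gain vector
    w_new = w + gamma * (y - r @ w)             # correct parameters by the prediction error
    P_new = (np.eye(len(r)) - np.outer(gamma, r)) @ P / lam
    return w_new, P_new
```

In such a scheme each rule keeps its own pair (w, P), with P typically initialized to a large multiple of the identity matrix; the variants discussed in the paper modify this recursion (multiple past innovations, correntropy-based error weighting, or total least squares treatment of input noise) rather than replace its per-rule, sample-wise character.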
Original language: English
Pages (from-to): 555-574
Number of pages: 20
Journal: Information Sciences
Volume: 545
Publication status: Published - 2021

Fields of science

  • 101 Mathematics
  • 101013 Mathematical logic
  • 101024 Probability theory
  • 102001 Artificial intelligence
  • 102003 Image processing
  • 102019 Machine learning
  • 102035 Data science
  • 603109 Logic
  • 202027 Mechatronics

JKU Focus areas

  • Digital Transformation
