Special Issue "Fast Learning of Neural Networks with Application to Big Data Processes"

  • Mu-Yen Chen (Other)
  • Edwin Lughofer (Other)
  • Yongping Pan (Other)
  • Jianbin Qiu (Other)
  • Jose de Jesus Rubio (Other)

Activity: Other

Description

Fast learning of neural networks, especially as applied to big data processes, has recently gained wide attention, with successful showcases in areas such as classification, prediction, pattern recognition, and identification. There are many research directions for realizing fast learning in neural networks that have proven successful in various domains, such as incremental learning of neurons and parameters, evolving techniques over time (in single-pass, sample-wise update mode) with changing structure allowed, shallow neural networks with only a single hidden layer (e.g., extreme learning machines), online learning with elastic memory, deep learning, unsupervised learning, optimization algorithms faster than pure backpropagation, or boosting of neural networks that combines small weak learners (small neural networks). This special issue collected seven high-quality papers reporting performance results related to some of these emerging research directions for realizing fast learning in neural networks.
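
Of the directions named above, extreme learning machines are perhaps the simplest to illustrate: the hidden-layer weights are drawn at random and kept fixed, and only the output weights are fitted, typically by a regularized least-squares solve rather than by backpropagation. The following is a minimal sketch of that idea in NumPy; the function names and parameters are illustrative and not taken from any paper in the issue.

```python
import numpy as np

def train_elm(X, y, n_hidden=100, reg=1e-3, seed=0):
    """Minimal extreme learning machine sketch: random, fixed hidden
    weights; only the output weights `beta` are fitted by
    ridge-regularized least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    # Solve (H^T H + reg * I) beta = H^T y for the output weights
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Usage: fit a noisy sine curve on synthetic data
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * np.random.default_rng(1).normal(size=200)
W, b, beta = train_elm(X, y)
print("train MSE:", np.mean((predict_elm(X, W, b, beta) - y) ** 2))
```

Because training reduces to a single linear solve, fitting is orders of magnitude faster than iterative gradient descent, which is the property that makes such one-layer networks attractive for big data settings.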
Period: 20 Dec 2019

Fields of science

  • 101013 Mathematical logic
  • 101024 Probability theory
  • 202027 Mechatronics
  • 102019 Machine learning
  • 603109 Logic
  • 101 Mathematics
  • 102035 Data science
  • 102001 Artificial intelligence
  • 102003 Image processing

JKU Focus areas

  • Digital Transformation