Abstract
Arguably the key reason for the success of deep neural networks is their ability to autonomously form non-linear combinations of the input features, which can be used in subsequent layers of the network. The analogue to this capability in inductive rule learning is to learn a structured rule base, where the inputs are combined to learn new auxiliary concepts, which can then be used as inputs by subsequent rules. Yet, research on rule learning algorithms with such capabilities is still in its infancy, which is, we would argue, one of the key impediments to substantial progress in this field. In this position paper, we want to draw attention to this unsolved problem, with a particular focus on previous work in predicate invention and multi-label rule learning.
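To make the idea of a structured rule base concrete, the following is a minimal, hand-written sketch (not taken from the paper): an auxiliary concept is defined as a combination of the raw input features and is then reused as an input by a subsequent rule, much like a hidden-layer feature feeds later layers of a network. The feature and concept names are purely hypothetical.

```python
# Illustrative sketch of a structured rule base (hypothetical feature names).
# The auxiliary concept combines raw inputs; a subsequent rule reuses it.

def aux_high_risk(example):
    # auxiliary (invented) concept formed from the raw input features
    return example["smoker"] and example["age"] > 50

def rule_recommend_screening(example):
    # subsequent rule that takes the auxiliary concept as one of its inputs
    return aux_high_risk(example) or example["family_history"]

if __name__ == "__main__":
    patient = {"smoker": True, "age": 62, "family_history": False}
    print(rule_recommend_screening(patient))  # True, via the auxiliary concept
```

In a learned structured rule base, both the auxiliary concept and the rules that use it would be induced from data rather than written by hand; the sketch only illustrates the layered structure the abstract refers to.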
Field | Value
---|---
Original language | English
Title of host publication | Proceedings of the 2nd Workshop on Deep Continuous-Discrete Machine Learning (DeCoDeML)
Editors | Kristian Kersting, Stefan Kramer, Zahra Ahmadi
Number of pages | 6
Publication status | Published - 2020
Fields of science
- 102019 Machine learning
- 102033 Data mining
JKU Focus areas
- Digital Transformation