On the Potential of Deep Symbolic Models for Classification Problems

Florian Beck*, Johannes Fürnkranz, Van Quoc Phuong Huynh

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

With the rise of neural network approaches for machine learning problems, the focus has shifted to learning deep concepts across multiple layers. Although a single hidden layer suffices to approximate any function to arbitrary precision, multilayer networks have proven superior in predictive performance for many applications. For symbolic machine learning approaches such as decision tree or rule learning algorithms, in contrast, the benefits of hidden layers of learned intermediate concepts remain uncertain. In this work, we empirically investigate the potential gains of deep concepts for symbolic approaches from three perspectives. First, we compare the number of possible flat and deep Boolean expressions with limited complexity, underlining the higher expressive power of deep models in such a setting. Second, we use logic minimization algorithms to generate minimal flat and deep Boolean formulas for artificial Boolean classification problems with different numbers of attributes and training examples, showing under which circumstances the use of deep concepts can lead to noticeably less complex models. Third, we compare the predictive performance of flat and deep models with a fixed maximum complexity on these datasets. We interpret these results as evidence that encourages further investigation of algorithms for learning complexity-bounded deep rule sets.
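As a concrete illustration of the flat-versus-deep complexity gap discussed in the abstract (this example is not taken from the paper itself), consider the classic parity function: its minimal flat DNF needs one conjunction per positive example, i.e. exponentially many terms, whereas a layered formulation that introduces intermediate concepts h1 = x1 XOR x2, h_i = h_{i-1} XOR x_{i+1} grows only linearly. The sketch below counts literals in both representations; all function names are illustrative.

```python
from itertools import product

def parity(bits):
    """Return 1 if the number of set bits is odd, else 0."""
    return sum(bits) % 2

def flat_dnf_literals(n):
    # No two positive examples of parity differ in only one bit,
    # so no minterms can be merged: the minimal DNF consists of
    # all 2**(n-1) positive minterms, each with n literals.
    positives = [x for x in product([0, 1], repeat=n) if parity(x)]
    return len(positives) * n

def deep_literals(n):
    # Layered form: n - 1 XOR gates, each expanding to
    # (a AND NOT b) OR (NOT a AND b) = 4 literals.
    return (n - 1) * 4

for n in range(2, 7):
    print(n, flat_dnf_literals(n), deep_literals(n))
# The flat size grows as n * 2**(n-1), the deep size as 4 * (n - 1),
# e.g. for n = 6: 192 flat literals versus 20 deep literals.
```

This is exactly the kind of case where, as the abstract puts it, deep intermediate concepts can yield "noticeably less complex models" under a fixed complexity budget.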
Original language: English
Title of host publication: Proceedings of the 28th International Conference on Discovery Science (DS)
Editors: Sašo Džeroski, Jurica Levatić, Gianvito Pio, Nikola Simidjievski
Place of Publication: Ljubljana, Slovenia
Publisher: Springer, Cham
Pages: 161-175
Number of pages: 15
ISBN (Print): 9783032054609
DOIs
Publication status: Published - 2025

Publication series

Name: Lecture Notes in Artificial Intelligence
Publisher: Springer, Cham

Fields of science

  • 102001 Artificial intelligence
  • 102035 Data science
  • 102033 Data mining
  • 102019 Machine learning
  • 102028 Knowledge engineering
  • 102031 Theoretical computer science

JKU Focus areas

  • Digital Transformation
