Unsupervised Continual Learning via Self-Adaptive Deep Clustering Approach

Mahardhika Pratama, Andri Ashfahani, Edwin Lughofer

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

Unsupervised continual learning remains a relatively uncharted territory in the existing literature because the vast majority of existing works call for unlimited access to ground truth, incurring expensive labelling costs. Another issue lies in the requirement for task boundaries and task IDs, which must be known for model updates or model predictions, hindering feasibility for real-time deployment. Knowledge Retention in Self-Adaptive Deep Continual Learner (KIERA) is proposed in this paper. KIERA is developed from the notion of a flexible deep clustering approach possessing an elastic network structure to cope with changing environments in a timely manner. A centroid-based experience replay is put forward to overcome the catastrophic forgetting problem. KIERA does not exploit any labelled samples for model updates while featuring a task-agnostic merit. The advantage of KIERA has been numerically validated in popular continual learning problems, where it shows highly competitive performance compared to state-of-the-art approaches.
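
The centroid-based experience replay mentioned in the abstract can be illustrated with a minimal sketch: past clusters are summarised by their centroids, and these centroids are replayed alongside new data when the model is updated on a later task. The class name `CentroidReplayBuffer`, the running-mean update, and the `replay_batch` method below are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal, illustrative sketch of centroid-based experience replay for an
# unsupervised continual learner. All names and the simple running-mean
# update are assumptions for illustration only, not taken from the paper.
import numpy as np


class CentroidReplayBuffer:
    """Stores one centroid (plus a sample count) per discovered cluster."""

    def __init__(self):
        self.centroids = {}   # cluster_id -> running-mean centroid
        self.counts = {}      # cluster_id -> number of samples absorbed

    def update(self, cluster_id, x):
        """Fold a new latent sample x into the running centroid of its cluster."""
        x = np.asarray(x, dtype=float)
        if cluster_id not in self.centroids:
            self.centroids[cluster_id] = x.copy()
            self.counts[cluster_id] = 1
        else:
            self.counts[cluster_id] += 1
            n = self.counts[cluster_id]
            self.centroids[cluster_id] += (x - self.centroids[cluster_id]) / n

    def replay_batch(self):
        """Return all stored centroids; these would be mixed into the current
        update so clusters learned on earlier tasks are not forgotten."""
        return np.stack(list(self.centroids.values())) if self.centroids else None


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    buffer = CentroidReplayBuffer()
    # Pretend an earlier task produced two clusters in a 2-D latent space.
    for _ in range(50):
        buffer.update(0, rng.normal(loc=[0.0, 0.0], scale=0.1, size=2))
        buffer.update(1, rng.normal(loc=[3.0, 3.0], scale=0.1, size=2))
    # When training on a new task, the stored centroids are replayed
    # alongside new samples to mitigate catastrophic forgetting.
    print(buffer.replay_batch())
```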
Original language: English
Title of host publication: Proceedings of the International Workshop on Continual Semi-Supervised Learning
Number of pages: 14
Volume: 13418
Publication status: Published - Sept 2022

Publication series

Name: Lecture Notes in Computer Science (LNCS)

Fields of science

  • 101 Mathematics
  • 101013 Mathematical logic
  • 101024 Probability theory
  • 102001 Artificial intelligence
  • 102003 Image processing
  • 102019 Machine learning
  • 102035 Data science
  • 603109 Logic
  • 202027 Mechatronics

JKU Focus areas

  • Digital Transformation
