Exploring Diverse Solutions for Underdetermined Problems

Research output: Chapter in Book/Report/Conference proceeding › Poster › peer-review

Abstract

This work explores the utility of a recently proposed diversity loss in training generative, theory-informed models on underdetermined problems with multiple solutions. Unlike data-driven methods, theory-informed learning often operates in data-free settings, optimizing neural networks to satisfy objectives and constraints. We demonstrate how this diversity loss encourages the generation of diverse solutions across various example problems, effectively avoiding mode collapse and enabling exploration of the solution space.
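The poster's specific diversity loss is not reproduced here. As a minimal sketch of the general idea, assume a theory-informed loss (a constraint residual) combined with a generic repulsive pairwise-distance term between candidate solutions; the toy problem, the loss form, and all weights below are illustrative assumptions, not the method from the poster.

```python
import numpy as np

# Toy underdetermined problem: find x with x**2 = 1, which has two
# solutions (+1 and -1). The theory loss penalizes the constraint
# residual; the diversity term is an assumed generic form (negative
# mean pairwise distance between candidates), not the poster's loss.

x = np.array([0.08, -0.05, 0.12, -0.1, 0.03, -0.02, 0.06, -0.09])  # candidates near 0
lam, lr = 0.5, 0.01  # diversity weight and learning rate (illustrative choices)

def gradient(x):
    # d/dx_i of the theory loss sum_i (x_i**2 - 1)**2
    g_theory = 4.0 * x * (x**2 - 1.0)
    # d/dx_i of the diversity loss -lam/N * sum_j |x_i - x_j| (repulsive term)
    g_div = -lam * np.sign(x[:, None] - x[None, :]).sum(axis=1) / len(x)
    return g_theory + g_div

for _ in range(500):  # plain gradient descent on the combined loss
    x = x - lr * gradient(x)

# The candidates now sit spread across both roots rather than collapsing
# onto a single solution.
```

In this sketch the repulsive term pushes nearby candidates apart, so the final set covers both roots and contains no near-duplicates; dropping it would let all candidates cluster tightly around whichever solutions their initialization happens to favor.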
Original language: English
Title of host publication: ICML 2025 Workshop on Methods and Opportunities at Small Scale
Number of pages: 6
Edition: 1
Publication status: Published - 10 Jun 2025

Fields of science

  • 101019 Stochastics
  • 102003 Image processing
  • 103029 Statistical physics
  • 101018 Statistics
  • 101017 Game theory
  • 102001 Artificial intelligence
  • 202017 Embedded systems
  • 101016 Optimisation
  • 101015 Operations research
  • 101014 Numerical mathematics
  • 101029 Mathematical statistics
  • 101028 Mathematical modelling
  • 101026 Time series analysis
  • 101024 Probability theory
  • 102032 Computational intelligence
  • 102004 Bioinformatics
  • 102013 Human-computer interaction
  • 101027 Dynamical systems
  • 305907 Medical statistics
  • 101004 Biomathematics
  • 305905 Medical informatics
  • 101031 Approximation theory
  • 102033 Data mining
  • 102 Computer Sciences
  • 305901 Computer-aided diagnosis and therapy
  • 102019 Machine learning
  • 106007 Biostatistics
  • 102018 Artificial neural networks
  • 106005 Bioinformatics
  • 202037 Signal processing
  • 202036 Sensor systems
  • 202035 Robotics

JKU Focus areas

  • Digital Transformation
