Autoregressive activity prediction for low-data drug discovery

Johannes Schimunek, Lukas Friedrich, Daniel Kuhn, Günter Klambauer

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

Autoregressive modeling is the main learning paradigm behind today's highly successful large language models (LLMs). For sequential tasks, such as generating natural language, autoregressive modeling is a natural choice: the sequence is generated by repeatedly appending the next token. In this work, we investigate whether the autoregressive modeling paradigm can also be used successfully for molecular activity and property prediction models, which play a role in the molecular sciences analogous to that of LLMs. To this end, we formulate autoregressive activity prediction modeling (AR-APM), draw relations to transductive and active learning, and assess the predictive quality of AR-APM models in few-shot learning scenarios. Our experiments show that using an existing few-shot learning system without any other changes, except switching to autoregressive mode for inference, improves ∆AUC-PR by up to ∼40%. Code is available here: https://github.com/ml-jku/autoregressive_activity_prediction.
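The core idea of autoregressive inference for activity prediction can be sketched as follows: at each step, the few-shot model scores the remaining query molecules, the most confident prediction is converted into a pseudo-label, and that molecule is appended to the support set before the next step. This is a minimal illustrative sketch only; the cosine-similarity few-shot scorer below is a hypothetical stand-in, not the few-shot learning system used in the paper.

```python
import numpy as np

def fewshot_scores(support_x, support_y, query_x):
    """Score queries by similarity-weighted support labels (stand-in model).

    Assumes rows of support_x / query_x are unit-norm feature vectors,
    so the dot product is cosine similarity. Positive score -> active.
    """
    sims = query_x @ support_x.T
    signed = np.where(support_y == 1, 1.0, -1.0)
    return sims @ signed / len(support_y)

def autoregressive_predict(support_x, support_y, query_x):
    """Label all queries one at a time, growing the support set with
    pseudo-labeled molecules (autoregressive-style inference)."""
    support_x = support_x.copy()
    support_y = list(support_y)
    remaining = list(range(len(query_x)))
    preds = np.zeros(len(query_x), dtype=int)
    while remaining:
        scores = fewshot_scores(support_x, np.array(support_y),
                                query_x[remaining])
        k = int(np.argmax(np.abs(scores)))   # most confident query
        idx = remaining.pop(k)
        label = int(scores[k] > 0)
        preds[idx] = label
        # append the pseudo-labeled molecule to the support set
        support_x = np.vstack([support_x, query_x[idx]])
        support_y.append(label)
    return preds
```

In contrast to standard (parallel) inference, each prediction here conditions on all previously predicted labels, which is what relates this scheme to transductive learning.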
Original language: English
Title of host publication: PML4LRS workshop @ ICLR 2024
Number of pages: 12
Publication status: Published - Mar 2024

Fields of science

  • 305907 Medical statistics
  • 202017 Embedded systems
  • 202036 Sensor systems
  • 101004 Biomathematics
  • 101014 Numerical mathematics
  • 101015 Operations research
  • 101016 Optimisation
  • 101017 Game theory
  • 101018 Statistics
  • 101019 Stochastics
  • 101024 Probability theory
  • 101026 Time series analysis
  • 101027 Dynamical systems
  • 101028 Mathematical modelling
  • 101029 Mathematical statistics
  • 101031 Approximation theory
  • 102 Computer Sciences
  • 102001 Artificial intelligence
  • 102003 Image processing
  • 102004 Bioinformatics
  • 102013 Human-computer interaction
  • 102018 Artificial neural networks
  • 102019 Machine learning
  • 102032 Computational intelligence
  • 102033 Data mining
  • 305901 Computer-aided diagnosis and therapy
  • 305905 Medical informatics
  • 202035 Robotics
  • 202037 Signal processing
  • 103029 Statistical physics
  • 106005 Bioinformatics
  • 106007 Biostatistics

JKU Focus areas

  • Digital Transformation
