On Recursive Marginal and MAP Inference in State Observation Models

  • Branislav Rudic*
  • Valentin Sturm
  • Dmitry Efrosinin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference proceeding › peer-review

Abstract

Maximum A Posteriori (MAP) inference in state observation models typically covers decoding either the marginal MAP state estimates or the joint MAP state sequence estimate. This paper addresses a novel yet fundamental MAP inference method denoted as predecessor decoding. This method recursively decodes the most probable predecessors of a chosen initial state using only the marginal distributions from a forward filtering pass. We elaborate on the motivations, abstract relations and analogues, and in particular the differences between marginal MAP, joint MAP, and MAP predecessors. We conclude by comparing recent results in which predecessor decoding has been applied to Gaussian mixture models.
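The abstract's core idea can be illustrated on a discrete hidden Markov model: run a forward filtering pass to obtain the marginals, then, starting from a chosen state, recursively pick the most probable predecessor using only those filtered marginals and the transition matrix (no backward pass). The following is a minimal plain-Python sketch of this interpretation; the model, variable names, and exact decoding rule are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of predecessor decoding in a discrete HMM.
# All names (pi, A, B) and the concrete decoding rule are
# illustrative assumptions, not the paper's definitions.

def forward_filter(pi, A, B, obs):
    """Forward pass: filtered marginals p(x_t | y_1..t) for each step.

    pi: initial state distribution, A: transition matrix A[j][i] =
    p(x_t = i | x_{t-1} = j), B: emission matrix B[i][y].
    """
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    z = sum(alpha)
    alphas = [[a / z for a in alpha]]
    for y in obs[1:]:
        alpha = alphas[-1]
        # Predict one step ahead, then condition on the observation.
        pred = [sum(alpha[j] * A[j][i] for j in range(n)) for i in range(n)]
        post = [pred[i] * B[i][y] for i in range(n)]
        z = sum(post)
        alphas.append([p / z for p in post])
    return alphas

def decode_predecessors(alphas, A, x_last):
    """Recursively decode the most probable predecessor of a chosen
    state, using only the filtered marginals (no backward smoothing)."""
    n = len(A)
    path = [x_last]
    x = x_last
    for t in range(len(alphas) - 2, -1, -1):
        # MAP predecessor under filtered marginal * transition weight.
        x = max(range(n), key=lambda j: alphas[t][j] * A[j][x])
        path.append(x)
    return path[::-1]
```

Note that, unlike Viterbi backtracking, this recursion reuses only quantities already available after filtering, which is what makes it attractive as a purely recursive inference scheme.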
Original language: English
Title of host publication: Information Technologies and Mathematical Modelling. Queueing Theory and Related Fields - 23rd International Conference, ITMM 2024, Revised Selected Papers
Subtitle of host publication: 23rd International Conference, ITMM 2024, Karshi, Uzbekistan, October 20–26, 2024, Revised Selected Papers
Editors: Alexander Dudin, Anatoly Nazarov, Alexander Moiseev
Publisher: Springer Nature
Pages: 85-97
Number of pages: 13
Edition: 1
ISBN (Print): 978-3-031-88306-4
DOIs
Publication status: Published - 06 May 2025

Publication series

Name: Communications in Computer and Information Science
Volume: 2472 CCIS
ISSN (Print): 1865-0929
ISSN (Electronic): 1865-0937

Fields of science

  • 203 Mechanical Engineering
  • 202034 Control engineering
  • 101 Mathematics
  • 202027 Mechatronics
  • 203033 Hydraulic drive technology
  • 202 Electrical Engineering, Electronics, Information Engineering
  • 202009 Electrical drive engineering
  • 202036 Sensor systems
  • 102 Computer Sciences
  • 101027 Dynamical systems
  • 102003 Image processing
  • 102023 Supercomputing
  • 102001 Artificial intelligence
  • 101004 Biomathematics
  • 102035 Data science
  • 101014 Numerical mathematics
  • 101028 Mathematical modelling
  • 101013 Mathematical logic
  • 102009 Computer simulation
  • 102019 Machine learning
  • 101024 Probability theory
  • 206003 Medical physics
  • 206001 Biomedical engineering
  • 101020 Technical mathematics
  • 101019 Stochastics
  • 101018 Statistics

JKU Focus areas

  • Digital Transformation
