Combining Conflict-Driven Clause Learning and Chronological Backtracking for Propositional Model Counting

Sibylle Möhle, Armin Biere

Research output: Chapter in Book/Report/Conference proceeding › Conference proceedings › peer-review

Abstract

In propositional model counting, also known as #SAT, the search space must be explored exhaustively, in contrast to SAT, where the task is only to determine whether a propositional formula is satisfiable. While state-of-the-art SAT solvers are based on non-chronological backtracking, it has also been shown that backtracking chronologically does not significantly degrade solver performance. Investigating the combination of chronological backtracking with conflict-driven clause learning (CDCL) for #SAT is therefore a natural next step. We present a calculus for #SAT combining chronological backtracking with CDCL and provide a formal proof of its correctness.
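To illustrate the #SAT task the abstract describes, the sketch below counts all satisfying assignments of a small CNF formula by exhaustive enumeration. This is only a minimal illustration of what model counting computes; it is not the paper's calculus, which avoids full enumeration by combining CDCL with chronological backtracking. The function name and DIMACS-style clause encoding are choices made here for the example.

```python
from itertools import product

def count_models(clauses, num_vars):
    """Count satisfying assignments (#SAT) of a CNF formula by brute force.

    `clauses` is a list of clauses; each clause is a list of non-zero
    integers in DIMACS style: v means variable v is true, -v means false.
    Exhaustive enumeration is exponential in num_vars and serves only to
    make the counting task concrete.
    """
    count = 0
    for bits in product([False, True], repeat=num_vars):
        # bits[i - 1] is the truth value assigned to variable i (1-based)
        if all(any((lit > 0) == bits[abs(lit) - 1] for lit in clause)
               for clause in clauses):
            count += 1
    return count

# (x1 or x2) and (not x1 or x2): satisfied exactly when x2 is true,
# with x1 free, so there are 2 models.
print(count_models([[1, 2], [-1, 2]], 2))
```

Unlike a SAT solver, which may stop at the first model found, a #SAT procedure must account for every model, which is why the search space has to be explored exhaustively.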
Original language: English
Title of host publication: 5th Global Conference on Artificial Intelligence
Editors: Diego Calvanese, Luca Iocchi
Number of pages: 14
Publication status: Published - 2019

Publication series

Name: EasyChair EPiC Series in Computing

Fields of science

  • 102 Computer Sciences
  • 102001 Artificial intelligence
  • 102011 Formal languages
  • 102022 Software development
  • 102031 Theoretical computer science
  • 603109 Logic
  • 202006 Computer hardware

JKU Focus areas

  • Digital Transformation
