TY - GEN
T1 - Combining Conflict-Driven Clause Learning and Chronological Backtracking for Propositional Model Counting
AU - Möhle, Sibylle
AU - Biere, Armin
PY - 2019
N2 - In propositional model counting, also named #SAT, the search space needs to be explored exhaustively, in contrast to SAT, where the task is to determine whether a propositional formula is satisfiable. While state-of-the-art SAT solvers are based on non-chronological backtracking, it has also been shown that backtracking chronologically does not significantly degrade solver performance. Hence investigating the combination of chronological backtracking with conflict-driven clause learning (CDCL) for #SAT seems evident. We present a calculus for #SAT combining chronological backtracking with CDCL and provide a formal proof of its correctness.
M3 - Conference proceedings
T3 - EasyChair EPiC series in Computing
BT - 5th Global Conference on Artificial Intelligence
A2 - Calvanese, Diego
A2 - Iocchi, Luca
ER -