Algorithms for solving SAT

Since the SAT problem is NP-complete, only algorithms with exponential worst-case complexity are known for it. In spite of this, efficient and scalable algorithms for SAT were developed during the 2000s and have contributed to dramatic advances in our ability to automatically solve problem instances involving tens of thousands of variables and millions of constraints (i.e. clauses).[1] Examples of such problems in electronic design automation (EDA) include formal equivalence checking, model checking, formal verification of pipelined microprocessors,[2] automatic test pattern generation, routing of FPGAs,[3] and planning and scheduling problems. A SAT-solving engine is now considered an essential component of the EDA toolbox.

A DPLL SAT solver employs a systematic backtracking search procedure to explore the (exponentially sized) space of variable assignments looking for satisfying assignments. The basic search procedure was proposed in two seminal papers in the early 1960s (see references below) and is now commonly referred to as the Davis–Putnam–Logemann–Loveland algorithm ("DPLL" or "DLL").[4][5] Many modern approaches to practical SAT solving are based on the DPLL algorithm and share the same structure. Often they improve efficiency only on certain classes of SAT problems, such as instances that appear in industrial applications or randomly generated instances.[6] Theoretically, exponential lower bounds have been proved for the DPLL family of algorithms.
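
A minimal DPLL-style sketch in Python may help make the procedure concrete. The clause encoding (lists of signed integers) and all function names are this illustration's own rather than anything taken from the original papers, and real solvers add many refinements on top of this skeleton.

```python
# Minimal DPLL sketch: unit propagation plus branching with backtracking.
# A formula is a list of clauses; a clause is a list of non-zero integers,
# where -x denotes the negation of variable x.

def unit_propagate(clauses, assignment):
    """Repeatedly assign variables forced by unit clauses; None on conflict."""
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(assignment.get(abs(lit)) == (lit > 0) for lit in clause):
                continue                      # clause already satisfied
            unassigned = [lit for lit in clause if abs(lit) not in assignment]
            if not unassigned:
                return None                   # all literals false: conflict
            if len(unassigned) == 1:          # unit clause: forced assignment
                lit = unassigned[0]
                assignment[abs(lit)] = lit > 0
                changed = True
    return assignment

def dpll(clauses, assignment=None):
    """Return a satisfying assignment (dict var -> bool) or None."""
    assignment = dict(assignment or {})
    if unit_propagate(clauses, assignment) is None:
        return None
    unassigned = {abs(l) for c in clauses for l in c} - set(assignment)
    if not unassigned:
        return assignment
    var = next(iter(unassigned))              # naive branching heuristic
    for value in (True, False):               # try both values, backtracking
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
print(dpll([[1, 2], [-1, 3], [-2, -3]]))
```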

Algorithms that are not part of the DPLL family include stochastic local search algorithms. One example is WalkSAT. Stochastic methods try to find a satisfying interpretation but cannot deduce that a SAT instance is unsatisfiable, as opposed to complete algorithms, such as DPLL.[6]
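
The following WalkSAT-flavored sketch uses the same clause encoding as the DPLL sketch above; the noise parameter and the simplified greedy step are this example's assumptions rather than the exact published heuristic, which scores flips by their "break count".

```python
# WalkSAT-style stochastic local search (illustrative sketch).
import random

def walksat(clauses, max_flips=10000, noise=0.5, seed=0):
    rng = random.Random(seed)
    variables = {abs(l) for c in clauses for l in c}
    assignment = {v: rng.choice([True, False]) for v in variables}

    def satisfied(clause):
        return any(assignment[abs(l)] == (l > 0) for l in clause)

    for _ in range(max_flips):
        unsatisfied = [c for c in clauses if not satisfied(c)]
        if not unsatisfied:
            return assignment                  # found a model
        clause = rng.choice(unsatisfied)       # pick a violated clause
        if rng.random() < noise:
            var = abs(rng.choice(clause))      # random walk step
        else:
            def flips_left_unsatisfied(v):     # simplified greedy score
                assignment[v] = not assignment[v]
                score = sum(1 for c in clauses if not satisfied(c))
                assignment[v] = not assignment[v]
                return score
            var = min((abs(l) for l in clause), key=flips_left_unsatisfied)
        assignment[var] = not assignment[var]
    return None   # gave up; this says nothing about unsatisfiability

print(walksat([[1, 2], [-1, 3], [-2, -3]]))
```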

In contrast, randomized algorithms like the PPSZ algorithm by Paturi, Pudlák, Saks, and Zane set variables in a random order according to some heuristics, for example bounded-width resolution. If the heuristic cannot determine the value of a variable, it is assigned randomly. The PPSZ algorithm has a runtime of O(1.308^n) for 3-SAT, where n is the number of variables. This was the best-known runtime for this problem until a recent improvement by Hansen, Kaplan, Zamir and Zwick that has a runtime of O(1.307^n) for 3-SAT, currently the best known runtime for k-SAT for all values of k. In the setting with many satisfying assignments the randomized algorithm by Schöning has a better bound.[7][8][9]
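
A heavily simplified sketch of this idea follows; it substitutes unit-clause detection for the bounded-width resolution heuristic and does not attempt to reproduce the actual PPSZ algorithm or its running-time analysis.

```python
# PPSZ-flavored sketch: process variables in a random order, set a variable to
# a forced value when a clause has become unit on it, otherwise guess randomly.
import random

def ppsz_like(clauses, tries=10000, seed=0):
    rng = random.Random(seed)
    variables = sorted({abs(l) for c in clauses for l in c})

    def forced_value(var, assignment):
        # A clause that is unit on `var` under the partial assignment forces it.
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue                           # clause already satisfied
            others = [l for l in clause
                      if abs(l) != var and abs(l) not in assignment]
            if not others and any(abs(l) == var for l in clause):
                for l in clause:
                    if abs(l) == var:
                        return l > 0               # only way left to satisfy it
        return None

    for _ in range(tries):
        assignment = {}
        for var in rng.sample(variables, len(variables)):   # random order
            value = forced_value(var, assignment)
            assignment[var] = value if value is not None else rng.choice([True, False])
        if all(any(assignment[abs(l)] == (l > 0) for l in c) for c in clauses):
            return assignment
    return None

print(ppsz_like([[1, 2], [-1, 3], [-2, -3]]))
```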

Modern SAT solvers (developed in the 2000s) come in two flavors: "conflict-driven" and "look-ahead". Both approaches descend from DPLL.[6] Conflict-driven solvers, such as conflict-driven clause learning (CDCL), augment the basic DPLL search algorithm with efficient conflict analysis, clause learning, non-chronological backtracking (a.k.a. backjumping), as well as "two-watched-literals" unit propagation, adaptive branching, and random restarts. These "extras" to the basic systematic search have been empirically shown to be essential for handling the large SAT instances that arise in electronic design automation (EDA).[10] Well-known implementations include Chaff[11] and GRASP.[12] Look-ahead solvers feature especially strengthened reductions (going beyond unit-clause propagation) and stronger heuristics; they are generally stronger than conflict-driven solvers on hard instances, while conflict-driven solvers can be much better on large instances that contain an easy sub-problem.
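
To illustrate one of these ingredients, the following is a minimal sketch of "two-watched-literals" unit propagation. The data structures and names are this example's own, and a real solver keeps its watch lists alive across the entire search instead of rebuilding them on every call; the key property is that a clause is only touched when one of its two watched literals becomes false.

```python
# Two-watched-literals unit propagation (illustrative sketch).
from collections import defaultdict

def propagate(clauses, assignment, trail):
    """Propagate the pending literals in `trail`; return False on conflict."""
    watches = defaultdict(list)          # literal -> clauses watching it
    watched = {}                         # clause id -> its two watched literals
    for i, clause in enumerate(clauses):
        watched[i] = clause[:2] if len(clause) > 1 else clause[:1] * 2
        for lit in set(watched[i]):
            watches[lit].append(i)

    while trail:
        lit = trail.pop()                # literal that just became true
        current = assignment.get(abs(lit))
        if current is not None and current != (lit > 0):
            return False                 # conflicting implication
        assignment[abs(lit)] = lit > 0
        for i in list(watches[-lit]):    # clauses watching the falsified literal
            clause, (w1, w2) = clauses[i], watched[i]
            other = w2 if w1 == -lit else w1
            # try to move the watch to a literal that is not false
            replacement = next((l for l in clause
                                if l not in (w1, w2)
                                and assignment.get(abs(l), l > 0) == (l > 0)),
                               None)
            if replacement is not None:
                watched[i] = (other, replacement)
                watches[-lit].remove(i)
                watches[replacement].append(i)
            elif assignment.get(abs(other)) is None:
                trail.append(other)      # clause became unit: force `other`
            elif assignment[abs(other)] != (other > 0):
                return False             # both watches false: conflict
    return True

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3), with x1 decided true
assignment, trail = {}, [1]
print(propagate([[1, 2], [-1, 3], [-2, -3]], assignment, trail), assignment)
```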

Modern SAT solvers also have a significant impact on the fields of software verification, constraint solving in artificial intelligence, and operations research, among others. Powerful solvers are readily available as free and open source software. In particular, the conflict-driven MiniSAT, which was relatively successful at the 2005 SAT competition, has only about 600 lines of code. A modern parallel SAT solver is ManySAT[13]; it can achieve super-linear speed-ups on important classes of problems. An example of a look-ahead solver is march_dl, which won a prize at the 2007 SAT competition.

Certain types of large random satisfiable instances of SAT can be solved by survey propagation (SP). Particularly in hardware design and verification applications, satisfiability and other logical properties of a given propositional formula are sometimes decided based on a representation of the formula as a binary decision diagram (BDD).

Almost all SAT solvers include time-outs, so they will terminate in reasonable time even if they cannot find a solution. Different SAT solvers find different instances easy or hard; some excel at proving unsatisfiability, others at finding solutions. All of these behaviors can be seen in the SAT solving contests.[14]

Parallel SAT-Solving

Parallel SAT solvers come in three categories: portfolio, divide-and-conquer, and parallel local search. In a parallel portfolio, multiple different SAT solvers run concurrently, each solving a copy of the same SAT instance, whereas divide-and-conquer algorithms split the problem between the processors. Different approaches exist for parallelizing local search algorithms.

The International SAT Solver Competition has a parallel track reflecting recent advances in parallel SAT solving. In 2016,[15] 2017[16] and 2018,[17] the benchmarks were run on a shared-memory system with 24 processing cores; therefore, solvers intended for distributed memory or manycore processors might have fallen short.

Portfolios

In general there is no SAT solver that performs better than all other solvers on all SAT problems. An algorithm might perform well on problem instances that others struggle with, but do worse on other instances. Furthermore, given a SAT instance, there is no reliable way to predict which algorithm will solve this instance particularly fast. These limitations motivate the parallel portfolio approach. A portfolio is a set of different algorithms or different configurations of the same algorithm. All solvers in a parallel portfolio run on different processors to solve the same problem. If one solver terminates, the portfolio solver reports the problem to be satisfiable or unsatisfiable according to this one solver. All other solvers are terminated. Diversifying portfolios by including a variety of solvers, each performing well on a different set of problems, increases the robustness of the solver.[18]

Many solvers internally use a random number generator. Diversifying their seeds is a simple way to diversify a portfolio. Other diversification strategies involve enabling, disabling or diversifying certain heuristics in the sequential solver.[19]
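
A minimal sketch of a seed-diversified parallel portfolio might look as follows. The toy randomized solver merely stands in for arbitrary, differently configured real solvers, and a real portfolio would of course also be able to report unsatisfiability once a complete solver finishes.

```python
# Parallel portfolio sketch: one differently seeded solver per process,
# the first worker to return an answer wins and the others are terminated.
import multiprocessing as mp
import random

def random_search(args):
    """Toy incomplete solver: sample random assignments until one satisfies."""
    clauses, seed = args
    rng = random.Random(seed)
    variables = sorted({abs(l) for c in clauses for l in c})
    for _ in range(100000):
        a = {v: rng.choice([True, False]) for v in variables}
        if all(any(a[abs(l)] == (l > 0) for l in c) for c in clauses):
            return a
    return None

def portfolio(clauses, seeds):
    """Run the diversified solvers concurrently; report the first answer."""
    with mp.Pool(processes=len(seeds)) as pool:
        jobs = [(clauses, s) for s in seeds]
        for result in pool.imap_unordered(random_search, jobs):
            if result is not None:
                pool.terminate()         # stop the remaining solvers
                return result
    return None

if __name__ == "__main__":
    print(portfolio([[1, 2], [-1, 3], [-2, -3]], seeds=range(8)))
```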

One drawback of parallel portfolios is the amount of duplicate work. If clause learning is used in the sequential solvers, sharing learned clauses between the parallel running solvers can reduce duplicate work and increase performance. Yet, even merely running a portfolio of the best solvers in parallel makes a competitive parallel solver. An example of such a solver is PPfolio.[20][21] It was designed to find a lower bound for the performance a parallel SAT solver should be able to deliver. Despite the large amount of duplicate work due to the lack of optimizations, it performed well on a shared memory machine. HordeSat[22] is a parallel portfolio solver for large clusters of computing nodes. It uses differently configured instances of the same sequential solver at its core. Particularly for hard SAT instances, HordeSat can produce linear speedups and therefore reduce runtime significantly.

In recent years parallel portfolio SAT solvers have dominated the parallel track of the International SAT Solver Competitions. Notable examples of such solvers include Plingeling and painless-mcomsps.[23]

Divide-and-Conquer

In contrast to parallel portfolios, parallel divide-and-conquer tries to split the search space between the processing elements. Divide-and-conquer algorithms, such as the sequential DPLL, already apply the technique of splitting the search space, hence their extension towards a parallel algorithm is straightforward. However, due to techniques like unit propagation, following a division, the partial problems may differ significantly in complexity. Thus the DPLL algorithm typically does not process each part of the search space in the same amount of time, yielding a challenging load balancing problem.[18]
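
The load-balancing issue shows up even in a toy example: splitting on a single variable and then simplifying can leave partial problems of very different sizes. The formula and helper below are illustrative only.

```python
# Splitting the search space on one variable and simplifying each branch.

def simplify(clauses, lit):
    """Assign literal `lit` to true: drop satisfied clauses, shrink the rest."""
    out = []
    for clause in clauses:
        if lit in clause:
            continue                     # clause satisfied, drop it
        out.append([l for l in clause if l != -lit])
    return out

# (x1 or x2) and (x1 or x3) and (x1 or x4) and (not x1 or x5)
formula = [[1, 2], [1, 3], [1, 4], [-1, 5]]
for branch in (1, -1):                   # split the search space on x1
    part = simplify(formula, branch)
    print(f"branch x1={branch > 0}: {len(part)} clauses left: {part}")
# branch x1=True leaves 1 clause, branch x1=False leaves 3: unbalanced parts.
```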

Tree illustrating the cube phase for an example formula: the decision heuristic chooses which variables (circles) to assign; after the cutoff heuristic decides to stop further branching, the partial problems (rectangles) are solved independently using CDCL.

Due to non-chronological backtracking, parallelization of conflict-driven clause learning is more difficult. One way to overcome this is the Cube-and-Conquer paradigm.[24] It suggests solving in two phases. In the "cube" phase the problem is divided into many thousands, up to millions, of sections. This is done by a look-ahead solver that finds a set of partial configurations called "cubes". A cube can also be seen as a conjunction of literals over a subset of the variables of the original formula. In conjunction with the formula, each of the cubes forms a new formula. These formulas can be solved independently and concurrently by conflict-driven solvers. As the disjunction of these formulas is equivalent to the original formula, the problem is reported to be satisfiable if one of the formulas is satisfiable. The look-ahead solver is favorable for small but hard problems,[25] so it is used to gradually divide the problem into multiple sub-problems. These sub-problems are easier but still large, which is the ideal form for a conflict-driven solver. Furthermore, look-ahead solvers consider the entire problem, whereas conflict-driven solvers make decisions based on information that is much more local. There are three heuristics involved in the cube phase. The variables in the cubes are chosen by the decision heuristic. The direction heuristic decides which variable assignment (true or false) to explore first. In satisfiable problem instances, choosing a satisfiable branch first is beneficial. The cutoff heuristic decides when to stop expanding a cube and instead forward it to a sequential conflict-driven solver. Preferably the cubes are similarly complex to solve.[24]
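
A toy sketch of the cube and conquer phases is shown below; the branching and cutoff heuristics are deliberately naive, and a brute-force check stands in for the conflict-driven solver that would handle each residual formula in practice.

```python
# Toy cube-and-conquer: branch on a few variables to produce cubes, then
# solve each residual formula independently and combine by disjunction.
from itertools import product

def simplify(clauses, lit):
    return [[l for l in c if l != -lit] for c in clauses if lit not in c]

def make_cubes(clauses, depth):
    """Naive cutoff heuristic: branch on the first `depth` variables."""
    variables = sorted({abs(l) for c in clauses for l in c})[:depth]
    for signs in product((1, -1), repeat=len(variables)):
        cube = [s * v for s, v in zip(signs, variables)]
        residual = clauses
        for lit in cube:
            residual = simplify(residual, lit)
        yield cube, residual

def brute_force(clauses):
    """Stand-in for a CDCL solver on the (smaller) residual formula."""
    variables = sorted({abs(l) for c in clauses for l in c})
    for signs in product((True, False), repeat=len(variables)):
        a = dict(zip(variables, signs))
        if all(any(a[abs(l)] == (l > 0) for l in c) for c in clauses):
            return a
    return None if clauses else {}

formula = [[1, 2], [-1, 3], [-2, -3], [3, 4]]
for cube, residual in make_cubes(formula, depth=2):
    model = brute_force(residual)        # in practice: solved concurrently
    if model is not None:
        print("satisfiable under cube", cube, "with", model)
        break
else:
    print("unsatisfiable")
```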

Treengeling is an example of a parallel solver that applies the Cube-and-Conquer paradigm. Since its introduction in 2012 it has had multiple successes at the International SAT Solver Competition. Cube-and-Conquer was used to solve the Boolean Pythagorean triples problem.[26]

Local Search

One strategy towards a parallel local search algorithm for SAT solving is trying multiple variable flips concurrently on different processing units.[27] Another is to apply the aforementioned portfolio approach; however, clause sharing is not possible since local search solvers do not produce clauses. Alternatively, it is possible to share the configurations that are produced locally. These configurations can be used to guide the production of a new initial configuration when a local solver decides to restart its search.[28]
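
A minimal sketch of the configuration-sharing idea follows; the underlying search is a generic noisy local search rather than any particular published solver, and the shared state is simply the best assignment (fewest unsatisfied clauses) seen so far.

```python
# Parallel local search with shared configurations: workers publish their best
# assignment and, on restart, reseed from the best configuration seen so far.
import multiprocessing as mp
import random

FORMULA = [[1, 2], [-1, 3], [-2, -3], [3, 4], [-4, 2]]

def unsat_count(clauses, a):
    return sum(1 for c in clauses if not any(a[abs(l)] == (l > 0) for l in c))

def worker(seed, shared, found):
    rng = random.Random(seed)
    variables = sorted({abs(l) for c in FORMULA for l in c})
    a = {v: rng.choice([True, False]) for v in variables}
    for step in range(1, 20001):
        bad = [c for c in FORMULA if not any(a[abs(l)] == (l > 0) for l in c)]
        if not bad:
            found[0] = dict(a)                        # report the model
            return
        a[abs(rng.choice(rng.choice(bad)))] ^= True   # flip a var in a violated clause
        score = unsat_count(FORMULA, a)
        if score < shared.get("score", len(FORMULA)):
            shared["score"], shared["best"] = score, dict(a)   # publish
        if step % 5000 == 0 and "best" in shared:
            a = dict(shared["best"])                  # restart from shared config
            a[rng.choice(variables)] ^= True          # plus a small perturbation

if __name__ == "__main__":
    with mp.Manager() as m:
        shared, found = m.dict(), m.list([None])
        procs = [mp.Process(target=worker, args=(s, shared, found)) for s in range(4)]
        for p in procs: p.start()
        for p in procs: p.join()
        print(found[0])
```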

  1. ^ Ohrimenko, Olga; Stuckey, Peter J.; Codish, Michael (2007), "Propagation = Lazy Clause Generation", Principles and Practice of Constraint Programming – CP 2007, Lecture Notes in Computer Science, vol. 4741, pp. 544–558, CiteSeerX 10.1.1.70.5471, doi:10.1007/978-3-540-74970-7_39, modern SAT solvers can often handle problems with millions of constraints and hundreds of thousands of variables.
  2. ^ Bryant, R. E.; German, S.; Velev, M. N. (1999). "Microprocessor Verification Using Efficient Decision Procedures for a Logic of Equality with Uninterpreted Functions". Automated Reasoning with Analytic Tableaux and Related Methods. Lecture Notes in Computer Science. Vol. 1617. pp. 1–13.
  3. ^ Gi-Joon Nam; Sakallah, K. A.; Rutenbar, R. A. (2002). "A new FPGA detailed routing approach via search-based Boolean satisfiability" (PDF). IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems. 21 (6): 674. doi:10.1109/TCAD.2002.1004311.
  4. ^ Davis, M.; Putnam, H. (1960). "A Computing Procedure for Quantification Theory". Journal of the ACM. 7 (3): 201. doi:10.1145/321033.321034.
  5. ^ Davis, M.; Logemann, G.; Loveland, D. (1962). "A machine program for theorem-proving" (PDF). Communications of the ACM. 5 (7): 394–397. doi:10.1145/368273.368557.
  6. ^ a b c Zhang, Lintao; Malik, Sharad (2002), "The Quest for Efficient Boolean Satisfiability Solvers", Computer Aided Verification, Springer Berlin Heidelberg, pp. 17–36, ISBN 978-3-540-43997-4, retrieved 2020-02-27
  7. ^ Schöning, Uwe (1999). "A Probabilistic Algorithm for k-SAT and Constraint Satisfaction Problems". Proceedings of the 40th Annual Symposium on Foundations of Computer Science (FOCS). pp. 410–414.
  8. ^ "An improved exponential-time algorithm for k-SAT", Paturi, Pudlak, Saks, Zani
  9. ^ "Faster k-SAT algorithms using biased-PPSZ", Hansen, Kaplan, Zamir, Zwick
  10. ^ Vizel, Y.; Weissenbacher, G.; Malik, S. (2015). "Boolean Satisfiability Solvers and Their Applications in Model Checking". Proceedings of the IEEE. 103 (11). doi:10.1109/JPROC.2015.2455034.
  11. ^ Moskewicz, M. W.; Madigan, C. F.; Zhao, Y.; Zhang, L.; Malik, S. (2001). "Chaff: Engineering an Efficient SAT Solver" (PDF). Proceedings of the 38th conference on Design automation (DAC). p. 530. doi:10.1145/378239.379017. ISBN 1581132972.
  12. ^ Marques-Silva, J. P.; Sakallah, K. A. (1999). "GRASP: a search algorithm for propositional satisfiability" (PDF). IEEE Transactions on Computers. 48 (5): 506. doi:10.1109/12.769433.
  13. ^ "ManySAT". www.cril.univ-artois.fr. http://www.cril.univ-artois.fr/~jabbour/manysat.htm
  14. ^ "The international SAT Competitions web page". Retrieved 2007-11-15.
  15. ^ "SAT Competition 2016". baldur.iti.kit.edu. Retrieved 2020-02-13.
  16. ^ "SAT Competition 2017". baldur.iti.kit.edu. Retrieved 2020-02-13.
  17. ^ "SAT Competition 2018". sat2018.forsyte.tuwien.ac.at. Retrieved 2020-02-13.
  18. ^ a b Balyo, Tomáš; Sinz, Carsten (2018), "Parallel Satisfiability", Handbook of Parallel Constraint Reasoning, Springer International Publishing, pp. 3–29, ISBN 978-3-319-63515-6, retrieved 2020-02-07
  19. ^ Biere, Armin. "Lingeling, Plingeling, PicoSAT and PrecoSAT at SAT Race 2010" (PDF). SAT-RACE 2010.
  20. ^ "ppfolio solver". www.cril.univ-artois.fr. Retrieved 2019-12-29.
  21. ^ "SAT 2011 Competition: 32 cores track: ranking of solvers". www.cril.univ-artois.fr. Retrieved 2020-02-13.
  22. ^ Balyo, Tomáš; Sanders, Peter; Sinz, Carsten (2015), "HordeSat: A Massively Parallel Portfolio SAT Solver", Lecture Notes in Computer Science, Springer International Publishing, pp. 156–172, ISBN 978-3-319-24317-7, retrieved 2019-12-29
  23. ^ "SAT Competition 2018". sat2018.forsyte.tuwien.ac.at. Retrieved 2020-02-13.
  24. ^ a b Heule, Marijn J. H.; Kullmann, Oliver; Wieringa, Siert; Biere, Armin (2012), "Cube and Conquer: Guiding CDCL SAT Solvers by Lookaheads", Hardware and Software: Verification and Testing, Springer Berlin Heidelberg, pp. 50–65, ISBN 978-3-642-34187-8, retrieved 2019-12-30
  25. ^ Heule, Marijn J. H.; van Maaren, Hans (2009). "Look-Ahead Based SAT Solvers" (PDF). Handbook of Satisfiability. IOS Press. pp. 155–184. ISBN 978-1-58603-929-5.
  26. ^ Heule, Marijn J. H.; Kullmann, Oliver; Marek, Victor W. (2016), "Solving and Verifying the Boolean Pythagorean Triples Problem via Cube-and-Conquer", Theory and Applications of Satisfiability Testing – SAT 2016, Springer International Publishing, pp. 228–245, ISBN 978-3-319-40969-6, retrieved 2019-12-31
  27. ^ Roli, Andrea (2002), "Criticality and Parallelism in Structured SAT Instances", Lecture Notes in Computer Science, Springer Berlin Heidelberg, pp. 714–719, ISBN 978-3-540-44120-5, retrieved 2020-02-07
  28. ^ Arbelaez, Alejandro; Hamadi, Youssef (2011), "Improving Parallel Local Search for SAT", Lecture Notes in Computer Science, Springer Berlin Heidelberg, pp. 46–60, ISBN 978-3-642-25565-6, retrieved 2020-02-13