SAT Competitions
The International SAT Competition Web Page
Current Competition
SAT 2026 Competition
Organizers
Katalin Fazekas
Marijn Heule
Markus Iser
Past Competitions, Races and Evaluations
SAT 2025 Competition
Organizers
Katalin Fazekas
Marijn Heule
Markus Iser
SAT 2024 Competition
Organizers
Marijn Heule
Markus Iser
Matti Järvisalo
Martin Suda
SAT 2023 Competition
Organizers
Marijn Heule
Matti Järvisalo
Martin Suda
Markus Iser
Tomáš Balyo
SAT 2022 Competition
Organizers
Marijn Heule
Matti Järvisalo
Martin Suda
Markus Iser
Tomáš Balyo
SAT 2021 Competition
Organizers
Marijn Heule
Matti Järvisalo
Martin Suda
Markus Iser
Tomáš Balyo
Nils Froleyks
SAT 2020 Competition
Organizers
Marijn Heule
Matti Järvisalo
Martin Suda
Markus Iser
Tomáš Balyo
Nils Froleyks
SAT 2019 Race
Organizers
Marijn Heule
Matti Järvisalo
Martin Suda
SAT 2018 Competition
Organizers
Marijn Heule
Matti Järvisalo
Martin Suda
Slides
Slides used at SAT 2018
Proceedings
Descriptions of the solvers and benchmarks
Benchmarks
Available here
Solvers
Available here
Gold
Silver
Bronze
Main Track
SAT+UNSAT
Maple_LCM_Dist_ChronoBT
Maple_LCM_Scavel
Maple_CM
SAT
Maple_LCM_Dist_ChronoBT
Maple_LCM_Scavel
CryptoMiniSat 5.5
UNSAT
CaDiCaL
Maple_LCM_M1
Maple_CM
Parallel Track
SAT+UNSAT
Painless
Plingeling
abcdSAT
SAT
Plingeling
Painless
CryptoMiniSat 5.5
UNSAT
Painless
Plingeling
abcdSAT
No-Limits Track
ReasonLS
Maple_CM
CryptoMiniSat 5.5 V20
Glucose Hack Track
GHackCOMSPS
inIDGlucose
glu_mix
Random Track
Sparrow2Riss
gluHack
glucose-3.0_PADC_10
SAT 2017 Competition
Organizers
Marijn Heule
Matti Järvisalo
Tomáš Balyo
Slides
Slides used at SAT 2017
Proceedings
Descriptions of the solvers and benchmarks
Benchmarks
Available here
Solvers
Available here
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
Agile Track
Main Track
Random Track
SAT+UNSAT
CaDiCaL Agile, CaDiCaL NoProof
Glu_VC
Glucose 4.1
Maple LCM Dist,
Maple LCM,
MapleLRB LCMOccRestart,
MapleLRB LCM
MapleCOMSPS LRB VSIDS 2,
MapleCOMSPS LRB VSIDS
COMiniSATPS Pulsar
YalSAT
tch glucose3
Score2SAT
Parallel Track
No-Limit Track
Incremental Library Track
SAT+UNSAT
Syrup24, Syrup48
Plingeling
Painless MapleCOMSPS
COMiniSATPS Pulsar
MapleCOMSPS LRB VSIDS 2, MapleCOMSPS LRB VSIDS
CaDiCaL NoProof
AbcdSAT
Glucose
Riss
SAT 2016 Competition
Organizers
Marijn Heule
Matti Järvisalo
Tomáš Balyo
Proceedings
Descriptions of the solvers and benchmarks
Benchmarks
Available here
Solvers
Available here
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
Agile Track
Main Track
Random Track
SAT+UNSAT
Riss
TB_Glucose
CHBR_Glucose
MapleCOMSPS
Riss
Lingeling
Dimetheus
CSCCSat
DCCAlm
Parallel Track
No-Limit Track
Incremental Library Track
SAT+UNSAT
Treengeling
Plingeling
CryptoMiniSat
BreakIDCOMiniSatPS
Lingeling
abcdSAT
CryptoMiniSat
Glucose
Riss
Best Application Benchmark Solver in the Main Track
Best Crafted Benchmark Solver in the Main Track
Best Glucose Hack in the Main Track
SAT+UNSAT
MapleCOMSPS
TC Glucose
Kiel
SAT 2015 Race
Organizing committee
Tomas Balyo
Carsten Sinz
Markus Iser
Armin Biere
SAT 2014 Competition
Organizing committee
Anton Belov
Daniel Diepold
Marijn Heule
Matti Järvisalo
Judges
Pete Manolios
Lakhdar Sais
and
Peter Stuckey
Proceedings
Descriptions of the solvers and benchmarks
Benchmarks
Application
Hard combinatorial
Random
Solvers
Source code available in
EDACC
Application
Hard combinatorial
Random
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
Core solvers
SAT+UNSAT
Lingeling
SWDiA5BY
Riss BlackBox
glueSplit_clasp
Lingeling
SparrowToRiss
SAT
minisat_blbd
Riss BlackBox
SWDiA5BY
SparrowToRiss
CCAnr+glucose
SGSeq
Dimetheus
BalancedZ
CSCCSat2014
Certified UNSAT
Lingeling (druplig)
glucose
SWDiA5BY
Riss BlackBox
Lingeling (druplig)
glucose
Core solvers, Parallel
SAT+UNSAT
Plingeling
PeneLoPe
Treengeling
Treengeling
Plingeling
pmcSAT 2.0
SAT
pprobSAT
Plingeling
CSCCSat2014
Minisat hack
SAT+UNSAT
MiniSat_HACK_999ED
minisat_blbd
ROKKminisat
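The Certified UNSAT tracks above require solvers to emit an unsatisfiability proof that can be validated independently (the DRUP/DRAT formats checked by tools such as drat-trim). The core check behind such proofs is reverse unit propagation (RUP): a clause may be appended to the proof if assuming all of its literals false lets unit propagation derive a conflict in the formula so far. The sketch below illustrates only that core idea; it ignores clause deletion and the RAT rule, so it is far simpler (and far slower) than a real checker. Clauses are lists of DIMACS-style integer literals.

```python
def unit_propagate(clauses, assumptions):
    """Close `assumptions` under unit propagation.
    Returns the resulting set of literals, or None if a conflict arises."""
    assign = set(assumptions)
    changed = True
    while changed:
        changed = False
        for clause in clauses:
            if any(lit in assign for lit in clause):
                continue  # clause already satisfied
            pending = [lit for lit in clause if -lit not in assign]
            if not pending:
                return None  # every literal falsified: conflict
            if len(pending) == 1:
                assign.add(pending[0])  # unit clause forces its literal
                changed = True
    return assign


def is_rup(clauses, clause):
    """A clause has RUP w.r.t. `clauses` iff falsifying all of its
    literals and propagating yields a conflict."""
    return unit_propagate(clauses, {-lit for lit in clause}) is None


# The formula (x1 or x2)(~x1 or x2)(x1 or ~x2)(~x1 or ~x2) is unsatisfiable.
f = [[1, 2], [-1, 2], [1, -2], [-1, -2]]
print(is_rup(f, [2]))          # True: assuming ~x2 propagates to a conflict
print(is_rup(f + [[2]], []))   # True: the empty clause closes the refutation
```

A proof checker replays the proof clause by clause, verifying each addition with `is_rup` before adding it to the formula; reaching the empty clause certifies unsatisfiability.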
SAT 2013 Competition
Organizing committee
Adrian Balint
Anton Belov
Marijn Heule
Matti Järvisalo
Judges
Roberto Sebastiani
Karem A. Sakallah
and
Youssef Hamadi
Proceedings
Descriptions of the solvers and benchmarks
Benchmarks
Application
Hard combinatorial
Random
Solvers
Application
Hard combinatorial
Random
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
Core solvers
SAT+UNSAT
Lingeling aqw
Lingeling 587f
ZENN 0.1.0
BreakIDGlucose 1
gluebit_clasp 1.0
glucose 2.3
CSHCrandMC
MIPSat random sat_unsat
march_vflip 1.0
SAT
Lingeling aqw
ZENN 0.1.0
satUZK 48
glucose 2.3
gluebit_clasp 1.0
BreakIDGlucose 1
probSAT SC13
sattime2013 2013
Ncca+ V 1.0
Certified UNSAT
glucose 2.3 (certified unsat)
glueminisat-cert-unsat 2.2.7j
Riss3g cert
Riss3g cert
glucose 2.3 (certified unsat)
forl drup-nocachestamp
Core solvers, Parallel
SAT+UNSAT
Plingeling aqw
Treengeling aqw
PeneLoPe 2013
Treengeling aqw
Plingeling aqw
pmcSAT 1.0
Minisat hack
SAT+UNSAT
SINNminisat 1.0.0
minisat_bit 1.0
MiniGolf prefetch
Open track (multiple solver sources, mixed benchmarks)
CSHCpar8
MIPSat
GlucoRed+March r531
SAT 2012 Challenge
Organizers
Adrian Balint
Anton Belov
Matti Järvisalo
Carsten Sinz
SAT 2011 Competition
Organizing committee
Matti Järvisalo
Daniel Le Berre
and
Olivier Roussel
Judges
Uwe Egly
Alexander Nadel
Ashish Sabharwal
and
Moshe Vardi
Benchmarks
whole selection
(tar of bz2 files, 1.7 GiB)
Solvers
static binaries
dynamic libraries
source code
CPU Time
Application
Crafted
Random
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
SAT+UNSAT
glucose
glueminisat
lingeling
3S
ppfolio //
ppfolio seq
3S
ppfolio //
ppfolio seq
SAT
contrasat hack
cirminisat hack
mphasesat64
ppfolio //
ppfolio seq
3S
sparrow2011
sattime2011
eagleup
UNSAT
glueminisat
glucose
qutersat
clasp
3S
glucose
march_rw
mphasesat_m
ppfolio //
WC Time
Application
Crafted
Random
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
SAT+UNSAT
plingeling
cryptominisat //
ppfolio //
ppfolio //
claspmt //
3S
ppfolio //
3S
ppfolio seq
SAT
ppfolio //
plingeling //
contrasat
ppfolio //
ppfolio seq
3S
sparrow2011
csls //
sattime2011
UNSAT
cryptominisat //
glueminisat
plingeling //
claspmt //
clasp
ppfolio //
march_rw
ppfolio //
mphasesat_m
Special prizes
Best Minisat Hack
CirMinisat hack
SAT 2010 Race
Organizer
Carsten Sinz
SAT 2009 Competition
Organizing committee
Daniel Le Berre
Olivier Roussel
and
Laurent Simon
Judges
Andreas Goerdt
Ines Lynce
and
Aaron Stump
Benchmarks
random
(7z, 46MiB),
crafted
(7z, 171MiB),
industrial
(7z, 385MiB)
Solvers
binaries (7z, 33MiB)
sources (7z, 25MiB)
booklet with the description of the solvers (and benchmarks)
Application
Crafted
Random
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
SAT+UNSAT
precosat
glucose
lysat
clasp
SATzilla2009_C
IUT_BMB_SAT
SATzilla2009_R
March hi
NA
SAT
SATzilla I
precosat
MXC
clasp
SApperloT
MXC
TNM
gNovelty2+
hybridGM3
adapt2wsat2009++
UNSAT
glucose
precosat
lysat
SATzilla2009_C
clasp
IUT_BMB_SAT
March hi
SATzilla2009_R
NA
Special prizes
Parallel solver application
ManySAT
Parallel solver random
gNovelty2+
Best Minisat Hack
Minisat 09z
SAT 2008 Race
Organizer
Carsten Sinz
SAT 2007 Competition
Organizing committee
Daniel Le Berre
Olivier Roussel
and
Laurent Simon
Judges
Ewald Speckenmeyer
Geoff Sutcliffe
and
Lintao Zhang
Benchmarks
random
(tar.bz2 44MB),
crafted
(.tar of bz2-compressed files, 175MB),
industrial
(.tar of bz2-compressed files, 556MB), plus
Velev's VLIW-SAT 4.0
and
VLIW-UNSAT 2.0
IBM benchmarks
Systems
All
Winners
precompiled for linux (tgz, 25/10 MB).
Source code
(competition division only, tgz, updated 11/7/07, 6MB).
Industrial
handmade
Random
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
SAT+UNSAT
Rsat
Picosat
Minisat
SATzilla CRAFTED
Minisat
MXC
SATzilla RANDOM
March KS
KCNFS 2004
SAT
Picosat
Rsat
Minisat
March KS
SATzilla CRAFTED
Minisat
gnovelty+
adaptg2wsat0
adaptg2wsat+
UNSAT
Rsat
Minisat
TiniSatELite
SATzilla CRAFTED
TTS
Minisat
March KS
KCNFS 2004
SATzilla RANDOM
SAT 2006 Race
Organizer
Carsten Sinz
SAT 2005 Competition
Organizing committee
Daniel Le Berre
and
Laurent Simon
Judges
Armin Biere
Oliver Kullmann
and
Allen Van Gelder
Reference
Daniel Le Berre and Laurent Simon, editors. Journal on Satisfiability, Boolean Modeling and Computation, Volume 2: Special Volume on the SAT 2005 Competitions and Evaluations, March 2006.
Benchmarks
Random (.tar.bz2, 25MB)
Crafted (.tar.bz2, 360MB)
Industrial (.tar.bz2, 205MB)
See also
IBM
and
Velev
web sites.
Industrial
handmade
Random
Gold
Silver
Bronze
Gold
Silver
Bronze
Gold
Silver
Bronze
SAT+UNSAT
SatELiteGTI
MiniSAT 1.13
zChaff_rand
and
HaifaSAT
Vallst
SatELiteGTI
March_dl
kcnf-2004
March_dl
Dew_Satz1a
SAT
SatELiteGTI
MiniSAT 1.13
Jerusat 1.31 B
and
HaifaSAT
Vallst
March_dl
Hsat1
ranov
g2wsat
VW
UNSAT
SatELiteGTI
Zchaff_rand
HaifaSat
SatELiteGTI
MiniSAT 1.13
Vallst
and
March-dl
kcnf-2004
March_dl
Dew_Satz1a
Special tracks
CERTIFIED UNSAT
zChaff
TTSP-3.0
NON CLAUSAL
No solver submitted
PSEUDO BOOLEAN
Go to
official web site
SAT 2004 Competition
Organizing committee
Daniel Le Berre
and
Laurent Simon
Judges
Fahiem Bacchus, Hans Kleine Büning and Joao Marques-Silva
Reference
55 Solvers in Vancouver: The SAT 2004 Competition. Daniel Le Berre and Laurent Simon. Proceedings of the Seventh International Conference on Theory and Applications of Satisfiability Testing (SAT 2004), Revised Selected Papers, Springer, Lecture Notes in Computer Science, no. 3542, pp. 321-344, 2005.
Benchmarks
Random (.tar.bz2, 11MB)
Crafted (.tar.bz2, 36MB)
Industrial (.tar.bz2, 2GB)
Industrial
handmade
Random
ALL (SAT+UNSAT)
Zchaff 2004
March-eq
AdaptNovelty
SAT
Jerusat
Satzoo
AdaptNovelty
UNSAT
Zchaff 2004
March-eq
Kcnfs
SAT 2003 Competition
Organizing committee
Daniel Le Berre
and
Laurent Simon
Judges
John Franco, Hans van Maaren and Toby Walsh
Reference
The Essentials of the SAT 2003 Competition. Daniel Le Berre and Laurent Simon. Proceedings of the Sixth International Conference on Theory and Applications of Satisfiability Testing (SAT 2003), Lecture Notes in Computer Science 2919, pp. 452-467, 2003.
Benchmarks
available from SATLIB (.tar.bz2, 345MB)
Industrial
handmade
Random
Complete on all
Forklift
Satzoo
Kcnfs
ALL on SAT
Forklift
Satzoo
Unitwalk
SAT 2002 Competition
Organizing committee
Edward A. Hirsch
Daniel Le Berre
and
Laurent Simon
Judges
N/A
Reference
The SAT2002 Competition Report. Laurent Simon, Daniel Le Berre and Edward A. Hirsch. Annals of Mathematics and Artificial Intelligence, Volume 43, Issue 1-4, pp. 307-342, January 2005.
See also
A Parsimony Tree for the SAT2002 Competition. Paul W. Purdom, Daniel Le Berre and Laurent Simon. Annals of Mathematics and Artificial Intelligence, Volume 43, Issue 1-4, pp. 343-365, January 2005.
Benchmarks
available from SATLIB (tgz, 147MB)
Industrial
handmade
Random
Complete on all
zChaff
zChaff
OKSolver
ALL on SAT
Limmat
Berkmin
OKSolver
Purpose
The purpose of the competition is to identify new challenging benchmarks and to promote new solvers for the propositional satisfiability problem (SAT), as well as to compare them with state-of-the-art solvers. We strongly encourage people considering SAT-based techniques in their area (planning, hardware or software verification, etc.) to submit benchmarks to be used in the competition. The results of the competition will be a good indicator of the current feasibility of such approaches. The competition will be completely automated using the SAT-Ex system.
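Benchmarks in all of these competitions are exchanged in the DIMACS CNF format, and a satisfying assignment reported by a solver can be validated without trusting the solver itself. The sketch below is a minimal illustration, not the competition's official checker: it assumes well-formed input and represents a model as a set of non-contradictory integer literals.

```python
def parse_dimacs(text):
    """Parse a DIMACS CNF string into a list of clauses (lists of ints)."""
    clauses, clause = [], []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("c", "p")):
            continue  # skip comments and the problem line
        for tok in line.split():
            lit = int(tok)
            if lit == 0:       # 0 terminates a clause
                clauses.append(clause)
                clause = []
            else:
                clause.append(lit)
    return clauses


def check_model(clauses, model):
    """True iff every clause contains a literal made true by `model`."""
    return all(any(lit in model for lit in clause) for clause in clauses)


cnf = """c (x1 or not x2) and (x2 or x3)
p cnf 3 2
1 -2 0
2 3 0
"""
clauses = parse_dimacs(cnf)
print(check_model(clauses, {1, 2, 3}))     # True: both clauses satisfied
print(check_model(clauses, {-1, 2, -3}))   # False: first clause falsified
```

Competition judges run essentially this check on every "SAT" answer, which is why a wrong model disqualifies a solver regardless of its speed.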
Credits
Credits go to Hans van Maaren and John Franco, who contributed significantly to making the first SAT competition a success.
SAT-Ex
SATLIB
SAT Live!
Imprint