Results 1–10 of 15
Heavy-Tailed Phenomena in Satisfiability and Constraint Satisfaction Problems
J. of Autom. Reasoning, 2000
Cited by 148 (27 self)
Abstract. We study the runtime distributions of backtrack procedures for propositional satisfiability and constraint satisfaction. Such procedures often exhibit a large variability in performance. Our study reveals some intriguing properties of such distributions: They are often characterized by very long tails or “heavy tails”. We will show that these distributions are best characterized by a general class of distributions that can have infinite moments (i.e., an infinite mean, variance, etc.). Such nonstandard distributions have recently been observed in areas as diverse as economics, statistical physics, and geophysics. They are closely related to fractal phenomena, whose study was introduced by Mandelbrot. We also show how random restarts can effectively eliminate heavy-tailed behavior. Furthermore, for harder problem instances, we observe long tails on the left-hand side of the distribution, which is indicative of a non-negligible fraction of relatively short, successful runs. A rapid restart strategy eliminates heavy-tailed behavior and takes advantage of short runs, significantly reducing expected solution time. We demonstrate speedups of up to two orders of magnitude on SAT and CSP encodings of hard problems in planning, scheduling, and circuit synthesis. Key words: satisfiability, constraint satisfaction, heavy tails, backtracking
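The restart effect this abstract describes can be illustrated with a toy simulation (an illustrative sketch under assumed parameters, not the paper's experiment): when run lengths follow a heavy-tailed Pareto distribution, abandoning any run that exceeds a fixed cutoff and restarting lowers the mean solution time.

```python
import random

def run_length(rng, alpha):
    # Pareto(alpha) run length: alpha <= 2 gives infinite variance,
    # alpha <= 1 even an infinite mean -- the "heavy tail"
    return rng.paretovariate(alpha)

def solve_with_restarts(rng, cutoff, alpha):
    # abandon any run longer than `cutoff` and restart from scratch;
    # return the total work until some run finishes within the cutoff
    total = 0.0
    while True:
        r = run_length(rng, alpha)
        if r <= cutoff:
            return total + r
        total += cutoff

n = 20000
rng_a, rng_b = random.Random(1), random.Random(2)
no_restart = sum(run_length(rng_a, 1.5) for _ in range(n)) / n
with_restart = sum(solve_with_restarts(rng_b, 4.0, 1.5) for _ in range(n)) / n
print(no_restart > with_restart)  # True: restarts cut the expected time
```

For alpha = 1.5 the true mean run length is 3, while the restart policy's expected cost works out to roughly 2.3, so the gap is visible even in a modest sample.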
The Taming of the (X)OR
CL 2000, 2000
Cited by 56 (7 self)
Many key verification problems such as bounded model-checking, circuit verification and logical cryptanalysis are formalized with combined clausal and affine logic (i.e. clauses with xor as the connective) and cannot be efficiently (if at all) solved by using CNF-only provers. We present a decision procedure to efficiently decide such problems. The Gauss-DPLL procedure is a tight integration, in a unifying framework, of a Gauss-Elimination procedure (for affine logic) and a Davis-Putnam-Logemann-Loveland procedure (for usual clause logic). The key idea, which distinguishes our approach from others, is the full interaction between the two parts, which makes it possible to maximize (deterministic) simplification rules by passing around newly created unit or binary clauses in either of these parts. We show the correctness and the termination of Gauss-DPLL under very liberal assumptions.
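The affine-logic half of such a procedure can be sketched as incremental Gaussian elimination over GF(2) (a minimal illustration, not the paper's Gauss-DPLL implementation; the bitmask clause encoding is an assumption of this sketch):

```python
def gauss_gf2(xor_clauses):
    # Incremental Gaussian elimination over GF(2). A clause is
    # (bitmask, parity): the bitmask marks the variables xor-ed together,
    # the parity their required sum. Returns "UNSAT" on contradiction,
    # otherwise the reduced rows.
    rows = []  # invariant: each row's pivot bit appears in no other row
    for mask, p in xor_clauses:
        for rmask, rp in rows:
            if mask & (rmask & -rmask):   # new row contains this pivot
                mask ^= rmask
                p ^= rp
        if mask == 0:
            if p:
                return "UNSAT"            # derived 0 = 1
            continue                      # redundant clause
        pivot = mask & -mask
        # keep rows fully reduced: clear the new pivot from existing rows
        rows = [(rm ^ mask, rp ^ p) if rm & pivot else (rm, rp)
                for rm, rp in rows]
        rows.append((mask, p))
    return rows

# x1^x2 = 1, x2^x3 = 1, x1^x3 = 1: summing all three gives 0 = 1
print(gauss_gf2([(0b011, 1), (0b110, 1), (0b101, 1)]))  # UNSAT
```

In a Gauss-DPLL-style integration, rows that reduce to a single variable would be handed back to the clausal part as unit clauses; this sketch only performs the consistency check.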
Logical cryptanalysis as a SAT problem: Encoding and analysis
Journal of Automated Reasoning, 2000
Cited by 22 (2 self)
Abstract. Cryptographic algorithms play a key role in computer security and the formal analysis of their robustness is of utmost importance. Yet, logic and automated reasoning tools are seldom used in the analysis of a cipher, and thus one cannot often get the desired formal assurance that the cipher is free from unwanted properties that may weaken its strength. In this paper, we claim that one can feasibly encode the low-level properties of state-of-the-art cryptographic algorithms as SAT problems and then use efficient automated theorem-proving systems and SAT solvers for reasoning about them. We call this approach logical cryptanalysis. In this framework, for instance, finding a model for a formula encoding an algorithm is equivalent to finding a key with a cryptanalytic attack. Other important properties, such as cipher integrity or algebraic closure, can also be captured as SAT problems or as quantified boolean formulae. SAT benchmarks based on the encoding of cryptographic algorithms can be used to effectively combine features of “real-world” problems and randomly generated problems. Here we present a case study on the U.S. Data Encryption Standard (DES) and show how to obtain a manageable encoding of its properties. We have also tested three SAT provers, TABLEAU by Crawford and Auton, SATO by Zhang, and relSAT by Bayardo and Schrag, on the encoding of DES, and we discuss the reasons behind their different performance. A discussion of open problems and future research concludes the paper. Key words: cipher verification, Data Encryption Standard, logical cryptanalysis, propositional satisfiability, quantified boolean formulae, SAT benchmarks.
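The kind of encoding the abstract describes can be illustrated on the smallest cryptographic building block, a single xor gate (a hypothetical toy encoding for illustration, not the paper's DES encoding):

```python
from itertools import product

def xor_cnf(a, b, y):
    # clauses (DIMACS-style signed literals) asserting y <-> (a xor b)
    return [[-a, -b, -y], [a, b, -y], [a, -b, y], [-a, b, y]]

def satisfies(clauses, assignment):
    # a CNF holds if every clause has at least one true literal
    return all(any(assignment[abs(l)] == (l > 0) for l in clause)
               for clause in clauses)

clauses = xor_cnf(1, 2, 3)
ok = all(satisfies(clauses, {1: a, 2: b, 3: y}) == (y == (a != b))
         for a, b, y in product([False, True], repeat=3))
print(ok)  # True: the CNF is satisfied exactly when y = a xor b
```

Chaining such gate encodings over the rounds of a cipher yields the kind of SAT instance in which a satisfying model corresponds to a key recovered by the attack.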
An Evolutionary Approach with Diversity Guarantee and Well-Informed Grouping Recombination for Graph Coloring
2010
Cited by 14 (8 self)
We present a diversity-oriented hybrid evolutionary approach for the graph coloring problem. This approach is based on both generally applicable strategies and specifically tailored techniques. Particular attention is paid to ensuring population diversity by carefully controlling spacing among individuals. Using a distance measure between potential solutions, the general population management strategy decides whether an offspring should be accepted in the population, which individual needs to be replaced and when mutation is applied. Furthermore, we introduce a special grouping-based multi-parent crossover operator which relies on several relevant features to identify meaningful building blocks for offspring construction. The proposed approach can be generally characterized as “well-informed”, in the sense that the design of each component is based on the most pertinent information which is identified by both experimental observation and careful analysis of the given problem. The resulting algorithm proves to be highly competitive when it is applied on the whole set of the DIMACS benchmark graphs.
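A grouping recombination of this general flavor can be sketched in the style of greedy partition crossover (an assumed simplification; the class-selection heuristic, parent order, and the omitted repair step are all placeholders, not the paper's operator):

```python
def grouping_crossover(parents, k):
    # GPX-style sketch: build k child color classes by taking, round-robin
    # over the parents, the largest remaining class; vertices left
    # unassigned at the end go to a repair / local-search step (omitted)
    parents = [[set(c) for c in p] for p in parents]  # work on copies
    child, assigned = [], set()
    for i in range(k):
        p = parents[i % len(parents)]
        cls = set(max(p, key=len))       # largest remaining class
        child.append(cls)
        assigned |= cls
        for q in parents:                # remove chosen vertices everywhere
            for c in q:
                c -= cls
    return child, assigned

p1 = [{0, 1}, {2, 3}, {4}]   # two parent 3-colorings (classes of vertices)
p2 = [{0, 2}, {1, 4}, {3}]
child, assigned = grouping_crossover([p1, p2], k=3)
print(child, assigned)
```

The child inherits whole color classes, the "building blocks" that matter for grouping problems, rather than recombining vertex-by-vertex.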
Reactive search: machine learning for memory-based heuristics
In Teofilo F. Gonzalez (Ed.), Approximation Algorithms and Metaheuristics, Taylor & Francis Books (CRC Press), 2005
Cited by 13 (5 self)
1 Introduction: the role of the user in heuristics Most state-of-the-art heuristics are characterized by a certain number of choices and free parameters, whose appropriate setting is a subject that raises issues of research methodology [5, 41, 51]. In some cases, these parameters are tuned through a feedback loop that includes the user as a crucial learning component: depending on preliminary algorithm tests some parameter values are changed by the ...
Algorithms and Experiments: The New (and Old) Methodology
J. Univ. Comput. Sci., 2001
Cited by 9 (4 self)
The last twenty years have seen enormous progress in the design of algorithms, but little of it has been put into practice. Because many recently developed algorithms are hard to characterize theoretically and have large running-time coefficients, the gap between theory and practice has widened over these years. Experimentation is indispensable in the assessment of heuristics for hard problems, in the characterization of asymptotic behavior of complex algorithms, and in the comparison of competing designs for tractable problems. Implementation, although perhaps not rigorous experimentation, was characteristic of early work in algorithms and data structures. Donald Knuth has throughout insisted on testing every algorithm and conducting analyses that can predict behavior on actual data; more recently, Jon Bentley has vividly illustrated the difficulty of implementation and the value of testing. Numerical analysts have long understood the need for standardized test suites to ensure robustness, precision and efficiency of numerical libraries. It is only recently, however, that the algorithms community has shown signs of returning to implementation and testing as an integral part of algorithm development. The emerging disciplines of experimental algorithmics and algorithm engineering have revived and are extending many of the approaches used by computing pioneers such as Floyd and Knuth and are placing on a formal basis many of Bentley's observations. We reflect on these issues, looking back at the last thirty years of algorithm development and forward to new challenges: designing cache-aware algorithms, algorithms for mixed models of computation, algorithms for external memory, and algorithms for scientific research.
Spacing memetic algorithms
Proc. of the 13th Annual Genet. and Evol. Comput. Conf. (GECCO), 2011
Cited by 3 (2 self)
We introduce the Spacing Memetic Algorithm (SMA), a formal evolutionary model devoted to a systematic control of spacing (distances) among individuals. SMA uses search space distance information to decide what individuals are acceptable in the population, what individuals need to be replaced and when to apply mutations. By ensuring a “healthy” spacing (and thus diversity), SMA substantially reduces the risk of premature convergence and helps the search process to continuously discover new high-quality search areas. Generally speaking, the number of distance calculations represents a limited computational overhead compared to the number of local search iterations. Most existing memetic algorithms can be “upgraded” to a spacing memetic algorithm, provided that a suitable distance measure can be specified. The impact of the main SMA components is assessed within several case studies on different problems.
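A distance-based acceptance rule of the kind the abstract outlines might look like the following (a hypothetical simplification for illustration; the actual SMA criteria are richer than this single threshold):

```python
def accept_offspring(population, offspring, distance, min_spacing, fitness):
    # spacing rule (simplified): an offspring that crowds an existing
    # individual may only replace that individual, and only if fitter;
    # an offspring that keeps its distance replaces the current worst
    for i, ind in enumerate(population):
        if distance(ind, offspring) < min_spacing:
            if fitness(offspring) > fitness(ind):
                population[i] = offspring
                return True
            return False  # reject: would crowd an already-fitter individual
    worst = min(range(len(population)), key=lambda j: fitness(population[j]))
    population[worst] = offspring
    return True

# toy run: bit-string individuals, Hamming distance, fitness = number of ones
hamming = lambda a, b: sum(x != y for x, y in zip(a, b))
ones = lambda s: s.count("1")
pop = ["0000", "1100"]
accept_offspring(pop, "1110", hamming, min_spacing=2, fitness=ones)
print(pop)  # ['0000', '1110']: the offspring displaced its close neighbor
```

The only per-offspring overhead is one distance evaluation per population member, which is what keeps the spacing control cheap relative to the local search.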
A Branch-and-Price Approach for the Maximum Weight Independent Set Problem
2005
Cited by 1 (0 self)
The maximum weight independent set problem (MWISP) is one of the most well-known and well-studied problems in combinatorial optimization. This paper presents a novel approach to solve MWISP exactly by decomposing the original graph into vertex-induced subgraphs. The approach solves MWISP for the original graph by solving MWISP on the subgraphs in order to generate columns for a branch-and-price framework. The authors investigate different implementation techniques that can be associated with the approach and offer computational results to identify the strengths and weaknesses of each implementation technique.
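The subproblem solved on each vertex-induced subgraph, exact MWIS, can be sketched with the classic include/exclude branching (an illustrative brute-force sketch, not the authors' pricing routine):

```python
def mwis(adj, weights):
    # exact maximum-weight independent set by include/exclude branching:
    # either drop a vertex, or take it and drop all of its neighbors
    def best(verts):
        if not verts:
            return 0, frozenset()
        v = next(iter(verts))
        w_out, s_out = best(verts - {v})            # branch 1: exclude v
        w_in, s_in = best(verts - {v} - adj[v])     # branch 2: include v
        w_in += weights[v]
        return (w_in, s_in | {v}) if w_in > w_out else (w_out, s_out)
    return best(frozenset(adj))

# toy instance: a 4-cycle 0-1-2-3; the optimum takes the opposite pair {1, 3}
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
w = {0: 1, 1: 4, 2: 1, 3: 4}
weight, chosen = mwis(adj, w)
print(weight, sorted(chosen))  # 8 [1, 3]
```

Exponential in the worst case, such a routine is still usable when the decomposition keeps each subgraph small; each solution it returns becomes a candidate column for the master problem.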