Results 1-10 of 213
Automatic Translation of FORTRAN Programs to Vector Form
 ACM Transactions on Programming Languages and Systems
, 1987
Abstract
Cited by 324 (34 self)
This paper discusses the theoretical concepts underlying a project at Rice University to develop an automatic translator, called PFC (for Parallel FORTRAN Converter), from FORTRAN to FORTRAN 8x. The Rice project, based initially upon the research of Kuck and others at the University of Illinois [6, 17-21, 24, 32, 36], is a continuation of work begun while on leave at IBM Research in Yorktown Heights, N.Y. Our first implementation was based on the Illinois PARAFRASE compiler [20, 36], but the current version is a completely new program (although it performs many of the same transformations as PARAFRASE). Other projects that have influenced our work are the Texas Instruments ASC compiler [9, 33], the Cray-1 FORTRAN compiler [15], and the Massachusetts Computer Associates Vectorizer [22, 25]. The paper is organized into seven sections. Section 2 introduces FORTRAN 8x and gives examples of its use. Section 3 presents an overview of the translation process along with an extended translation example. Section 4 develops the concept of interstatement dependence and shows how it can be applied to the problem of vectorization. Loop-carried dependence and loop-independent dependence are introduced in this section to extend dependence to multiple statements and multiple loops. Section 5 develops dependence-based algorithms for code generation and transformations for enhancing the parallelism of a statement. Section 6 describes a method for extending the power of data dependence to control statements by the process of IF conversion. Finally, Section 7 details the current state of PFC and our plans for its continued development.
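The abstract's distinction between loop-carried and loop-independent dependence can be sketched with a toy classifier for one-dimensional array accesses of the form A[i + c] inside a single loop. This is an illustration only, not PFC's actual dependence tester, which handles far more general subscripts:

```python
# Toy dependence classifier, assuming statement S1 writes A[i + write_offset]
# and a later statement S2 reads A[i + read_offset] in the same loop.
# Real vectorizers use much stronger tests (GCD, Banerjee, etc.).

def classify_dependence(write_offset, read_offset):
    """Return the kind of dependence from S1's write to S2's read."""
    distance = write_offset - read_offset
    if distance == 0:
        return "loop-independent"      # value flows within one iteration
    elif distance > 0:
        return "loop-carried"          # value crosses iterations: a recurrence
    else:
        return "loop-carried (anti)"   # read of an element written later

# A(I) = A(I-1) + B(I): write A[i], read A[i-1] -> carried (distance 1),
# so the loop is a recurrence and cannot be naively vectorized.
print(classify_dependence(0, -1))
# A(I) = B(I) followed by C(I) = A(I): write A[i], read A[i] -> independent,
# so both statements vectorize if their order is preserved.
print(classify_dependence(0, 0))
```

A loop-carried dependence forces sequential execution (or a recurrence solver), while a loop-independent one only constrains statement order within the vectorized loop body.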
A Knowledge Compilation Map
 Journal of Artificial Intelligence Research
, 2002
Abstract
Cited by 225 (31 self)
We propose a perspective on knowledge compilation which calls for analyzing different compilation approaches according to two key dimensions: the succinctness of the target compilation language, and the class of queries and transformations that the language supports in polytime.
Non-Standard Reasoning Services for the Debugging of Description Logic Terminologies
, 2003
Abstract
Cited by 164 (8 self)
Current Description Logic reasoning systems provide only limited support for debugging logically erroneous knowledge bases. In this paper we propose new non-standard reasoning services which we designed and implemented to pinpoint logical contradictions when developing the medical terminology DICE. We provide complete algorithms for unfoldable ALC TBoxes based on minimisation of axioms using Boolean methods for minimal unsatisfiability-preserving sub-TBoxes, and an incomplete bottom-up method for generalised incoherence-preserving terminologies.
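The core idea of a minimal unsatisfiability-preserving sub-TBox, i.e. pinpointing the smallest set of axioms that is already contradictory, can be sketched on a propositional stand-in. This is not the paper's DL algorithm; a brute-force satisfiability check plays the role of the reasoner, and clauses play the role of axioms:

```python
from itertools import product

# Deletion-based minimisation sketch: shrink an unsatisfiable clause set
# to a minimal unsatisfiable subset. Clauses are frozensets of signed
# integers (1 means x1, -1 means NOT x1), a common SAT encoding.

def satisfiable(clauses, n_vars):
    """Brute-force SAT check over all 2^n assignments."""
    for bits in product([False, True], repeat=n_vars):
        if all(any((lit > 0) == bits[abs(lit) - 1] for lit in cl)
               for cl in clauses):
            return True
    return False

def minimal_unsat_subset(clauses, n_vars):
    """Drop each clause in turn; keep the drop if inconsistency survives."""
    assert not satisfiable(clauses, n_vars)
    core = list(clauses)
    for cl in list(core):
        trial = [c for c in core if c is not cl]
        if not satisfiable(trial, n_vars):
            core = trial        # this clause was not needed for the conflict
    return core

# Axioms (x1), (NOT x1), (x2): only the first two cause the contradiction.
clauses = [frozenset({1}), frozenset({-1}), frozenset({2})]
core = minimal_unsat_subset(clauses, n_vars=2)
print(sorted(core, key=sorted))   # the contradictory pair
```

The deletion loop is the minimisation pattern; the paper's contribution lies in doing the analogous pruning over DL axioms with far cheaper checks than re-running a full reasoner naively.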
Logic-based Benders decomposition
, 2000
Abstract
Cited by 71 (12 self)
Benders decomposition uses a strategy of “learning from one’s mistakes.” The aim of this paper is to extend this strategy to a much larger class of problems. The key is to generalize the linear programming dual used in the classical method to an “inference dual.” Solution of the inference dual takes the form of a logical deduction that yields Benders cuts. The dual is therefore very different from other generalized duals that have been proposed. The approach is illustrated by working out the details for propositional satisfiability and 0-1 programming problems. Computational tests are carried out for the latter, but the most promising contribution of logic-based Benders may be to provide a framework for combining optimization and constraint programming methods.
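The master/subproblem loop described here can be sketched on a toy 0-1 problem. In this hypothetical example the "inference dual" degenerates to the weakest possible cut, a no-good that merely excludes the failed assignment; the paper's point is that logical deduction can produce much stronger cuts:

```python
from itertools import product

# Minimal logic-based Benders sketch. The master enumerates 0-1 vectors x
# minimizing a linear cost; the subproblem hides a side constraint and,
# on failure, returns a cut. Toy problem and cut form are illustrative.

def subproblem_feasible(x):
    # Hidden constraint the master cannot see: x1 + x2 >= 1.
    return x[0] + x[1] >= 1

def solve_master(cost, cuts, n=2):
    """Cheapest assignment not excluded by any accumulated Benders cut."""
    best = None
    for x in product([0, 1], repeat=n):
        if x in cuts:
            continue                   # excluded by a cut
        c = sum(ci * xi for ci, xi in zip(cost, x))
        if best is None or c < best[1]:
            best = (x, c)
    return best

def benders(cost):
    cuts = []
    while True:
        x, c = solve_master(cost, cuts)
        if subproblem_feasible(x):
            return x, c                # master optimum passes the check
        cuts.append(x)                 # no-good cut: never propose x again

print(benders(cost=[3, 5]))
```

The loop terminates because each cut removes at least one assignment; stronger inference duals shrink the search far faster than these one-point no-goods.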
MINI: A heuristic approach for logic minimization
 IBM Journal of Research and Development
, 1974
Abstract
Cited by 61 (0 self)
MINI is a heuristic logic minimization technique for many-variable problems. It accepts as input a Boolean logic specification expressed as an input-output table, thus avoiding a long list of minterms. It seeks a minimal implicant solution, without generating all prime implicants, which can be converted to prime implicants if desired. New and effective subprocesses, such as expanding, reshaping, and removing redundancy from cubes, are iterated until there is no further reduction in the solution. The process is general in that it can minimize both conventional logic and logic functions of multi-valued variables.
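The "expanding" subprocess mentioned above can be sketched for two-valued logic: try to drop each literal from a cube (product term) and keep the enlargement only if the grown cube still avoids every OFF-set minterm. This is a hypothetical simplification; MINI also reshapes cubes and removes redundancy, and works on multi-valued variables:

```python
# Cube expansion sketch. A cube is a dict {variable_index: required_bit};
# variables absent from the dict are don't-cares. The OFF-set is given
# explicitly as minterms where the function is 0.

def covers(cube, minterm):
    """True if the cube contains the given minterm (full bit tuple)."""
    return all(minterm[v] == b for v, b in cube.items())

def expand(cube, off_set):
    """Greedily drop literals while the cube stays off the OFF-set."""
    cube = dict(cube)
    for var in list(cube):
        trial = {v: b for v, b in cube.items() if v != var}
        if not any(covers(trial, m) for m in off_set):
            cube = trial               # literal was redundant: drop it
    return cube

# f(a, b, c) is 0 only when a = b = 0, so OFF-set = {000, 001}.
off = [(0, 0, 0), (0, 0, 1)]
# Expanding the single minterm a=1, b=1, c=1 grows it to the cube "b"
# (greedy order: "a" is dropped first, then "c"; "b" alone avoids the OFF-set).
print(expand({0: 1, 1: 1, 2: 1}, off))
```

The expanded cube covers four minterms instead of one, which is exactly how MINI reduces the implicant count without ever enumerating all prime implicants.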
Bidimensionality and Kernels
, 2010
Abstract
Cited by 61 (24 self)
Bidimensionality theory appears to be a powerful framework in the development of meta-algorithmic techniques. It was introduced by Demaine et al. [J. ACM 2005] as a tool to obtain subexponential time parameterized algorithms for bidimensional problems on H-minor free graphs. Demaine and Hajiaghayi [SODA 2005] extended the theory to obtain polynomial time approximation schemes (PTASs) for bidimensional problems. In this paper, we establish a third meta-algorithmic direction for bidimensionality theory by relating it to the existence of linear kernels for parameterized problems. In parameterized complexity, each problem instance comes with a parameter k and the parameterized problem is said to admit a linear kernel if there is a polynomial time algorithm, called ...
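What a "kernel" is can be made concrete with the classical Buss rule for Vertex Cover, which is unrelated to the paper's bidimensionality machinery: any vertex of degree greater than k must be in every size-k cover, and after exhausting that rule a yes-instance has at most k² edges. This yields only a quadratic kernel; obtaining *linear* kernels is exactly the harder question the paper studies:

```python
# Buss kernelization sketch for Vertex Cover (illustration of the notion
# of a kernel, not of the paper's technique). Edges are vertex pairs.

def buss_kernel(edges, k):
    """Return (reduced_edges, remaining_budget, forced_vertices) or None
    when the instance is provably a no-instance."""
    edges = set(frozenset(e) for e in edges)
    forced = set()
    changed = True
    while changed and k >= 0:
        changed = False
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        for v, d in degree.items():
            if d > k:                  # v must be in any size-k cover
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if len(edges) > k * k:
        return None                    # too many edges left: no-instance
    return edges, k, forced

# Star on 5 leaves with budget k = 1: the center is forced, kernel is empty.
star = [(0, i) for i in range(1, 6)]
print(buss_kernel(star, k=1))
```

The reduced instance depends only on k, not on the original graph size; a linear kernel would bound its size by O(k) rather than O(k²).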
Debugging incoherent terminologies
 Journal of Automated Reasoning
Abstract
Cited by 52 (3 self)
In this paper we study the diagnosis and repair of incoherent terminologies. We define a number of new non-standard reasoning services to explain incoherence through pinpointing, and we present algorithms for all of these services. For one of the core tasks of debugging, the calculation of minimal unsatisfiability-preserving sub-terminologies, we developed two different algorithms, one implementing a bottom-up approach using support of an external DL reasoner, the other implementing a specialized tableau-based calculus. Both algorithms have been prototypically implemented. We study the effectiveness of our algorithms in two ways: we present a realistic case study where we diagnose a terminology used in a practical application, and we perform controlled benchmark experiments to get a better understanding of the computational properties of our algorithms in particular, and of the debugging problem in general.
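The bottom-up strategy mentioned here, growing a set of axioms until an external reasoner reports inconsistency, can be sketched on a propositional stand-in, with a brute-force satisfiability check playing the role of the external DL reasoner. (The grown set is not yet minimal; a subsequent pruning pass, as in deletion-based minimisation, removes irrelevant axioms.)

```python
from itertools import product

# Bottom-up sketch: add axioms (clauses of signed ints) one at a time and
# stop at the first subset the "reasoner" finds unsatisfiable.

def satisfiable(clauses, n_vars):
    return any(all(any((lit > 0) == bits[abs(lit) - 1] for lit in cl)
                   for cl in clauses)
               for bits in product([False, True], repeat=n_vars))

def grow_to_unsat(axioms, n_vars):
    """Return the first prefix of `axioms` that is already unsatisfiable,
    or None if the whole set is consistent."""
    subset = []
    for ax in axioms:
        subset.append(ax)
        if not satisfiable(subset, n_vars):
            return subset
    return None

# (x1), (x2), (NOT x1): inconsistency appears once the third axiom is added.
axioms = [frozenset({1}), frozenset({2}), frozenset({-1})]
print(grow_to_unsat(axioms, n_vars=2))
```

The appeal of the bottom-up direction is that each consistency check runs on a small subset, which is much cheaper for a heavyweight reasoner than repeatedly checking the full terminology.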
The Minimum Equivalent DNF Problem and Shortest Implicants
, 1998
Abstract
Cited by 51 (4 self)
We prove that the Minimum Equivalent DNF problem is $\Sigma_2^p$-complete, resolving a conjecture due to Stockmeyer. The proof involves as an intermediate step a variant of a related problem in logic minimization, namely, that of finding the shortest implicant of a Boolean function. We also obtain certain results concerning the complexity of the Shortest Implicant problem that may be of independent interest. When the input is a formula, the Shortest Implicant problem is $\Sigma_2^p$-complete, and $\Sigma_2^p$-hard to approximate to within an $n^{1/2-\epsilon}$ factor. When the input is a circuit, approximation is $\Sigma_2^p$-hard to within an $n^{1-\epsilon}$ factor. However, when the input is a DNF formula, the Shortest Implicant problem cannot be $\Sigma_2^p$-complete unless $\Sigma_2^p = \mathrm{NP}^{\mathrm{NP}[\log^2 n]}$. Two-level (DNF) logic minimization is a central practical problem in logic synthesis and also one of the more natural problems in the polynomial hierarchy.
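To make the problem statement concrete: an implicant of f is a term (a set of literals) whose satisfaction forces f to 1, and the problem asks for the shortest one. The $\Sigma_2^p$-hardness above is precisely why, for general inputs, little better than the double enumeration in this brute-force sketch is expected:

```python
from itertools import combinations, product

# Brute-force shortest-implicant search. A term is a dict {var: bit};
# it is an implicant of f iff f evaluates to 1 on every completion.

def is_implicant(term, f, n):
    free = [v for v in range(n) if v not in term]
    for bits in product([0, 1], repeat=len(free)):
        x = dict(term)
        x.update(zip(free, bits))
        if not f(tuple(x[v] for v in range(n))):
            return False               # some completion falsifies f
    return True

def shortest_implicant(f, n):
    """Enumerate terms by size; return the first implicant found."""
    for size in range(n + 1):
        for vars_ in combinations(range(n), size):
            for bits in product([0, 1], repeat=size):
                term = dict(zip(vars_, bits))
                if is_implicant(term, f, n):
                    return term
    return None                        # f is identically 0

# f(a, b, c) = a OR (b AND c): the shortest implicant is the literal a.
f = lambda x: x[0] or (x[1] and x[2])
print(shortest_implicant(f, 3))
```

Each candidate term costs an exponential implicant check on top of the exponential enumeration, mirroring the two quantifier alternations behind the $\Sigma_2^p$ classification.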
Consensus Algorithms for the Generation of All Maximal Bicliques
, 2002
Abstract
Cited by 44 (5 self)
We describe a new algorithm for generating all maximal bicliques (i.e. complete bipartite, not necessarily induced subgraphs) of a graph. The algorithm is inspired by, and is quite similar to, the consensus method used in propositional logic. We show that some variants of the algorithm are totally polynomial, and even incrementally polynomial. The total complexity of the most efficient variant of the algorithms presented here is polynomial in the input size, and only linear in the output size. Computational experiments demonstrate its high efficiency on randomly generated graphs with up to 2,000 vertices and 20,000 edges.
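The objects being generated can be illustrated with a brute-force baseline for the bipartite case, which is not the consensus algorithm: each maximal biclique is obtained by closing a subset of one side against the common neighbourhood of the other. The consensus method's contribution is doing this with output-polynomial cost:

```python
from itertools import combinations

# Brute-force maximal-biclique enumeration in a bipartite graph.
# adj maps each left vertex to its set of right-side neighbours.

def maximal_bicliques(adj):
    left = list(adj)
    found = set()
    for r in range(1, len(left) + 1):
        for subset in combinations(left, r):
            common = set.intersection(*(adj[v] for v in subset))
            if not common:
                continue
            # Close the left side: every left vertex adjacent to all of
            # `common` belongs in the same maximal biclique.
            closed = frozenset(v for v in left if common <= adj[v])
            found.add((closed, frozenset(common)))
    return found

adj = {"a": {1, 2}, "b": {2, 3}, "c": {2}}
for L, R in sorted(maximal_bicliques(adj), key=lambda p: sorted(p[0])):
    print(sorted(L), sorted(R))
```

The exponential pass over subsets is exactly the cost the consensus approach avoids; its running time is polynomial in the input and linear in the number of bicliques produced.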
A Continuous Approach to Inductive Inference
 Mathematical Programming
, 1992
Abstract
Cited by 43 (2 self)
In this paper we describe an interior point mathematical programming approach to inductive inference. We list several versions of this problem and study in detail the formulation based on hidden Boolean logic. We consider the problem of identifying a hidden Boolean function $F : \{0,1\}^n \to \{0,1\}$ using outputs obtained by applying a limited number of random inputs to the hidden function. Given this input-output sample, we give a method to synthesize a Boolean function that describes the sample. We pose the Boolean Function Synthesis Problem as a particular type of Satisfiability Problem. The Satisfiability Problem is translated into an integer programming feasibility problem, that is solved with an interior point algorithm for integer programming. A similar integer programming implementation has been used in a previous study to solve randomly generated instances of the Satisfiability Problem. In this paper we introduce a new variant of this algorithm, where the Riemannian metric used...
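The Boolean Function Synthesis Problem itself, before any satisfiability or integer-programming encoding, can be sketched directly: find a small DNF that agrees with every input-output pair in the sample. Exhaustive search stands in for the paper's interior-point solver here; the problem posed is the same, the solution method is not:

```python
from itertools import combinations, product

# Synthesize a DNF (list of terms, each a dict {var: required_bit})
# consistent with an input-output sample, by exhaustive search over
# bounded-size DNFs. Bounds max_terms/max_lits are illustrative knobs.

def term_value(term, x):
    return all(x[v] == b for v, b in term.items())

def dnf_value(terms, x):
    return any(term_value(t, x) for t in terms)

def synthesize(samples, n, max_terms=2, max_lits=2):
    """Return a DNF agreeing with every (input, output) pair, or None."""
    pool = [dict(zip(vs, bits))
            for r in range(1, max_lits + 1)
            for vs in combinations(range(n), r)
            for bits in product([0, 1], repeat=r)]
    for k in range(1, max_terms + 1):
        for terms in combinations(pool, k):
            if all(dnf_value(terms, x) == y for x, y in samples):
                return list(terms)
    return None

# Hidden function f = x0 AND x1, sampled on all four inputs.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
print(synthesize(samples, n=2))
```

With only a partial sample many DNFs fit, which is why the paper treats synthesis as a feasibility problem: any point satisfying the constraints is an acceptable hypothesis.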