Results 1-10 of 15
Explaining Type Inference
Science of Computer Programming, 1995
Cited by 53 (0 self)
Type inference is the compile-time process of reconstructing missing type information in a program based on the usage of its variables. ML and Haskell are two languages where this aspect of compilation has enjoyed some popularity, allowing type information to be omitted while static type checking is still performed. Type inference may be expected to have some application in the prototyping and scripting languages which are becoming increasingly popular. A difficulty with type inference is the confusing and sometimes counterintuitive diagnostics produced by the type checker as a result of type errors. A modification of the Hindley-Milner type inference algorithm is presented, which allows the specific reasoning which led to a program variable having a particular type to be recorded for type explanation. This approach is close to the intuitive process used in practice for debugging type errors. 1 Introduction Type inference refers to the compile-time process of reconstructing missing t...
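The idea of recording the reasoning behind each type binding can be sketched with a toy unifier that keeps an explanation trail. This is a minimal illustration in the spirit of the abstract, not the paper's actual algorithm; the term representation (tuples for type constructors, `'a`-style strings for type variables) and the example constraints are this sketch's own, and the occurs check is omitted for brevity.

```python
# Toy unification over type terms that records WHY each type variable was
# bound, mimicking the "type explanation" idea described above.
# Type variables are strings starting with "'"; constructed types are tuples.

def is_var(t):
    return isinstance(t, str) and t.startswith("'")

def resolve(t, subst):
    """Follow variable bindings until a non-bound term is reached."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(t1, t2, subst, trail, reason):
    """Unify t1 with t2, appending to `trail` the reason for each binding.
    NOTE: no occurs check -- this is a sketch, not a full inference engine."""
    t1, t2 = resolve(t1, subst), resolve(t2, subst)
    if t1 == t2:
        return
    if is_var(t1):
        subst[t1] = t2
        trail.append(f"{t1} := {t2}  because {reason}")
    elif is_var(t2):
        unify(t2, t1, subst, trail, reason)
    elif (isinstance(t1, tuple) and isinstance(t2, tuple)
          and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            unify(a, b, subst, trail, reason)
    else:
        raise TypeError(f"cannot unify {t1} with {t2} ({reason})")

# Hypothetical constraints from `twice f x = f (f x)`:
subst, trail = {}, []
# the application (f x) forces  'a = 'b -> 'c
unify("'a", ("->", "'b", "'c"), subst, trail, "f is applied to x")
# the application f (f x) forces the argument and result types to agree
unify("'b", "'c", subst, trail, "f is applied to (f x)")
print(trail)
```

The trail plays the role of the recorded reasoning: when a type error surfaces later, each binding can be traced back to the program usage that caused it.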
Decidable higher-order unification problems
Automated Deduction (CADE-12), Springer LNAI 814, 1994
Cited by 17 (4 self)
Second-order unification is undecidable in general. Miller showed that unification of so-called higher-order patterns is decidable and unitary. We show that the unification of a linear higher-order pattern s with an arbitrary second-order term that shares no variables with s is decidable and finitary. A few extensions of this unification problem are still decidable: unifying two second-order terms, where one term is linear, is undecidable if the terms contain bound variables but decidable if they don't.
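Miller's pattern restriction mentioned above is simple to state operationally: every free variable must occur applied only to distinct bound variables. A small recognizer makes the condition concrete; the term encoding (tuples for abstractions and applications, uppercase strings for free variables) is an assumption of this sketch, not notation from the paper.

```python
# Check whether a lambda term is a higher-order pattern in Miller's sense:
# every occurrence of a free variable F has the shape  F x1 ... xn  where
# the xi are DISTINCT BOUND variables.
# Terms: ("lam", var, body), ("app", head, [args]), or a variable name.
# Convention for this sketch only: free variables are uppercase strings.

def is_pattern(term, bound=frozenset()):
    if isinstance(term, str):
        return True
    tag = term[0]
    if tag == "lam":
        return is_pattern(term[2], bound | {term[1]})
    if tag == "app":
        head, args = term[1], term[2]
        if isinstance(head, str) and head.isupper():   # free-variable head
            return (all(isinstance(a, str) and a in bound for a in args)
                    and len(set(args)) == len(args))   # distinct bound vars
        return is_pattern(head, bound) and all(is_pattern(a, bound) for a in args)
    return False

p1 = ("lam", "x", ("lam", "y", ("app", "F", ["x", "y"])))   # λx.λy. F x y
p2 = ("lam", "x", ("app", "F", ["x", "x"]))                 # λx. F x x
p3 = ("lam", "x", ("app", "F", [("app", "g", ["x"])]))      # λx. F (g x)
print(is_pattern(p1), is_pattern(p2), is_pattern(p3))       # True False False
```

`p2` fails the distinctness condition and `p3` applies `F` to a non-variable argument; both lie outside the decidable-and-unitary fragment.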
Tractable and Intractable Second-Order Matching Problems
In Proc. 5th Ann. Int. Computing and Combinatorics Conference (COCOON'99), LNCS 1627, 1999
Cited by 10 (2 self)
The second-order matching problem is the problem of determining, for a finite set {⟨t_i, s_i⟩ | i ∈ I} of pairs of a second-order term t_i and a first-order closed term s_i, called a matching expression, whether or not there exists a substitution θ such that t_iθ = s_i for each i ∈ I. It is well-known that the second-order matching problem is NP-complete. In this paper, we introduce the following restrictions of a matching expression: k-ary, k-fv, predicate, ground, and function-free. Then, we show that the second-order matching problem is NP-complete for unary predicate, unary ground, ternary function-free predicate, binary function-free ground, and 1-fv predicate matching expressions, while it is solvable in polynomial time for binary function-free predicate, unary function-free, k-fv function-free (k ≥ 0), and ground predicate matching expressions. 1 Introduction The unification problem is the problem of determining whether or not any two ter...
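The combinatorial source of the NP-completeness can be seen already in the single equation F(t) = s with t and s ground: every solution binds F to λx.C, where C is obtained from s by replacing some subset of the occurrences of t with x, so the number of candidates can grow exponentially in the number of occurrences. A brute-force sketch (term representation and function names are this sketch's own):

```python
# Brute-force second-order matching for one equation F(t) = s with t, s
# ground first-order terms (nested tuples; atoms are strings).
# A solution binds F to λx.C with C[x := t] = s; candidates C arise by
# replacing any subset of occurrences of t inside s with the hole "x".

from itertools import product

def contexts(s, t):
    """Yield every term obtained from s by replacing occurrences of t by 'x'."""
    if s == t:
        yield "x"                        # replace this occurrence of t
    if isinstance(s, tuple):
        pools = [list(contexts(a, t)) for a in s[1:]]
        for choice in product(*pools):
            yield (s[0],) + choice       # keep the head, transform arguments
    else:
        yield s                          # keep the atom unchanged

def second_order_matches(t, s):
    return sorted(set(contexts(s, t)), key=str)

# F(a) = g(a, a): four bodies for λx. ... -- g(a,a), g(a,x), g(x,a), g(x,x)
sols = second_order_matches("a", ("g", "a", "a"))
print(sols)
```

With k occurrences of t in s there are up to 2^k candidate contexts, which is exactly the kind of blow-up the paper's restrictions (k-ary, function-free, ground, ...) are designed to tame.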
Equality and Abductive Residua for Horn Clauses
Theoretical Computer Science, 1992
Cited by 3 (2 self)
One method of proving theorems in Horn clause theories is surface deduction (also known as the modification method). Surface deduction yields interpretations of unification failures in terms of residual hypotheses needed for unification to succeed. This suggests that it can be used for abductive reasoning with equality. In surface deduction the input clauses are first transformed to a flat form (involving no nested terms) and symmetrized (if necessary). They are then manipulated by binary resolution, a restricted version of factoring and compression. In this paper we partially characterize the deductive strength of surface deduction and show how it depends on the type of flattening used. This is used to show that some forms of surface deduction will yield all hypotheses preferred by parsimony when used as an abductive inference engine. The characterization of deductive strength suggests a new equational preference principle according to which honest explanations are preferred. In h...
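The flattening transformation mentioned above can be illustrated mechanically: each nested function term f(t, ...) inside an atom is pulled out into an auxiliary surface literal f_p(t, ..., V) with a fresh variable V, so the clause ends up with no nested terms. A small sketch (the tuple representation and the `_p` naming are assumptions of this sketch):

```python
# Flattening an atom in the style used by surface deduction: nested terms
# are replaced by fresh variables, each documented by a surface literal.
# Terms are tuples ("f", arg, ...); variables and constants are strings.

counter = 0
def fresh():
    global counter
    counter += 1
    return f"V{counter}"

def flatten_term(t, extra):
    """Return a flat replacement for t, appending surface literals to extra."""
    if not isinstance(t, tuple):
        return t                                  # already flat
    args = [flatten_term(a, extra) for a in t[1:]]
    v = fresh()
    extra.append((t[0] + "_p", *args, v))         # f_p(args..., V): V = f(args...)
    return v

def flatten_atom(atom):
    extra = []
    args = [flatten_term(a, extra) for a in atom[1:]]
    return (atom[0], *args), extra

# p(f(g(a)))  becomes  p(V2)  with surface literals  g_p(a, V1), f_p(V1, V2)
head, body = flatten_atom(("p", ("f", ("g", "a"))))
print(head, body)
```

After flattening, a failed unification shows up as a missing surface literal, which is precisely the "residual hypothesis" reading that makes the method usable for abduction.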
Analogical projection in pattern perception
Journal of Experimental and Theoretical Artificial Intelligence, 2003
Cited by 3 (1 self)
Abstract. This paper proposes a perceptually motivated method for solving proportional analogy problems involving sequential patterns. Our method is based on an algebraic model of pattern perception that determines the gestalt structure of sequential patterns. The gestalts of sequential patterns are represented as algebraic terms in the model. An analogical relation between sequential patterns appearing in proportional analogy is then formalized as a mapping between algebras that generate terms representing the gestalts of these patterns. Based on this formalism, an algorithm is proposed that solves proportional analogy problems: given three sequential patterns, the algorithm computes a fourth pattern such that the resulting two pairs of patterns are perceived as having an identical analogical relation.
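The shape of the proportional analogy problem A : B :: C : ? can be shown with a deliberately naive sketch: infer a per-position transformation from A to B and project it onto C. This is far simpler than the algebraic gestalt model in the paper; it only illustrates the input/output form of the task.

```python
# Toy proportional analogy on letter sequences: read off per-letter alphabet
# offsets from A to B, then apply the same offsets to C. Assumes (for this
# sketch only) that all three sequences have equal length.

def solve_analogy(a, b, c):
    assert len(a) == len(b) == len(c), "sketch assumes equal lengths"
    offsets = [ord(y) - ord(x) for x, y in zip(a, b)]
    return "".join(chr(ord(z) + d) for z, d in zip(c, offsets))

print(solve_analogy("abc", "abd", "ijk"))   # -> "ijl"
```

A perceptually grounded solver like the one described above differs precisely in how it chooses the mapping: it works over gestalt structure rather than raw positions, so it can handle patterns of different lengths and groupings.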
Complexity of the Higher Order Matching
Automated Deduction, Volume 1632 of LNCS, 2000
Cited by 2 (0 self)
We use the standard encoding of Boolean values in simply typed lambda calculus to develop a method of translating SAT problems for various logics into higher-order matching. In this way we obtain the already known NP-hardness bounds for orders two and three, and the new result that fourth-order matching is NEXPTIME-hard. 1 Introduction Consider two normalized simply typed lambda terms M and N, where N is closed (does not contain free variables). The higher-order matching problem M =? N (also known as pattern matching, range problem or instantiation problem) is to decide whether there exists a substitution θ for the free variables in M such that Mθ is βη-reducible to N. Matching is a special case of unification, where the restriction that N is closed is removed (and a solution of M =? N is a substitution θ such that Mθ and Nθ are equal modulo βη-conversion). The order of a problem M =? N is the highest functionality order of the free variables occurring in M. At the time of writin...
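The "standard encoding of Boolean values" referred to above is the Church encoding true = λx.λy.x, false = λx.λy.y, under which connectives become lambda terms and a SAT instance becomes a matching problem. Written as Python closures (a direct transcription, with hypothetical helper names):

```python
# Church booleans from simply typed lambda calculus, as Python closures:
#   true  = λx.λy.x      false = λx.λy.y
# Connectives are then ordinary applications of these terms.

true  = lambda x: lambda y: x
false = lambda x: lambda y: y

def AND(p, q):      # λp.λq. p q false
    return p(q)(false)

def OR(p, q):       # λp.λq. p true q
    return p(true)(q)

def to_bool(b):     # decode by applying the encoded value to markers
    return b(True)(False)

print(to_bool(AND(true, false)), to_bool(OR(true, false)))  # False True
```

In the reduction sketched by the abstract, a free variable of the matching problem stands for a truth assignment, and solving the matching equation amounts to finding Church-boolean values that satisfy the encoded formula.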
On condensation of a clause
In Horvath T. and Yamamoto A., Eds., Proceedings of the 15th Int. Conf. on Inductive Logic Programming, LNAI, Springer-Verlag, 2003
Cited by 1 (0 self)
Abstract. In this paper, we investigate condensation of a clause. First, we extend the substitution graph introduced by Scheffer et al. (1996) to a total matcher graph. Then, we give a correct proof of the relationship between subsumption and the existence of cliques in a total matcher graph. Next, we introduce the concept of the width of a clique in a total matcher graph. As a corollary of the above relationship, we show that the minimum condensation of a clause corresponds to the clique with the minimum width in a total matcher graph. Finally, we design a greedy algorithm for finding a condensation of a clause, as an algorithm that finds cliques with as small a width as possible in the total matcher graph of a clause.
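Condensation itself is easy to state: a clause C condenses to a proper subset D when some substitution θ maps every literal of C into D. A brute-force version for function-free clauses makes the definition executable; this is not the clique-based greedy algorithm of the paper, just an illustration of the underlying notion (representation is this sketch's own).

```python
# Brute-force condensation of a function-free clause: repeatedly look for a
# substitution over the clause's variables whose image is a PROPER subset
# of the clause. Literals are tuples; variables are uppercase strings.

from itertools import product

def vars_of(clause):
    return sorted({a for lit in clause for a in lit[1:] if a.isupper()})

def apply(lit, theta):
    return (lit[0],) + tuple(theta.get(a, a) for a in lit[1:])

def condense(clause):
    clause = list(clause)
    terms = sorted({a for lit in clause for a in lit[1:]})
    vs = vars_of(clause)
    changed = True
    while changed:
        changed = False
        for binding in product(terms, repeat=len(vs)):
            theta = dict(zip(vs, binding))
            image = {apply(l, theta) for l in clause}
            if image < set(clause):          # proper subset found: condense
                clause, vs = sorted(image), vars_of(image)
                changed = True
                break
    return clause

# p(X,Y), p(Y,X), p(X,X) condenses to p(X,X) via θ = {X→X, Y→X}
c = [("p", "X", "Y"), ("p", "Y", "X"), ("p", "X", "X")]
print(condense(c))
```

The search over all substitutions is exponential in the number of variables, which is why the paper reduces the problem to finding small-width cliques instead.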
On the hardness of learning acyclic conjunctive queries
Proc. 11th Internat. Conf. on Algorithmic Learning Theory, LNAI 1968, Springer, 2000
Cited by 1 (1 self)
A conjunctive query problem in relational database theory is the problem of determining whether or not a tuple belongs to the answer of a conjunctive query over a database. Here, a tuple and a conjunctive query are regarded as a ground atom and a non-recursive function-free definite clause, respectively. While the conjunctive query problem is NP-complete in general, it becomes efficiently solvable if the conjunctive query is acyclic. Concerned with this problem, we investigate the learnability of acyclic conjunctive queries from an instance with a j-database, which is a finite set of ground unit clauses containing at most j-ary predicate symbols. We deal with two kinds of instances: a simple instance as a set of ground atoms, and an extended instance as a set of pairs of a ground atom and a description. Then, we show that, for each j ≥ 3, there exists a j-database such that acyclic conjunctive queries are not polynomially predictable from an extended instance under cryptographic assumptions. Also we show that, for each n > 0 and a polynomial p, there exists a p(n)-database of size O(2^p(n)) such that predicting Boolean formulae of size p(n) over n variables reduces to predicting acyclic conjunctive queries from a simple instance. This result implies that, if we can ignore the size of the database, then acyclic conjunctive queries are not polynomially predictable from a simple instance under cryptographic assumptions. Finally, we show that, if either j = 1, or j = 2 and the number of elements of the database is at most l (l ≥ 0), then acyclic conjunctive queries are PAC-learnable from a simple instance with j-databases.
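The acyclicity that makes the query problem tractable is usually tested with GYO ear removal: view each atom's variable set as a hyperedge, then repeatedly delete vertices that occur in only one edge and edges contained in another edge; the query is acyclic exactly when everything disappears. A small sketch (the hypergraph encoding and function name are this sketch's assumptions):

```python
# GYO ear-removal test for acyclicity of a conjunctive query, represented
# as a hypergraph: one hyperedge (set of variables) per atom.

def is_acyclic(edges):
    edges = [set(e) for e in edges]
    changed = True
    while changed and edges:
        changed = False
        # drop vertices that appear in exactly one edge ("ear" vertices)
        for e in edges:
            lone = {v for v in e if sum(v in f for f in edges) == 1}
            if lone:
                e -= lone
                changed = True
        # drop an edge contained in another edge (or an emptied edge)
        for i, e in enumerate(edges):
            if not e or any(i != j and e <= f for j, f in enumerate(edges)):
                del edges[i]
                changed = True
                break
    return not edges   # acyclic iff GYO reduces the hypergraph to nothing

# the path query r(X,Y), s(Y,Z), t(Z,W) is acyclic;
# the triangle  r(X,Y), s(Y,Z), t(Z,X) is not.
print(is_acyclic([{"X", "Y"}, {"Y", "Z"}, {"Z", "W"}]))   # True
print(is_acyclic([{"X", "Y"}, {"Y", "Z"}, {"Z", "X"}]))   # False
```

On acyclic queries, evaluation can then proceed bottom-up along the join tree that the reduction implicitly constructs, which is the source of the polynomial-time bound cited in the abstract.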
Principal Typing-Schemes in a Polyadic π-Calculus
In 4th International Conference on Concurrency Theory, volume 715 of LNCS, 1992
Cited by 1 (0 self)
The present report introduces a typing system for a version of Milner's polyadic π-calculus, and a typing inference algorithm. The central concept underlying the typing system is the notion of type assignment, where each free name in a term is assigned a type, the term itself being given multiple name-type pairs. This observation leads to a clean typing system for Milner's sorting, and induces an efficient algorithm to infer the typing of a term. The typing system enjoys a subject-reduction property and possesses a notion of principal typing-scheme. Furthermore, the algorithm to reconstruct the principal typing-scheme of a process, or to detect its inexistence, is proved correct with respect to the typing system. 1. Introduction Type discipline helps programmers not only in writing type-correct programs but especially in writing them in a principled and clear way. Concurrent programming, however, has long lacked such discipline, in contrast to functional and algebraic frameworks i...
Design and implementation of deterministic higher-order patterns, 2005. Draft. Available at http://www.ipl.t.u-tokyo.ac.jp/yicho
Cited by 1 (1 self)
Abstract. We introduce a class of deterministic higher-order patterns to Template Haskell for supporting declarative transformational programming with more elegant binding of pattern variables. Higher-order patterns are capable of checking and binding subtrees far from the root, which is useful for program manipulation. However, there are three major problems. First, it is difficult to explain why a particular desired matching result cannot be obtained, because of the complicated higher-order matching algorithm. Second, the general higher-order matching algorithm is costly, taking exponential time in the worst case. Third, the (possibly infinite) nondeterministic solutions of higher-order matching prevent it from being used in a functional setting. To resolve these problems, we impose reasonable restrictions on higher-order patterns to gain predictability, efficiency and determinism. We show that our deterministic higher-order patterns are powerful enough to support concise specification and efficient implementation of various kinds of program transformations for optimization.