Results 1–10 of 99
A General Framework for Adaptive Processing of Data Structures
 IEEE TRANSACTIONS ON NEURAL NETWORKS
, 1998
Abstract

Cited by 118 (47 self)
A structured organization of information is typically required by symbolic processing. On the other hand, most connectionist models assume that data are organized according to relatively poor structures, like arrays or sequences. The framework described in this paper is an attempt to unify adaptive models like artificial neural nets and belief nets for the problem of processing structured information. In particular, relations between data variables are expressed by directed acyclic graphs, where both numerical and categorical values coexist. The general framework proposed in this paper can be regarded as an extension of both recurrent neural networks and hidden Markov models to the case of acyclic graphs. In particular we study the supervised learning problem as the problem of learning transductions from an input structured space to an output structured space, where transductions are assumed to admit a recursive hidden state-space representation. We introduce a graphical formalism for r...
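The recursive state-space idea in this abstract can be sketched concretely (a hypothetical illustration, not code from the paper): each node of a directed acyclic graph gets a hidden state computed from its own label and the states of its children, visiting nodes in reverse topological order.

```python
# Hypothetical sketch of recursive processing over a DAG:
# each node's hidden state is a function of its own label and
# the states of its children (cf. recurrent nets unfolded over graphs).

def hidden_states(order, children, label, f):
    """order: nodes in reverse topological order (children before parents).
    children: dict node -> list of child nodes.
    label: dict node -> label value.
    f: combines a node's label with its children's states into a state."""
    state = {}
    for n in order:
        state[n] = f(label[n], [state[c] for c in children[n]])
    return state

# Toy transduction: a node's state is its label plus the sum of child states.
children = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
label = {"a": 1, "b": 2, "c": 3, "d": 4}
order = ["d", "b", "c", "a"]  # children first
states = hidden_states(order, children, label,
                       lambda l, cs: l + sum(cs))
print(states["a"])  # 1 + (2+4) + (3+4) = 14
```

In the paper's setting, `f` would be a learned function (e.g. a neural network) rather than a fixed sum, but the traversal order and the state recursion are the same.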
The TPTP Problem Library
, 1999
Abstract

Cited by 100 (6 self)
This report provides a detailed description of the TPTP Problem Library for automated theorem proving systems. The library is available via the Internet, and forms a common basis for development of and experimentation with automated theorem provers. This report provides: the motivations for building the library; a discussion of the inadequacies of previous problem collections, and how these have been resolved in the TPTP; a description of the library structure, including overview information; descriptions of supplementary utility programs; and guidelines for obtaining and using the library. Contents: 1 Introduction; 1.1 Previous Problem Collections; 1.2 What is Required?; 2 Inside the TPTP; 2.1 The TPTP Domain Structure ...
leanTAP: Lean Tableau-based Deduction
 Journal of Automated Reasoning
, 1995
Abstract

Cited by 78 (11 self)
"prove((E,F),A,B,C,D) :- !, prove(E,[F|A],B,C,D). prove((E;F),A,B,C,D) :- !, prove(E,A,B,C,D), prove(F,A,B,C,D). prove(all(H,I),A,B,C,D) :- !, \+ length(C,D), copy_term((H,I,C),(G,F,C)), append(A,[all(H,I)],E), prove(F,E,B,[G|C],D). prove(A,_,[C|D],_,_) :- ((A = -(B)) ; (-(A) = B)) -> (unify(B,C) ; prove(A,[],D,_,_)). prove(A,[E|F],B,C,D) :- prove(E,F,[A|B],C,D)." implements a first-order theorem prover based on free-variable semantic tableaux. It is complete, sound, and efficient. 1 Introduction The Prolog program listed in the abstract implements a complete and sound theorem prover for first-order logic; it is based on free-variable semantic tableaux (Fitting, 1990). We call this lean deduction: the idea is to achieve maximal efficiency from minimal means. We will see that the above program is indeed very efficient, not although but because it is extremely short and compact. Our approach surely does not lead to a deduction system which is superior to highly sophisticated systems li...
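For readers without a Prolog system at hand, the propositional core of the tableau method can be sketched in Python (a hypothetical illustration; the quoted Prolog program is the actual prover and additionally handles first-order quantifiers via free variables):

```python
# Hypothetical propositional tableau sketch: a formula is valid iff the
# tableau for its negation closes on every branch. Formulas are tuples
# ('and', A, B), ('or', A, B), ('not', A), or atom strings.

def closed(branch, todo):
    """True iff every branch of the tableau for todo (given branch) closes."""
    if not todo:
        return False  # open branch: no complementary pair found
    f, rest = todo[0], todo[1:]
    if isinstance(f, str) or (f[0] == 'not' and isinstance(f[1], str)):
        neg = ('not', f) if isinstance(f, str) else f[1]
        if neg in branch:
            return True                       # close on a complementary pair
        return closed(branch + [f], rest)
    if f[0] == 'and':                         # alpha rule: extend the branch
        return closed(branch, [f[1], f[2]] + rest)
    if f[0] == 'or':                          # beta rule: both branches must close
        return closed(branch, [f[1]] + rest) and closed(branch, [f[2]] + rest)
    g = f[1]                                  # f == ('not', g), g compound
    if g[0] == 'not':
        return closed(branch, [g[1]] + rest)  # double negation
    if g[0] == 'and':                         # push negation inward (de Morgan)
        return closed(branch, [('or', ('not', g[1]), ('not', g[2]))] + rest)
    return closed(branch, [('and', ('not', g[1]), ('not', g[2]))] + rest)

def valid(f):
    return closed([], [('not', f)])

print(valid(('or', 'p', ('not', 'p'))))  # True: excluded middle
print(valid('p'))                        # False: an atom alone is not valid
```

The "lean" spirit carries over: the whole decision procedure is a single recursive function with one clause per connective, mirroring the one-Prolog-clause-per-rule structure of leanTAP.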
Controlled Integrations of the Cut Rule into Connection Tableau Calculi
Abstract

Cited by 61 (3 self)
In this paper techniques are developed and compared which increase the inferential power of tableau systems for classical first-order logic. The mechanisms are formulated in the framework of connection tableaux, which is an amalgamation of the connection method and the tableau calculus, and a generalization of model elimination. Since connection tableau calculi are among the weakest proof systems with respect to proof compactness, and the (backward) cut rule is not suitable for the first-order case, we study alternative methods for shortening proofs. The techniques we investigate are the folding up and the folding down operation. Folding up represents an efficient way of supporting the basic calculus, which is top-down oriented, with lemmata derived in a bottom-up manner. It is shown that both techniques can also be viewed as controlled integrations of the cut rule. In order to remedy the additional redundancy imported into tableau proof procedures by the new inference rules, we develop and apply an extension of the regularity condition on tableaux and the mechanism of antilemmata which realizes a subsumption concept on tableaux. Using the framework of the theorem prover SETHEO, we have implemented three new proof procedures which overcome the deductive weakness of cut-free tableau systems. Experimental results demonstrate the superiority of the systems with folding up over the cut-free variant and the one with folding down.
Caching and Lemmaizing in Model Elimination Theorem Provers
, 1992
Abstract

Cited by 49 (2 self)
Theorem provers based on model elimination have exhibited extremely high inference rates but have lacked a redundancy control mechanism such as subsumption. In this paper we report on work done to modify a model elimination theorem prover using two techniques, caching and lemmaizing, that have reduced by more than an order of magnitude the time required to find proofs of several problems and that have enabled the prover to prove theorems previously unobtainable by top-down model elimination theorem provers.
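The caching idea can be sketched as memoization in a toy backward-chaining prover (hypothetical code; the paper's caching operates on model-elimination goals with substitutions, and caching both successes and failures under a depth bound is a simplification here):

```python
# Hypothetical sketch of caching in a backward-chaining search: remember
# which subgoals succeeded or failed so the same goal is never re-derived
# (or re-failed) on another branch of the search.

def make_prover(rules):
    """rules: dict goal -> list of alternative bodies (lists of subgoals)."""
    cache = {}  # goal -> True (proved lemma) or False (recorded failure)

    def prove(goal, depth=10):
        if goal in cache:
            return cache[goal]        # cache hit: reuse earlier result
        if depth == 0:
            return False              # depth bound exceeded: do not cache
        result = any(all(prove(g, depth - 1) for g in body)
                     for body in rules.get(goal, []))
        cache[goal] = result          # store as lemma / failure record
        return result

    return prove

# "d" is needed by both "b" and "c" but is only derived once.
rules = {"a": [["b", "c"]], "b": [["d"]], "c": [["d"]], "d": [[]]}
prove = make_prover(rules)
print(prove("a"))  # True
```

Lemmaizing in the paper additionally makes a proved subgoal available as a new unit clause for the rest of the search; the cache dictionary above plays the analogous role for this ground toy setting.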
Free Variable Tableaux for Propositional Modal Logics
TABLEAUX'97, LNCS 1227
, 1997
Abstract

Cited by 42 (5 self)
We present a sound, complete, modular and lean labelled tableau calculus for many propositional modal logics where the labels contain "free" and "universal" variables. Our "lean" Prolog implementation is not only surprisingly short, but compares favourably with other considerably more complex implementations for modal deduction.
NORA/HAMMR: Making Deduction-Based Software Component Retrieval Practical
, 1997
Abstract

Cited by 39 (4 self)
Deduction-based software component retrieval uses pre- and postconditions as indexes and search keys and an automated theorem prover (ATP) to check whether a component matches. This idea is very simple but the vast number of arising proof tasks makes a practical implementation very hard. We thus pass the components through a chain of filters of increasing deductive power. In this chain, rejection filters based on signature matching and model checking techniques are used to rule out non-matches as early as possible and to prevent the subsequent ATP from "drowning." Hence, intermediate results of reasonable precision are available at (almost) any time of the retrieval process. The final ATP step then works as a confirmation filter to lift the precision of the answer set. We implemented a chain which runs fully automatically and uses MACE for model checking and the automated prover SETHEO as confirmation filter. We evaluated the system over a medium-sized collection of components. The resul...
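The filter-chain idea can be sketched as a pipeline of increasingly expensive predicates (hypothetical code; the component representation and the sample filters are illustrative stand-ins for signature matching, model checking, and the ATP step):

```python
# Hypothetical sketch of deduction-based retrieval as a filter chain:
# cheap rejection filters run first and discard definite non-matches,
# the expensive confirmation filter runs last, only on the survivors.

def retrieve(components, filters):
    """filters: list of (name, predicate) ordered by increasing cost.
    Rejection filters may keep false positives but must never drop a
    true match; the final filter confirms the survivors."""
    candidates = list(components)
    for name, keep in filters:
        candidates = [c for c in candidates if keep(c)]
    return candidates

# Toy library: components are (name, signature, behaviour).
library = [("min2", ("int", "int"), min),
           ("max2", ("int", "int"), max),
           ("neg",  ("int",),       lambda x: -x)]

# Query: a binary int function returning the smaller argument.
filters = [
    ("signature", lambda c: c[1] == ("int", "int")),          # cheapest
    ("model-check", lambda c: c[2](1, 2) == 1),               # one sample
    ("confirm", lambda c: all(c[2](a, b) == min(a, b)         # exhaustive
                              for a in range(-3, 4)
                              for b in range(-3, 4))),
]
print([c[0] for c in retrieve(library, filters)])  # ['min2']
```

The ordering is the whole point: each stage shrinks the candidate set before the next, more expensive, stage runs, so intermediate results are available after any prefix of the chain.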
An Overview Of Strategies For Neurosymbolic Integration
, 1995
Abstract

Cited by 33 (1 self)
This paper will give an overview of the various approaches to neurosymbolic integration. Roughly, these can be divided into two strategies: unified strategies aim at attaining neural and symbolic capabilities using neural networks alone, while hybrid strategies combine neural networks with symbolic models such as expert systems, case-based reasoning systems, and decision trees. These two approaches form the main subtrees of the classification hierarchy depicted in Figure 1. [Figure 1: Classification of integrated neurosymbolic systems.]
A connection based proof method for intuitionistic logic
Workshop on Theorem Proving with Analytic Tableaux and Related Methods, LNAI 918
, 1995
Abstract

Cited by 29 (19 self)
We present a proof method for intuitionistic logic based on Wallen’s matrix characterization. Our approach combines the connection calculus and the sequent calculus. The search technique is based on notions of paths and connections and thus avoids redundancies in the search space. During the proof search the computed first-order and intuitionistic substitutions are used to simultaneously construct a sequent proof which is more human-oriented than the matrix proof. This allows our method to be used within interactive proof environments. Furthermore we can consider local substitutions instead of global ones and treat substitutions occurring in different branches of the sequent proof independently. This reduces the number of extra copies of formulae to be considered.
Synthesizing certified code
 Proc. Intl. Symp. Formal Methods Europe 2002: Formal Methods—Getting IT Right, LNCS 2391
, 2002
Abstract

Cited by 29 (15 self)
Code certification is a lightweight approach for formally demonstrating software quality. Its basic idea is to require code producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates that can be checked independently. Since code certification uses the same underlying technology as program verification, it requires detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. Given a high-level specification, our approach simultaneously generates code and all annotations required to certify the generated code. We describe a certification extension of AutoBayes, a synthesis tool for automatically generating data analysis programs. Based on built-in domain knowledge, proof annotations are added and used to generate proof obligations that are discharged by the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator and memory safety on a data-classification program. For this program, our approach was faster and more precise than PolySpace, a commercial static analysis tool.