A Compiler for Deterministic, Decomposable Negation Normal Form
2002
Cited by 52 (11 self)
Abstract:
We present a compiler for converting CNF formulas into deterministic, decomposable negation normal form (d-DNNF). This is a ...
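Although the abstract is truncated here, the value of d-DNNF is well established: decomposability (AND-children share no variables) and determinism (OR-children are mutually exclusive) make queries such as model counting linear in the circuit size. The following sketch is illustrative only, not the paper's compiler; it additionally assumes smoothness (OR-children mention the same variables), and the tuple encoding is a hypothetical convenience.

```python
# Illustrative sketch (not the paper's compiler): model counting on a
# d-DNNF circuit. Decomposability makes counts multiply at AND nodes;
# determinism plus smoothness makes counts add at OR nodes.

def count_models(node):
    """node is ('lit', name) | ('and', children) | ('or', children)."""
    kind = node[0]
    if kind == 'lit':
        return 1                      # a literal has one model over its variable
    counts = [count_models(c) for c in node[1]]
    if kind == 'and':
        prod = 1
        for c in counts:
            prod *= c                 # disjoint variable sets: multiply
        return prod
    return sum(counts)                # deterministic, smooth OR: add

# (x AND y) OR (NOT x AND y): deterministic (x vs NOT x), decomposable, smooth
circuit = ('or', [
    ('and', [('lit', 'x'), ('lit', 'y')]),
    ('and', [('lit', '~x'), ('lit', 'y')]),
])
print(count_models(circuit))  # 2 models over {x, y}
```

The two models are (x=1, y=1) and (x=0, y=1); the same traversal on an arbitrary CNF would be unsound, which is why compilation into d-DNNF is the expensive step.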
Formalized mathematics
Turku Centre for Computer Science, 1996
Cited by 23 (0 self)
Abstract:
It is generally accepted that in principle it’s possible to formalize completely almost all of present-day mathematics. The practicability of actually doing so is widely doubted, as is the value of the result. But in the computer age we believe that such formalization is possible and desirable. In contrast to the QED Manifesto, however, we do not offer polemics in support of such a project. We merely try to place the formalization of mathematics in its historical perspective, as well as looking at existing praxis and identifying what we regard as the most interesting issues, theoretical and practical.
Lattice duality: The origin of probability and entropy
In press: Neurocomputing, 2005
Cited by 17 (6 self)
Abstract:
Bayesian probability theory is an inference calculus, which originates from a generalization of inclusion on the Boolean lattice of logical assertions to a degree of inclusion represented by a real number. Dual to this lattice is the distributive lattice of questions constructed from the ordered set of downsets of assertions, which forms the foundation of the calculus of inquiry—a generalization of information theory. In this paper we introduce this novel perspective on these spaces in which machine learning is performed and discuss the relationship between these results and several proposed generalizations of information theory in the literature.
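The generalization of inclusion that the abstract describes can be made concrete in one line. The following is a sketch consistent with the derivation the paper builds on, not text quoted from it: requiring the degree of inclusion to be consistent with the lattice join and meet forces an inclusion-exclusion (sum) rule on the valuation.

```latex
% Sketch: the bi-valuation p(x \mid t) generalizing inclusion on the
% Boolean lattice of assertions must respect join and meet, yielding
p(x \vee y \mid t) \;=\; p(x \mid t) + p(y \mid t) - p(x \wedge y \mid t),
% which is the familiar sum rule of probability, here derived from
% lattice structure rather than postulated.
```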
Stochastic Switching Circuit Synthesis
Cited by 8 (6 self)
Abstract:
In his 1938 Master’s Thesis, Shannon demonstrated that any Boolean function can be realized by a switching relay circuit, leading to the development of deterministic digital logic. Here, we replace each classical switch with a probabilistic switch (pswitch). We present algorithms for synthesizing circuits closed with a desired probability, including an algorithm that generates optimal-size circuits for any binary fraction. We also introduce a new duality property for series-parallel stochastic switching circuits. Finally, we construct a universal probability generator which maps deterministic inputs to arbitrary probabilistic outputs. Potential applications exist in the analysis and design of stochastic networks in biology and engineering.

I. INTRODUCTION. Claude Shannon, in his 1938 Master’s Thesis, discovered a systematic synthesis procedure to realize a given Boolean function using deterministic switches [Sha38]. This classical contribution led to the development of modern digital logic design and is at the foundation of our ability to design and manufacture digital circuits with millions of transistors.
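The flavor of the binary-fraction construction the abstract mentions can be sketched as follows. This is an illustration under stated assumptions, not the paper's algorithm verbatim: each pswitch closes with probability 1/2, series composition multiplies closure probabilities, and parallel composition gives p + q - pq; reading the binary expansion bit by bit then realizes any n-bit binary fraction with n pswitches.

```python
from fractions import Fraction

def closure_probability(x: Fraction) -> Fraction:
    """Sketch: closure probability of a series-parallel circuit of
    probability-1/2 pswitches realizing a binary fraction 0 <= x < 1
    (one pswitch per bit of x; x must have a power-of-two denominator)."""
    half = Fraction(1, 2)
    if x == 0:
        return Fraction(0)
    if x >= half:
        # leading bit 1: one pswitch in parallel with a circuit for 2x - 1
        rest = closure_probability(2 * x - 1)
        return half + rest - half * rest
    # leading bit 0: one pswitch in series with a circuit for 2x
    return half * closure_probability(2 * x)

print(closure_probability(Fraction(5, 8)))  # 5/8, using three pswitches
```

For x = 5/8 = 0.101 in binary, the recursion uses exactly three pswitches: parallel, then series, then a single switch, and the exact arithmetic of `Fraction` confirms the target probability is hit with no rounding error.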
Multi-User File System Search
2007
Cited by 4 (1 self)
Abstract:
Information retrieval research usually deals with globally visible, static document collections. Practical applications, in contrast, like file system search and enterprise search, have to cope with highly dynamic text collections and have to take into account user-specific access permissions when generating the results to a search query. The goal of this thesis is to close the gap between information retrieval research and the requirements exacted by these real-life applications. The algorithms and data structures presented in this thesis can be used to implement a file system search engine that is able to react to changes in the file system by updating its index data in real time. File changes (insertions, deletions, or modifications) are reflected by the search results within a few seconds ...
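The two requirements the abstract names, real-time index maintenance and per-user permission filtering, can be sketched in miniature. This toy is not the thesis's data structures; the class and method names are hypothetical, and a real engine would use on-disk postings, tokenization, and ranking.

```python
# Illustrative toy only: an in-memory inverted index that supports
# immediate updates on file change and filters results at query time
# by per-user read permissions (the security restriction the abstract
# describes).
from collections import defaultdict

class SearchIndex:
    def __init__(self):
        self.postings = defaultdict(set)   # term -> set of file paths
        self.terms_of = {}                 # path -> terms indexed for it
        self.readers = {}                  # path -> users allowed to read it

    def upsert(self, path, text, readers):
        self.delete(path)                  # a modification = delete + insert
        terms = set(text.lower().split())
        for t in terms:
            self.postings[t].add(path)
        self.terms_of[path] = terms
        self.readers[path] = set(readers)

    def delete(self, path):
        for t in self.terms_of.pop(path, ()):
            self.postings[t].discard(path)
        self.readers.pop(path, None)

    def query(self, user, term):
        # only report files this user may read
        return sorted(p for p in self.postings[term.lower()]
                      if user in self.readers[p])

idx = SearchIndex()
idx.upsert("/home/a/notes.txt", "search engine notes", readers={"alice"})
idx.upsert("/home/b/todo.txt", "fix search bug", readers={"bob"})
print(idx.query("alice", "search"))  # ['/home/a/notes.txt']
```

Because `upsert` and `delete` touch only the changed file's postings, a file change is visible to queries immediately, which is the property the thesis provides at scale.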
A short survey of automated reasoning
Cited by 2 (0 self)
Abstract:
This paper surveys the field of automated reasoning, giving some historical background and outlining a few of the main current research themes. We particularly emphasize the points of contact and the contrasts with computer algebra. We finish with a discussion of the main applications so far.

1 Historical introduction. The idea of reducing reasoning to mechanical calculation is an old dream [75]. Hobbes [55] made explicit the analogy in the slogan ‘Reason [...] is nothing but Reckoning’. This parallel was developed by Leibniz, who envisaged a ‘characteristica universalis’ (universal language) and a ‘calculus ratiocinator’ (calculus of reasoning). His idea was that disputes of all kinds, not merely mathematical ones, could be settled if the parties translated their dispute into the characteristica and then simply calculated. Leibniz even made some steps towards realizing this lofty goal, but his work was largely forgotten.

The characteristica universalis. The dream of a truly universal language in Leibniz’s sense remains unrealized and probably unrealizable. But over the last few centuries a language that is at least adequate for ...
Toward a framework for generic uncertainty management
IFSA-EUSFLAT, 2009
Cited by 1 (1 self)
Abstract:
The need for an automatic inference process able to deal with information coming from unreliable sources is becoming a relevant issue both on corporate networks and on the open Web. Mathematical theories to reason with uncertain information have been successfully applied in several situations, but each one of these models is tailored to deal with a specific semantics of uncertainty. In this paper, we put forward the idea of using explicit representations of the different types of uncertainty for partitioning the inference process into parts. By coordinating multiple independent reasoning processes, we are sometimes able to apply a specific model to each type of uncertain information, and recombine the final results via a suitable reconciliation process. We validated our approach by applying it to the classic schema matching problem, using the Ontology Alignment Evaluation Initiative (OAEI) tests to assess the results.
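The partition-then-reconcile pipeline the abstract describes can be sketched generically. Everything below is a hypothetical placeholder, not the paper's method: the two uncertainty models (a noisy-OR probabilistic combiner and a max-based fuzzy combiner) and the weighted-average reconciliation rule are merely one possible instantiation.

```python
# Hypothetical sketch: evidence is tagged with its uncertainty semantics,
# routed to the matching model, and per-model verdicts are reconciled.

def probabilistic_model(values):
    # placeholder: treat values as independent match probabilities (noisy-OR)
    p = 1.0
    for v in values:
        p *= 1.0 - v
    return 1.0 - p

def fuzzy_model(values):
    # placeholder: fuzzy memberships combined with max (a t-conorm)
    return max(values, default=0.0)

MODELS = {"probabilistic": probabilistic_model, "fuzzy": fuzzy_model}

def reconcile(evidence, weights):
    """evidence: list of (semantics, value); weights: semantics -> weight.
    Route each item to its model, then weight-average the verdicts
    (one of many possible reconciliation rules)."""
    by_type = {}
    for kind, v in evidence:
        by_type.setdefault(kind, []).append(v)
    score = sum(weights[k] * MODELS[k](vals) for k, vals in by_type.items())
    return score / sum(weights[k] for k in by_type)

evidence = [("probabilistic", 0.8), ("probabilistic", 0.5), ("fuzzy", 0.6)]
print(round(reconcile(evidence, {"probabilistic": 0.7, "fuzzy": 0.3}), 3))
```

The key point mirrored from the abstract is that the probabilistic and fuzzy items never pass through each other's combination rule; only the final per-model verdicts meet, in the reconciliation step.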
THE EMERGENCE OF EVOLUTIONARY PSYCHOLOGY: WHAT IS AT STAKE?
Abstract:
THE THEORY OF evolution by natural selection has revolutionary implications for understanding the design of the human mind and brain, as Darwin himself was the first to recognize (Darwin, 1859). Indeed, a principled understanding of the network of causation that built the functional architecture of the human species offers the possibility of transforming the study of humanity into a natural science capable of precision and rapid progress. Yet, nearly a century and a half after The Origin of Species was published, the psychological, social, and behavioral sciences remain largely untouched by these implications, and many of these disciplines continue to be founded on assumptions evolutionarily informed researchers know to be false (Pinker, 2002; Tooby & Cosmides, 1992). Evolutionary psychology is the long-forestalled scientific attempt to assemble out of the disjointed, fragmentary, and mutually contradictory human disciplines a single, logically integrated research framework for the psychological, social, and behavioral sciences, a framework that not only incorporates the evolutionary sciences on a full and equal basis, but that systematically works out all of the revisions in existing belief and research practice that such a synthesis requires (Tooby & Cosmides, 1992). The long-term scientific goal toward which evolutionary psychologists are working is the mapping of our universal human nature. By this, we mean the construction of a set of empirically validated, high-resolution models of the evolved mechanisms that collectively constitute universal human nature. Because the evolved function of a psychological mechanism is computational, to regulate behavior and the body adaptively in response to informational inputs, such a model consists of a description of the functional circuit logic or information ...
Boolean Frame is Adequate for Treatment of Gradation or Fuzziness Equally as for Two-Valued or Classical Case
SISY 2006, 4th Serbian-Hungarian Joint Symposium on Intelligent Systems, 2006
Abstract:
The Boolean frame comprises all axioms and theorems of Boolean algebra. Everything that satisfies all Boolean axioms and theorems is inside the Boolean frame and/or is an element of the corresponding Boolean algebra. Common to all known approaches to treating gradation is the fact that they are either not complete (from the logical point of view) or not in the Boolean frame. Here, Interpolative Boolean algebra (IBA) is given as a consistent MV realization of finite (atomic) Boolean algebra. Since the axioms and laws of Boolean algebra are actually a matter of the value-independent structure of IBA elements, all axioms and all laws of Boolean algebra are preserved in any type of value realization (two-valued, three-valued, …, [0, 1]). To every element of IBA there corresponds a generalized Boolean polynomial able to process all values of primary variables from the real unit interval [0, 1].
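The generalized Boolean polynomials mentioned in the abstract can be illustrated as follows. This is a reconstruction of the usual IBA transform, not text from the paper, under the structural convention that the generalized product is idempotent on a single variable, x ⊗ x = x:

```latex
% Sketch of generalized Boolean polynomials (GBPs):
\neg x \mapsto 1 - x, \qquad
x \wedge y \mapsto x \otimes y, \qquad
x \vee y \mapsto x + y - x \otimes y.
% Boolean laws then hold for every value in [0,1]; e.g. excluded middle:
x \vee \neg x \;\mapsto\; x + (1 - x) - x \otimes (1 - x)
                 \;=\; 1 - (x - x \otimes x) \;=\; 1,
% since x \otimes x = x at the structural level, whereas min/product
% fuzzy connectives lose this law for intermediate values.
```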