Results 1–10 of 74
Self-nonself discrimination in a computer
 In Proceedings of the 1994 IEEE Symposium on Research in Security and Privacy, Los Alamitos, CA. IEEE Computer
, 1994
Abstract

Cited by 383 (25 self)
SFI Working Papers contain accounts of scientific work of the author(s) and do not necessarily represent the views of the Santa Fe Institute. We accept papers intended for publication in peer-reviewed journals or proceedings volumes, but not papers that have already appeared in print. Except for papers by our external faculty, papers must be based on work done at SFI, inspired by an invited visit to or collaboration at SFI, or funded by an SFI grant. © NOTICE: This working paper is included by permission of the contributing author(s) as a means to ensure timely distribution of the scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the author(s). It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may be reposted only with the explicit permission of the copyright holder. www.santafe.edu
The Markov Chain Monte Carlo method: an approach to approximate counting and integration
, 1996
Abstract

Cited by 286 (12 self)
In the area of statistical physics, Monte Carlo algorithms based on Markov chain simulation have been in use for many years. The validity of these algorithms depends crucially on the rate of convergence to equilibrium of the Markov chain being simulated. Unfortunately, the classical theory of stochastic processes hardly touches on the sort of non-asymptotic analysis required in this application. As a consequence, it had previously not been possible to make useful, mathematically rigorous statements about the quality of the estimates obtained. Within the last ten years, analytical tools have been devised with the aim of correcting this deficiency. As well as permitting the analysis of Monte Carlo algorithms for classical problems in statistical physics, the introduction of these tools has spurred the development of new approximation algorithms for a wider class of problems in combinatorial enumeration and optimization. The “Markov chain Monte Carlo” method has been applied to a variety of such problems, and often provides the only known efficient (i.e., polynomial time) solution technique.
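The core mechanism the abstract describes can be illustrated with a minimal Metropolis sampler (an illustrative toy instance, not one of the paper's algorithms): a chain whose empirical state frequencies converge to a target distribution known only up to normalization.

```python
import random

def metropolis(weights, steps, seed=0):
    """Metropolis chain on states 0..n-1 whose stationary distribution is
    proportional to `weights`; returns the empirical state frequencies."""
    rng = random.Random(seed)
    n = len(weights)
    x = 0
    counts = [0] * n
    for _ in range(steps):
        y = rng.randrange(n)  # symmetric uniform proposal
        # accept with probability min(1, pi(y)/pi(x)); no normalization needed
        if rng.random() < min(1.0, weights[y] / weights[x]):
            x = y
        counts[x] += 1
    return [c / steps for c in counts]

freqs = metropolis([1, 2, 3, 4], 200_000)
# empirical frequencies approach 0.1, 0.2, 0.3, 0.4
```

How fast `freqs` approaches the target is exactly the convergence-rate question the survey is about.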
An Immunological Approach to Change Detection: Algorithms, Analysis and Implications
 IEEE Symposium on Security and Privacy
, 1996
Abstract

Cited by 147 (21 self)
We present new results on a distributable change-detection method inspired by the natural immune system. A weakness in the original algorithm was the exponential cost of generating detectors. Two detector-generating algorithms are introduced which run in linear time. The algorithms are analyzed, heuristics are given for setting parameters based on the analysis, and the presence of holes in detector space is examined. The analysis provides a basis for assessing the practicality of the algorithms in specific settings, and some of the implications are discussed.
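For context, the exponential-cost baseline the abstract refers to is the generate-and-test ("censoring") scheme: propose random detectors and discard any that match a self string. A minimal sketch with r-contiguous-bits matching and illustrative parameters (the paper's linear-time algorithms are more involved than this):

```python
import random

def r_contiguous_match(a, b, r):
    """True if equal-length strings a and b agree in at least r
    contiguous positions (the matching rule used in this literature)."""
    run = 0
    for x, y in zip(a, b):
        run = run + 1 if x == y else 0
        if run >= r:
            return True
    return False

def generate_detectors(self_set, length, r, n_detectors, seed=0):
    """Exhaustive generate-and-test: keep random candidates that match
    no self string. Cost grows exponentially as self coverage grows."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = ''.join(rng.choice('01') for _ in range(length))
        if not any(r_contiguous_match(cand, s, r) for s in self_set):
            detectors.append(cand)
    return detectors

self_set = {'0000', '0001', '0011'}
dets = generate_detectors(self_set, length=4, r=3, n_detectors=3)
```

A change is then reported whenever a monitored string matches any stored detector.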
The Well-Posed Problem
 Foundations of Physics
, 1973
Abstract

Cited by 37 (0 self)
distributions obtained from transformation groups, using as our main example the famous paradox of Bertrand. Bertrand's problem (Bertrand, 1889) was stated originally in terms of drawing a straight line "at random" intersecting a circle. It will be helpful to think of this in a more concrete way; presumably, we do no violence to the problem (i.e., it is still just as "random") if we suppose that we are tossing straws onto the circle, without specifying how they are tossed. We therefore formulate the problem as follows. A long straw is tossed at random onto a circle; given that it falls so that it intersects the circle, what is the probability that the chord thus defined is longer than a side of the inscribed equilateral triangle? Since Bertrand proposed it in 1889, this problem has been cited to generations of students to demonstrate that Laplace's "principle of indifference" contains logical inconsistencies. For, there appear to be many ways of defining "equally possible" ...
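The paradox is easy to reproduce numerically: different "random chord" models give different answers to the same question. A small Monte Carlo sketch comparing two of the standard models on a unit circle:

```python
import math
import random

def p_longer(trials, model, seed=0):
    """Monte Carlo estimate of P(chord > side of inscribed equilateral
    triangle) for a unit circle, under a given chord-sampling model."""
    rng = random.Random(seed)
    side = math.sqrt(3.0)  # side length of the inscribed triangle, r = 1
    hits = 0
    for _ in range(trials):
        if model == "endpoints":
            # chord through two uniform points on the circumference
            t = abs(rng.uniform(0, 2 * math.pi) - rng.uniform(0, 2 * math.pi))
            chord = 2.0 * math.sin(t / 2.0)
        else:
            # chord at a uniform distance d from the center
            d = rng.uniform(0.0, 1.0)
            chord = 2.0 * math.sqrt(1.0 - d * d)
        hits += chord > side
    return hits / trials

p_endpoints = p_longer(100_000, "endpoints")  # ≈ 1/3 under this model
p_radial = p_longer(100_000, "radial")        # ≈ 1/2 under this model
```

The two estimates disagree because "at random" was never pinned down; Jaynes's resolution selects the model left invariant under the relevant transformation group.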
Resolution lower bounds for perfect matching principles
 Journal of Computer and System Sciences
Abstract

Cited by 37 (5 self)
For an arbitrary hypergraph H, let PM(H) be the propositional formula asserting that H contains a perfect matching. We show that every resolution refutation of PM(H) must have size exp(Ω(δ(H) / (λ(H) r(H) (log n(H)) (r(H) + log n(H))))), where n(H) is the number of vertices, δ(H) is the minimal degree of a vertex, r(H) is the maximal size of an edge, and λ(H) is the maximal number of edges incident to two different vertices. For ordinary graphs G our general bound considerably simplifies to exp(Ω(δ(G) / (log n(G))²)) (implying an exp(Ω(δ(G)^(1/3))) lower bound that depends on the minimal degree only). As a direct corollary, every resolution proof of the functional (onto) version of the pigeonhole principle onto-FPHP_n^m must have size exp(Ω(n / (log m)²)) (which becomes exp(Ω(n^(1/3))) when the number of pigeons m is unbounded). This in turn immediately implies an exp(Ω(t/n³)) lower bound on the size of resolution proofs of the principle asserting that the circuit size of the Boolean function f_n in n variables is greater than t. In particular, Resolution does not possess efficient proofs of NP ⊄ P/poly. These results relativize, in a natural way, to a more general principle M(U|H) asserting that H contains a matching covering all vertices in U ⊆ V(H).
Immunotronics - Novel Finite-State-Machine Architectures With Built-In Self-Test Using Self–Nonself Differentiation
, 2002
Abstract

Cited by 18 (1 self)
A novel approach to hardware fault tolerance is demonstrated that takes inspiration from the human immune system as a method of fault detection. The human immune system is a remarkable system of interacting cells and organs that protects the body from invasion and maintains reliable operation even in the presence of invading bacteria or viruses. This paper seeks to address the field of electronic hardware fault tolerance from an immunological perspective with the aim of showing how novel methods based upon the operation of the immune system can both complement and create new approaches to the development of fault detection mechanisms for reliable hardware systems. In particular, it is shown that by use of partial matching, as prevalent in biological systems, high fault coverage can be achieved with the added advantage of reducing memory requirements. The development of a generic finite-state-machine immunization procedure is discussed that allows any system that can be represented in such a manner to be “immunized” against the occurrence of faulty operation. This is demonstrated by the creation of an immunized decade counter that can detect the presence of faults in real time.
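The self-nonself idea applied to the decade-counter example can be sketched as follows: the set of valid state transitions plays the role of "self", and any observed transition outside that set is flagged as a fault. This hypothetical monitor uses exact matching for clarity, whereas the paper advocates partial matching to reduce memory:

```python
# "Self" = the valid transitions of a decade counter (states 0..9, wrapping).
VALID = {(s, (s + 1) % 10) for s in range(10)}

def immune_monitor(observed):
    """Return every observed (state, next_state) transition that is not
    in the self set; a nonself transition indicates a fault."""
    return [t for t in observed if t not in VALID]

trace = [(0, 1), (1, 2), (2, 7), (7, 8)]  # (2, 7) is a faulty jump
faults = immune_monitor(trace)             # → [(2, 7)]
```

With partial matching, a smaller set of stored "tolerance conditions" would cover many invalid transitions at once, at the cost of some precision.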
The Theory of Fuzzy Sets: Beliefs and Realities
 International Journal of Energy, Information and Communications
, 2011
Abstract

Cited by 16 (2 self)
On two important counts, the Zadehian theory of fuzzy sets urgently needs to be restructured. First, it can be established that for a normal fuzzy number N = [α, β, γ] with membership function equal to Ψ1(x) if α ≤ x ≤ β, Ψ2(x) if β ≤ x ≤ γ, and 0 otherwise, Ψ1(x) is in fact the distribution function of a random variable defined in the interval [α, β], while Ψ2(x) is the complementary distribution function of another random variable defined in the interval [β, γ]. In other words, every normal law of fuzziness can be expressed in terms of two laws of randomness defined in the measure theoretic sense. This is how a normal fuzzy number should be constructed, and this is how partial presence of an element in a fuzzy set has to be defined. Hence the measure theoretic matters with reference to fuzziness have to be studied accordingly. Secondly, the field theoretic matters related to fuzzy sets are required to be revised all over again because in the current definition of the complement of a fuzzy set, fuzzy membership function and fuzzy membership value had been taken to be the same, which led to the conclusion that fuzzy sets do not follow the set theoretic axioms of exclusion and contradiction. For the complement of a normal fuzzy set, fuzzy membership function and fuzzy membership value are two different things, and the complement of a normal fuzzy set has to be defined accordingly. We shall further show how fuzzy randomness should be explained with reference to two laws of randomness defined for every fuzzy observation so as to make fuzzy statistical conclusions. Finally, we shall explain how randomness can be viewed as a special case of fuzziness defined in our perspective with reference to normal fuzzy numbers of the type [α, β, β]. Indeed every probability distribution function is a Dubois-Prade left reference function, and probability can be viewed in that way too.
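The structural claim about Ψ1 and Ψ2 can be checked on a concrete instance. For a triangular fuzzy number (an illustrative example, not taken from the paper), the rising branch Ψ1 is nondecreasing from 0 to 1 on [α, β], exactly as a distribution function must be, and the falling branch Ψ2 decreases from 1 to 0 on [β, γ]:

```python
def triangular_membership(x, alpha, beta, gamma):
    """Membership of the triangular fuzzy number [alpha, beta, gamma]:
    Psi1 rises linearly on [alpha, beta]; Psi2 falls on [beta, gamma]."""
    if alpha <= x <= beta:
        return (x - alpha) / (beta - alpha)  # Psi1: a CDF on [alpha, beta]
    if beta < x <= gamma:
        return (gamma - x) / (gamma - beta)  # Psi2: 1 - CDF on [beta, gamma]
    return 0.0

vals = [triangular_membership(x, 0, 1, 3) for x in (0, 0.5, 1, 2, 3)]
# → [0.0, 0.5, 1.0, 0.5, 0.0]
```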
Asymptotic Estimates of Elementary Probability Distributions
 Studies in Applied Mathematics
, 1996
Abstract

Cited by 16 (6 self)
Several new asymptotic estimates (with precise error bounds) are derived for Poisson and binomial distributions as the parameters tend to infinity. The analytic methods used are also applicable to other discrete distribution functions.
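To illustrate the flavor of such estimates (this is the textbook leading-order term, not necessarily one of the paper's results): the Poisson probability at the mode k = λ is approximately 1/√(2πλ), with relative error shrinking as λ grows.

```python
import math

def poisson_pmf(k, lam):
    """Exact Poisson probability P(X = k), computed in log space to
    avoid overflow in the factorial."""
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

lam = 100
exact = poisson_pmf(lam, lam)                 # P(X = lambda) at the mode
approx = 1.0 / math.sqrt(2 * math.pi * lam)   # leading asymptotic term
rel_err = abs(exact - approx) / exact         # small, and O(1/lambda)
```

Papers like this one supply precise bounds on `rel_err` rather than the heuristic agreement shown here.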
The Statistical Analysis of Spatially Clustered Genes under the Maximum Gap Criterion
 Journal of Computational Biology
, 2005
Abstract

Cited by 15 (7 self)
Statistical validation of gene clusters is imperative for many important applications in comparative genomics which depend on the identification of genomic regions that are historically and/or functionally related. We develop the first rigorous statistical treatment of max-gap clusters, a cluster definition frequently used in empirical studies. We present exact expressions for the probability of observing an individual cluster of a set of marked genes in one genome, as well as upper and lower bounds on the probability of observing a cluster of h homologs in a pairwise whole-genome comparison. We demonstrate the utility of our approach by applying it to a whole-genome comparison of E. coli and B. subtilis. Code for statistical tests is available at www.cs.cmu.edu/~durand/Lab/software.html.
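The max-gap criterion itself can be stated in a few lines of code (a sketch assuming positions are integer gene indices along one genome, with the gap between adjacent marked genes counted as the number of unmarked genes between them):

```python
def is_max_gap_cluster(positions, g):
    """True if the marked genes at the given (integer) positions form a
    max-gap cluster: every pair of adjacent marked genes is separated
    by at most g unmarked genes (gap = position difference - 1)."""
    p = sorted(positions)
    return all(b - a - 1 <= g for a, b in zip(p, p[1:]))

ok = is_max_gap_cluster([3, 5, 9], g=3)    # gaps of 1 and 3 → True
bad = is_max_gap_cluster([3, 5, 12], g=3)  # gap of 6 exceeds g → False
```

The paper's contribution is the probability that such a cluster arises by chance, which is what turns this definition into a statistical test.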