Results 1 – 7 of 7
Bernoulli’s Principle of Insufficient Reason and Conservation of Information in Computer Search
"... Abstract—Conservation of information (COI) popularized by the no free lunch theorem is a great leveler of search algorithms, showing that on average no search outperforms any other. Yet in practice some searches appear to outperform others. In consequence, some have questioned the significance of CO ..."
Abstract

Cited by 3 (3 self)
Abstract—Conservation of information (COI), popularized by the no free lunch theorem, is a great leveler of search algorithms, showing that on average no search outperforms any other. Yet in practice some searches appear to outperform others. In consequence, some have questioned the significance of COI to the performance of search algorithms. An underlying foundation of COI is Bernoulli's Principle of Insufficient Reason (PrOIR), which imposes a uniform distribution on a search space in the absence of all prior knowledge about the search target or the search space structure. The assumption is conserved under mapping. If the probability of finding a target in a search space is p, then the probability of finding the target in any subset of the search space is p. More generally, all some-to-many mappings of a uniform search space result in a new search space where the chance of doing better than p is 50-50. Consequently, the chance of doing worse is 50-50. This result can be viewed as a confirming property of COI. To properly assess the significance of COI for search, one must completely identify the precise sources of information that affect search performance. This discussion leads to a resolution of the seeming conflict between COI and the observation that some search algorithms perform well on a large class of problems. Index Terms—active information, Bernoulli's principle of insufficient reason, Bernoulli's principle of non-sufficient reason, Bertrand's paradox, conservation of generalization performance, conservation of information, endogenous information, equal distribution of ignorance, evolutionary search, no free lunch theorem, principle of indifference, uniform distribution
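The abstract's premise can be illustrated with a quick Monte Carlo sketch (not the paper's own code; the space and target sizes below are hypothetical): under PrOIR a blind search draws uniformly from the space, so a single query succeeds with probability p = |T|/|Ω|, independent of how the target set is laid out.

```python
import random

# Hypothetical uniform search space Omega and target set T with
# p = |T| / |Omega| = 50 / 1000 = 0.05 under PrOIR.
OMEGA = range(1000)
TARGET = set(range(50))

rng = random.Random(0)
trials = 200_000
# Each blind query is a uniform draw from Omega.
hits = sum(rng.choice(OMEGA) in TARGET for _ in range(trials))
p_hat = hits / trials
print(round(p_hat, 3))  # close to 0.05
```

The same estimate is obtained if the target elements are scattered arbitrarily through the space, which is the sense in which the uniform assumption carries no information about target structure.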
A Simple Discrete System with Chaotic Behavior*
"... Abstract — We discuss the behavior of a particular discrete system, viz. Post’s system of tag with alphabet {0,1}, deletion number d = 3, and rules: 0 → 00, 1 → 1101. As initial string we consider all strings of length less than or equal to 15 as well as all ‘‘worst case’ ’ inputs of the form (100) ..."
Abstract
 Add to MetaCart
Abstract — We discuss the behavior of a particular discrete system, viz. Post's system of tag with alphabet {0,1}, deletion number d = 3, and rules: 0 → 00, 1 → 1101. As initial strings we consider all strings of length less than or equal to 15, as well as all "worst case" inputs of the form (100)^m with 1 ≤ m ≤ 128.
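The dynamics described above are simple to simulate. A minimal sketch follows (a step cap is added for safety, since whether such inputs ever halt is precisely what the paper investigates):

```python
def tag_system(word, d=3, rules={"0": "00", "1": "1101"}, max_steps=10_000):
    """Run Post's tag system: repeatedly read the first symbol, delete
    the first d symbols, and append that symbol's production to the end;
    halt when fewer than d symbols remain (or the step cap is hit)."""
    steps = 0
    while len(word) >= d and steps < max_steps:
        word = word[d:] + rules[word[0]]
        steps += 1
    return word, steps

# A short deterministic run: "000" reduces to "00" in one step, then halts.
print(tag_system("000"))  # ('00', 1)

# One of the paper's "worst case" inputs, (100)^m with m = 2.
final, steps = tag_system("100" * 2)
```

Because 0 shortens the word by one while 1 lengthens it by one, the word length drifts unpredictably, which is the source of the chaotic behavior the paper studies.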
A Public Key Cryptoscheme Using the Bit-pair Method
"... Abstract: The authors give the definition of a bitpair shadow, and design the three algorithms of a public key cryptoscheme called JUNA which regards a bitpair as an operation unit, and is based on the multivariate permutation problem (MPP) and the anomalous subset product problem (ASPP). Then, de ..."
Abstract
 Add to MetaCart
(Show Context)
Abstract: The authors give the definition of a bit-pair shadow and design the three algorithms of a public key cryptoscheme called JUNA, which regards a bit-pair as an operation unit and is based on the multivariate permutation problem (MPP) and the anomalous subset product problem (ASPP). They then demonstrate the correctness of the decryption algorithm, deduce that the probability of a plaintext solution being non-unique is nearly zero, and analyze the security of the cryptoscheme against extracting a private key from a public key and recovering a plaintext from a ciphertext, on the assumption that IFP, DLP, and SSP can be solved efficiently. In addition, they give the conversion from the ASPP to the anomalous subset sum problem (ASSP) through a discrete logarithm. The facts show that the bit-pair method increases the density of a related ASSP knapsack with D > 1, and decreases the length of the modulus of the cryptoscheme with ⌈lg M⌉ = …
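For context on the density claim, a standard definition (not taken from this paper's text): the density of a subset-sum (knapsack) instance with weights a_1, …, a_n is commonly measured as n / log2(max a_i), and low-density instances are susceptible to lattice-reduction attacks, which is why pushing the density above 1 matters. The weights below are toy values.

```python
import math

def knapsack_density(weights):
    # Standard (Lagarias-Odlyzko style) density: n / log2(max weight).
    return len(weights) / math.log2(max(weights))

# Toy instance: 8 weights bounded by 1000 give density about 0.80 (low).
w = [513, 800, 1000, 700, 601, 999, 750, 512]
print(round(knapsack_density(w), 2))  # 0.8
```

Raising n relative to the bit length of the largest weight raises the density, narrowing the gap that lattice attacks exploit.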
The REESSE1+ Public Key Cryptosystem v 2.21 *
, 2012
"... Abstract: In this paper, the authors give the definitions of a coprime sequence and a lever function, and describe the five algorithms and six characteristics of a prototypal public key cryptosystem which is used for encryption and signature, and based on three new problems and one existent problem: ..."
Abstract
 Add to MetaCart
(Show Context)
Abstract: In this paper, the authors give the definitions of a coprime sequence and a lever function, and describe the five algorithms and six characteristics of a prototypal public key cryptosystem which is used for encryption and signature, and is based on three new problems and one existent problem: the multivariate permutation problem (MPP), the anomalous subset product problem (ASPP), the transcendental logarithm problem (TLP), and the polynomial root finding problem (PRFP). They prove by reduction that MPP, ASPP, and TLP are computationally at least equivalent to the discrete logarithm problem (DLP) in the same prime field, and meanwhile find some evidence which inclines people to believe that each of the new problems is harder than DLP, namely unsolvable in DLP subexponential time. They demonstrate the correctness of the decryption and the verification, deduce that the probability of a plaintext solution being non-unique is nearly zero, and analyze the exact security of the cryptosystem against recovering a plaintext from a ciphertext, extracting a private key from a public key or a signature, and forging a signature through known signatures, public keys, and messages, on the assumption that IFP, DLP, and LSSP can be solved. Studies show that the running times of effectual attack tasks are greater than or equal to O(2^n) so far when n = 80, 96, 112, or 128 with lg M ≈ 696, 864, 1030, or 1216. From the viewpoint of utility, further research is needed on how to decrease the length of the modulus and increase the speed of decryption.
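The DLP benchmark that the abstract's reductions refer to can be made concrete with a toy example (a generic sketch, not part of REESSE1+; the modulus and exponent below are illustrative): baby-step giant-step recovers x with g^x ≡ y (mod p) in O(√p) group operations, the generic baseline against which subexponential DLP algorithms are measured.

```python
import math

def bsgs(g, y, p):
    """Baby-step giant-step: find x with g^x = y (mod p), or None."""
    m = math.isqrt(p - 1) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps g^j
    factor = pow(g, -m, p)                       # g^(-m) mod p (Python 3.8+)
    gamma = y
    for i in range(m):                           # giant steps y * g^(-im)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = gamma * factor % p
    return None

# Toy prime field: recover the exponent 345 from g^345 mod p.
p, g = 1019, 2
x = bsgs(g, pow(g, 345, p), p)
print(x)  # → 345
```

For an n-bit modulus this generic method costs about 2^(n/2) steps; "unsolvable in DLP subexponential time" asserts that no index-calculus-style shortcut applies to the new problems.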
A Lightweight Hash Function Resisting Birthday Attack and Meet-in-the-middle Attack
"... Abstract: In this paper, to match a lightweight digital signing scheme of which the length of modulus is between 80 and 160 bits, a lightweight hash function called JUNA is proposed. It is based on the intractabilities MPP and ASPP, and regards a short message or a message digest as an input which i ..."
Abstract
 Add to MetaCart
(Show Context)
Abstract: In this paper, to match a lightweight digital signing scheme whose modulus length is between 80 and 160 bits, a lightweight hash function called JUNA is proposed. It is based on the intractabilities MPP and ASPP, and regards a short message or a message digest as an input which is treated as only one block. The JUNA hash contains two algorithms: an initialization algorithm and a compression algorithm, and converts a string of n bits into another of m bits, where 80 ≤ m ≤ n ≤ 4096. The two algorithms are described, and their securities are analyzed from several aspects. The analysis shows that the JUNA hash is one-way, weakly collision-free, and strongly collision-free along with a proof, is especially resistant to birthday attack and meet-in-the-middle attack, and attains a security of O(2^m) arithmetic steps at present, while the time complexity of its compression algorithm is O(n) arithmetic steps. Moreover, the JUNA hash, with its short input and small computation, may be used to reform a classical hash with output of n bits and security of O(2^(n/2)) into a compact hash with output of n/2 bits and equivalent security. Thus, it opens a door to convenience for the utilization of lightweight digital signing schemes.
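The O(2^(n/2)) figure for a classical hash comes from the birthday bound, which the abstract contrasts with JUNA's claimed O(2^m). A small Monte Carlo sketch (toy parameters, not JUNA itself) shows collisions among random m-bit outputs appearing after roughly 2^(m/2) queries:

```python
import random

def queries_until_collision(m_bits, rng):
    """Query a stand-in random m-bit hash until two outputs collide;
    return the number of queries made."""
    seen = set()
    while True:
        h = rng.getrandbits(m_bits)  # models an ideal m-bit hash output
        if h in seen:
            return len(seen) + 1
        seen.add(h)

rng = random.Random(1)
m = 20  # toy output size; 2^(m/2) = 1024
avg = sum(queries_until_collision(m, rng) for _ in range(200)) / 200
print(round(avg))  # same order of magnitude as 2^(m/2) = 1024
```

This is why halving the output length of an ideal hash halves its collision-security exponent, and why a hash whose security scales as O(2^m) in the output length m would permit the shorter digests the abstract describes.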