Results 1 - 10 of 46,956

Information Theory and Statistics

by S. Kullback, 1968
"... Entropy and relative entropy are proposed as features extracted from symbol sequences. Firstly, a proper Iterated Function System is driven by the sequence, producing a fractaMike representation (CSR) with a low computational cost. Then, two entropic measures are applied to the CSR histogram of th ..."
Abstract - Cited by 1805 (2 self) - Add to MetaCart
Entropy and relative entropy are proposed as features extracted from symbol sequences. First, a suitable Iterated Function System is driven by the sequence, producing a fractal-like representation (CSR) at low computational cost. Then, two entropic measures are applied to the histogram of the CSR and theoretically justified. Examples are included.
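As a rough, hypothetical illustration of the two entropic measures applied to a symbol histogram (this is not the paper's CSR construction; the input sequence and the uniform reference distribution are assumptions made for the example):

    import math
    from collections import Counter

    def entropy(seq):
        """Shannon entropy (bits/symbol) of the empirical symbol histogram."""
        counts = Counter(seq)
        n = len(seq)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def relative_entropy(seq, ref):
        """Relative entropy D(p || q) of the empirical distribution p of seq
        against a reference distribution q (dict: symbol -> probability).
        Assumes every observed symbol has ref probability > 0."""
        counts = Counter(seq)
        n = len(seq)
        return sum((c / n) * math.log2((c / n) / ref[s]) for s, c in counts.items())

    seq = "ACGTACGGAA"                    # toy symbol sequence
    uniform = {s: 0.25 for s in "ACGT"}   # hypothetical reference distribution
    print(entropy(seq), relative_entropy(seq, uniform))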

Information Theory, Inference, and Learning Algorithms

by David J. C. MacKay, 2003
"... ..."
Abstract - Cited by 1936 (13 self) - Add to MetaCart
Abstract not found

Algorithmic information theory

by G. J. Chaitin - IBM Journal of Research and Development, 1977
"... This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that ..."
Abstract - Cited by 385 (18 self) - Add to MetaCart
This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability
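As a hedged, practical illustration of "the number of bits of information required to specify" an object (an upper bound only, and not Chaitin's construction), a general-purpose compressor gives a computable bound on description length:

    import random
    import zlib

    def description_bits_upper_bound(data: bytes) -> int:
        """Crude upper bound on the bits needed to specify `data`:
        the bit length of its zlib-compressed form (ignoring the
        O(1) cost of the decompressor itself)."""
        return 8 * len(zlib.compress(data, 9))

    random.seed(0)
    regular = b"ab" * 1000                                     # highly regular string
    noisy = bytes(random.randrange(256) for _ in range(2000))  # incompressible noise
    print(description_bits_upper_bound(regular))  # far fewer than 8 * 2000 bits
    print(description_bits_upper_bound(noisy))    # close to 8 * 2000 bits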

A Theory of Program Size Formally Identical to Information Theory

by Gregory J. Chaitin, 1975
"... A new definition of program-size complexity is made. H(A;B=C;D) is defined to be the size in bits of the shortest self-delimiting program for calculating strings A and B if one is given a minimal-size selfdelimiting program for calculating strings C and D. This differs from previous definitions: (1) ..."
Abstract - Cited by 380 (15 self)
concept of information theory. For example, H(A,B) = H(A) + H(B/A) + O(1). Also, if a program of length k is assigned measure 2^{-k}, then H(A) = -log_2(the probability that the standard universal computer will calculate A) + O(1). Key Words and Phrases: computational complexity, entropy
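In clean notation, the two identities quoted above read as follows (a LaTeX rendering; the summation form is the usual reading of "the probability that the standard universal computer U will calculate A", with |p| the bit length of program p):

    \[
      H(A,B) = H(A) + H(B/A) + O(1), \qquad
      H(A) = -\log_2 \Bigl( \sum_{U(p)=A} 2^{-|p|} \Bigr) + O(1).
    \]

The first identity is the program-size analogue of the Shannon chain rule H(X,Y) = H(X) + H(Y|X), which holds exactly; for self-delimiting programs it holds only up to an additive constant, and the conditional term H(B/A) is conditioned on a minimal-size program for A, as the abstract states.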

information theory

by El Amin Aoulad, Mohamed El Merouani, Abdellatif Medouri
"... information theory ..."
Abstract - Add to MetaCart
information theory

Information Theory

by unknown authors
"... Information Theory 1 ..."
Abstract - Add to MetaCart
Information Theory 1

Some informational aspects of visual perception

by Fred Attneave - Psychol. Rev., 1954
"... The ideas of information theory are at present stimulating many different areas of psychological inquiry. In providing techniques for quantifying situations which have hitherto been difficult or impossible to quantify, they suggest new and more precise ways of conceptualizing these situations (see M ..."
Abstract - Cited by 643 (2 self) - Add to MetaCart
The ideas of information theory are at present stimulating many different areas of psychological inquiry. In providing techniques for quantifying situations which have hitherto been difficult or impossible to quantify, they suggest new and more precise ways of conceptualizing these situations (see

A Network Information Theory for Wireless Communication: Scaling Laws and Optimal Operation

by Liang-liang Xie, P. R. Kumar - IEEE Transactions on Information Theory, 2002
"... How much information can be carried over a wireless network with a multiplicity of nodes? What are the optimal strategies for information transmission and cooperation among the nodes? We obtain sharp information theoretic scaling laws under some conditions. ..."
Abstract - Cited by 362 (19 self) - Add to MetaCart
How much information can be carried over a wireless network with a multiplicity of nodes? What are the optimal strategies for information transmission and cooperation among the nodes? We obtain sharp information theoretic scaling laws under some conditions.

Graph Theory

by Reinhard Diestel, Alexander Schrijver, Paul D. Seymour - Mathematisches Forschungsinstitut Oberwolfach Report No. 16/2007, 2007
"... This week broadly targeted both finite and infinite graph theory, as well as matroids, including their interaction with other areas of pure mathematics. The talks were complemented by informal workshops focussing on specific problems or particularly active areas. ..."
Abstract - Cited by 1200 (5 self) - Add to MetaCart
This week broadly targeted both finite and infinite graph theory, as well as matroids, including their interaction with other areas of pure mathematics. The talks were complemented by informal workshops focussing on specific problems or particularly active areas.

Theory of Elasticity

by S. P. Timoshenko, J. N. Goodier, 1951
"... ABSTRACT The knowledge of the in-situ stress field in rock masses is in general of crucial importance in various areas of geo-engineering, such as mining or civil underground excavations, hydrocarbon extraction, CO2 storage, hydraulic fracture operations, etc. In the context of the Finite Element ..."
Abstract - Cited by 710 (1 self)
to incorporate all the information available, aside of course from the most basic method consisting of using vertical stresses due to gravity and horizontal stresses due to K₀. In this paper, the various options available are discussed and compared, and one new alternative procedure is developed based on Airy stress
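As a minimal sketch of the "most basic method" the abstract mentions (vertical stress from gravity and horizontal stress from a K₀ ratio; the rock density, K₀ value, and depth below are illustrative assumptions, not values from the paper):

    G = 9.81  # gravitational acceleration, m/s^2

    def insitu_stress(depth_m, rho=2600.0, k0=0.5):
        """Basic gravitational in-situ stress estimate (Pa):
        sigma_v = rho * g * z, sigma_h = K0 * sigma_v.
        rho (kg/m^3) and K0 are illustrative assumptions."""
        sigma_v = rho * G * depth_m
        sigma_h = k0 * sigma_v
        return sigma_v, sigma_h

    sv, sh = insitu_stress(500.0)  # example: 500 m depth
    print(f"sigma_v = {sv/1e6:.1f} MPa, sigma_h = {sh/1e6:.1f} MPa")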