Results 1 - 10 of 3,074,981
The Nature of Statistical Learning Theory, 1999
Abstract: "... Statistical learning theory was introduced in the late 1960’s. Until the 1990’s it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990’s new types of learning algorithms (called support vector machines) based on the deve ..."
Cited by 12976 (32 self)
Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society, Series B, 1977
Abstract: "... A broadly applicable algorithm for computing maximum likelihood estimates from incomplete data is presented at various levels of generality. Theory showing the monotone behaviour of the likelihood and convergence of the algorithm is derived. Many examples are sketched, including missing value situat ..."
Cited by 11807 (17 self)
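The EM recipe the abstract above alludes to can be sketched concretely. Below is a minimal, illustrative EM fit for a two-component 1-D Gaussian mixture — an assumed toy model, not the paper's general formulation; the function name `em_gmm_1d` and the initialisation are choices made here. The alternating E-step/M-step structure, and the non-decreasing data likelihood across iterations, are what the paper proves in general.

```python
import math
import random

def em_gmm_1d(xs, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.

    E-step: compute each component's posterior responsibility for
    each point. M-step: re-estimate weights, means, and variances
    by responsibility-weighted maximum likelihood. The data
    log-likelihood is non-decreasing across iterations."""
    # crude initialisation from the data range
    mu = [min(xs), max(xs)]
    var = [1.0, 1.0]
    w = [0.5, 0.5]

    def pdf(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: responsibilities r[i][k] = P(component k | x_i)
        r = []
        for x in xs:
            p = [w[k] * pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            r.append([pk / s for pk in p])
        # M-step: weighted maximum-likelihood parameter updates
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            w[k] = nk / len(xs)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, xs)) / nk
            var[k] = sum(ri[k] * (x - mu[k]) ** 2 for ri, x in zip(r, xs)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return w, mu, var

# usage: two well-separated clusters
random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(200)] \
   + [random.gauss(10.0, 1.0) for _ in range(200)]
weights, means, variances = em_gmm_1d(xs)
```

Here the "incomplete data" are the unobserved component labels, which the E-step fills in softly.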
Algorithmic information theory, IBM Journal of Research and Development, 1977
Abstract: "... This paper reviews algorithmic information theory, which is an attempt to apply information-theoretic and probabilistic ideas to recursive function theory. Typical concerns in this approach are, for example, the number of bits of information required to specify an algorithm, or the probability that ..."
Cited by 399 (20 self)
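The "number of bits required to specify" an object — its Kolmogorov complexity — is uncomputable, but any real compressor gives a computable upper bound. A small illustrative sketch (the function name and the zlib choice are assumptions made here, not from the paper):

```python
import os
import zlib

def description_length_bound(data: bytes) -> int:
    """Computable upper bound on the (uncomputable) Kolmogorov
    complexity of `data`: the length of its zlib encoding, up to
    the fixed additive cost of the decompression program."""
    return len(zlib.compress(data, 9))

# A highly regular string admits a far shorter description than
# it occupies; random bytes are essentially incompressible.
regular = b"ab" * 500     # 1000 bytes, but a tiny description
noise = os.urandom(1000)  # 1000 bytes, no short description (w.h.p.)
```

The gap between the two bounds is what makes "regular" and "random" precise in this theory.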
Multisource Algorithmic Information Theory, 2006
Abstract: "... Multisource information theory in Shannon setting is well known. In this article we try to develop its algorithmic information theory counterpart and use it as the general framework for many interesting questions about Kolmogorov complexity. ..."
Algorithmic information theory, in Handbook on the Philosophy of Information
Abstract: "... We introduce algorithmic information theory, also known as the theory of Kolmogorov complexity. We explain the main concepts of this quantitative approach to defining ‘information’. We discuss the extent to which Kolmogorov’s and Shannon’s information theory have a common purpose, and where they are ..."
Cited by 8 (1 self)
An application of algorithmic information theory, 2007
Abstract: "... Slides available at home.gwu.edu/∼jchubb. Outline: Introduction; Initial segments of scattered linear orderings; Ideas from algorithmic information theory; Wrapping up. Preliminaries: A ≤T B if there is an algorithm using B as an oracle that will compute the characteristic function of A; A ≤wtt B if there’s an ..."
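The reducibility A ≤T B defined in the snippet above can be made concrete with a toy example — the sets A and B below are illustrative assumptions, not from the slides:

```python
def chi_A(n: int, oracle_B) -> int:
    """Characteristic function of A = { n : 2n in B }, computed with
    a single oracle query to B. This witnesses A <=_T B (indeed the
    stronger A <=_wtt B, since the query 2n is computably bounded)."""
    return 1 if oracle_B(2 * n) else 0

# Illustrative oracle: B = multiples of 6, so A works out to the
# multiples of 3.
B = lambda m: m % 6 == 0
membership = [chi_A(n, B) for n in range(7)]  # [1, 0, 0, 1, 0, 0, 1]
```

The same pattern with an unbounded or non-computable query bound gives a Turing reduction that is not wtt — the distinction the slides go on to use.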
Quantum algorithmic information theory, 2008
Abstract: "... The agenda of quantum algorithmic information theory, ordered ‘top-down’, is the quantum halting amplitude, followed by the quantum algorithmic information content, which in turn requires the theory of quantum computation. The fundamental atoms processed by quantum computation are the quantum bits ..."
A Glimpse into Algorithmic Information Theory, Logic, Language and Computation, Volume 3, CSLI Series, 1999
Abstract: "... This paper is a subjective, short overview of algorithmic information theory. We critically discuss various equivalent algorithmical models of randomness motivating a "randomness hypothesis". Finally some recent results on computably enumerable random reals are reviewed. ..."
Cited by 6 (6 self)
How to Run Algorithmic Information Theory, 1995
Abstract: "... Hi everybody! It’s a great pleasure for me to be back here at the new, improved Santa Fe Institute in this spectacular location. I guess this is my fourth visit and it’s always very stimulating, so I’m always very happy to visit you guys. I’d like to tell you what I’ve been up to lately. First of all, let me say what algorithmic information theory is good for, before telling you about the new version of it I’ve got. ..."