Results 1–10 of 13
Minimum Message Length Clustering of Spatially-Correlated Data with Varying Inter-Class Penalties
6th IEEE International Conference on Computer and Information Science (ICIS 2007), 2007
Abstract

Cited by 5 (3 self)
We present here some applications of the Minimum Message Length (MML) principle to spatially correlated data. Discrete-valued Markov Random Fields are used to model spatial correlation. The models for spatial correlation used here are a generalisation of the model used in (Wallace 1998) [14] for unsupervised classification of spatially correlated data (such as image segmentation). We discuss how our work can be applied to that type of unsupervised classification. We make the following three new contributions. First, the rectangular grid used in (Wallace 1998) [14] is generalised to an arbitrary graph of arbitrary edge distances. Second, we refine (Wallace 1998) [14] slightly by including a discarded message-length term that is important for small data sets and for a simpler problem presented here. Finally, we show how the MML principle can be used to test for the presence of spatial correlation, and how it can be used to choose between models of varying complexity to infer details of the nature of the spatial correlation.
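The MML test for spatial correlation described above can be sketched in miniature. The following is a minimal illustration, not the paper's method: it uses a chain graph (the simplest special case of the arbitrary graphs the paper covers), a flat per-parameter bit price (`param_bits`, a hypothetical simplification of real MML parameter coding), and a single agreement-rate parameter in place of a full Markov Random Field. The idea it demonstrates is the two-part message comparison: the correlated model is accepted only if the correlation shortens the total message enough to pay for stating the extra parameter.

```python
import math
from collections import Counter

def independent_length(labels, k, param_bits=8.0):
    """Two-part message for an i.i.d. multinomial model: a flat cost for
    the k-1 free class proportions (a simplification of real MML coding)
    plus -log2 of the data probability under those proportions."""
    n = len(labels)
    data_bits = -sum(c * math.log2(c / n) for c in Counter(labels).values())
    return (k - 1) * param_bits + data_bits

def correlated_length(labels, k, param_bits=8.0):
    """Chain-graph model: the first label is coded uniformly, each later
    label is coded given its predecessor via an estimated agreement rate p;
    disagreements are spread uniformly over the other k-1 classes."""
    n = len(labels)
    agree = sum(labels[i] == labels[i - 1] for i in range(1, n))
    p = min(max(agree / (n - 1), 1e-9), 1 - 1e-9)
    data_bits = math.log2(k) - agree * math.log2(p)
    data_bits -= (n - 1 - agree) * math.log2((1 - p) / (k - 1))
    return param_bits + data_bits  # one correlation parameter to state

def spatially_correlated(labels, k):
    """MML test: accept spatial correlation only if it yields the
    shorter total message, i.e. the structure pays for its own bits."""
    return correlated_length(labels, k) < independent_length(labels, k)
```

On a run of twenty 0s followed by twenty 1s the correlated model compresses far better and wins; on a profile whose neighbours agree only about half the time, the extra parameter does not pay for itself and the independent model wins.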
MML, Hybrid Bayesian Network Graphical Models, Statistical Consistency, Invariance and Uniqueness
Abstract

Cited by 4 (3 self)
The problem of statistical — or inductive — inference pervades a large number of human activities and a large number of (human and non-human) actions requiring ‘intelligence’. Human and other ‘intelligent’ activity often entails making inductive inferences, remembering and recording observations from which one can make …
Decision forests with oblique decision trees
, 2006
Abstract

Cited by 3 (2 self)
Ensemble learning schemes have shown impressive increases in prediction accuracy over single-model schemes. We introduce a new decision forest learning scheme whose base learners are Minimum Message Length (MML) oblique decision trees. Unlike other tree-inference algorithms, MML oblique decision tree learning does not overgrow the inferred trees. The resultant trees thus tend to be shallow and do not require pruning. MML decision trees are known to be resistant to overfitting and excellent at probabilistic prediction. A novel weighted-averaging scheme is also proposed which takes advantage of the high probabilistic prediction accuracy produced by MML oblique decision trees. The experimental results show that the new weighted averaging offers solid improvement over other averaging schemes, such as majority vote. Our MML decision forest scheme also returns favourable results compared to other ensemble learning algorithms on data sets with binary classes.
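The contrast between majority voting and probability-weighted averaging can be shown with a small sketch. This is an illustration of the general idea only, not the paper's specific weighting scheme; the function names and the uniform default weights are assumptions.

```python
from collections import Counter

def majority_vote(hard_votes):
    """Baseline combiner: one hard vote per tree; the most common class
    wins, with ties broken toward the lower class index."""
    counts = Counter(hard_votes)
    return min(counts, key=lambda c: (-counts[c], c))

def weighted_average(prob_vectors, weights=None):
    """Average the trees' class-probability vectors (optionally weighted)
    and predict the argmax class. A single confident probabilistic tree
    can overturn a narrow majority of hard votes."""
    if weights is None:
        weights = [1.0] * len(prob_vectors)
    total = sum(weights)
    k = len(prob_vectors[0])
    avg = [sum(w * p[c] for w, p in zip(weights, prob_vectors)) / total
           for c in range(k)]
    return max(range(k), key=lambda c: avg[c])
```

For three trees predicting class probabilities [0.51, 0.49], [0.52, 0.48] and [0.05, 0.95], the hard votes give a 2–1 majority for class 0, while the averaged probabilities (0.36 vs. 0.64) predict class 1: the two near-ties are outweighed by the one confident prediction, which is the advantage the abstract attributes to averaging good probabilistic predictors.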
Measuring Cognitive Abilities of Machines, Humans and Non-Human Animals in a Unified Way: Towards Universal Psychometrics
, 2012
Abstract

Cited by 2 (2 self)
We present and develop the notion of ‘universal psychometrics’ as a subject of study, and eventually a discipline, that focusses on the measurement of cognitive abilities for the machine kingdom, which comprises any kind of individual or collective, either artificial, biological or hybrid. Universal psychometrics can be built, of course, upon the experience, techniques and methodologies from (human) psychometrics, comparative cognition and related areas. Conversely, the perspective and techniques which are being developed in the area of machine intelligence measurement using (algorithmic) information theory can be of much broader applicability and implication outside artificial intelligence. This general approach to universal psychometrics spurs the re-understanding of most (if not all) of the big issues about the measurement of cognitive abilities, and creates a new foundation for (re)defining and mathematically formalising the concepts of cognitive task, evaluable subject, interface, task choice, difficulty, agent response curves, etc. We introduce the notion of a universal cognitive test and discuss whether (and when) it may be necessary for exploring the machine kingdom. On the issue of intelligence and very general abilities, we also obtain some results and connections with the related notions of no-free-lunch theorems and universal priors.
Tomographic Reconstruction of Images from Noisy Projections: A Preliminary Study
Abstract

Cited by 1 (1 self)
Although Computed Tomography (CT) is a mature discipline, the development of techniques that further reduce radiation dose is still essential. This paper takes steps towards projection and reconstruction methods which aim to assist in reducing this dosage, by studying the way noise propagates from projection space to image space. The inference methods Maximum Likelihood Estimation (MLE), Akaike’s Information Criterion (AIC) and Minimum Message Length (MML) are used to obtain accurate models from minimal data.
Supervised Learning of a Generative Model for Edge-Weighted Graphs
Abstract
This paper addresses the problem of learning archetypal structural models from examples. To this end we define a generative model for graphs where the distribution of observed nodes and edges is governed by a set of independent Bernoulli trials with parameters to be estimated from data, in a situation where the correspondences between the nodes in the data graphs and the nodes in the model are not known ab initio and must be estimated from local structure. This results in an EM-like approach in which we alternate the estimation of the node correspondences with the estimation of the model parameters. Parameter estimation and model-order selection are addressed within a Minimum Message Length (MML) framework.
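The EM-like alternation described above can be sketched for tiny graphs. This is a toy illustration under stated assumptions, not the paper's algorithm: graphs are sets of undirected edges over nodes 0..m-1, the E-step brute-forces all node permutations (feasible only for tiny m, whereas the paper estimates correspondences from local structure), and the MML model-order selection is omitted entirely.

```python
import math
from itertools import permutations

def edge_probabilities(graphs, alignments, m):
    """M-step: the Bernoulli parameter of each model edge (i, j) is the
    fraction of data graphs containing the corresponding edge under the
    current alignments (alignments[g][i] = data node playing model node i)."""
    prob = {}
    for i in range(m):
        for j in range(i + 1, m):
            hits = sum(frozenset((a[i], a[j])) in g
                       for g, a in zip(graphs, alignments))
            prob[(i, j)] = hits / len(graphs)
    return prob

def best_alignment(graph, prob, m):
    """E-step (brute force, tiny m only): choose the permutation of data
    nodes maximising the Bernoulli log-likelihood of the observed edges."""
    def loglik(a):
        s = 0.0
        for (i, j), q in prob.items():
            q = min(max(q, 1e-6), 1 - 1e-6)  # avoid log(0)
            present = frozenset((a[i], a[j])) in graph
            s += math.log(q if present else 1 - q)
        return s
    return max(permutations(range(m)), key=loglik)

def em_fit(graphs, m, iters=5):
    """Alternate alignment (E) and parameter estimation (M)."""
    alignments = [tuple(range(m))] * len(graphs)
    for _ in range(iters):
        prob = edge_probabilities(graphs, alignments, m)
        alignments = [best_alignment(g, prob, m) for g in graphs]
    return edge_probabilities(graphs, alignments, m)
```

Given three identical single-edge graphs, the procedure converges to a model in which that edge has Bernoulli probability 1 and all other edges probability 0, which is the archetypal structure in this trivial case.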
Enhancing MML Clustering Using Context Data with Climate Applications
Abstract
In Minimum Message Length (MML) clustering (unsupervised classification, mixture modelling) the aim is to infer a set of classes that best explains the observed data items. There are cases where parts of the observed data do not need to be explained by the inferred classes but can be used to improve the inference and the resulting predictions. Our main contribution is to provide a simple and flexible way of using such context data in MML clustering. This is done by replacing the traditional mixing-proportion vector with a new context matrix. We show how our method can be used to give evidence regarding the presence of apparent long-term trends in climate-related atmospheric pressure records. Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) solutions for our model have also been implemented to compare with the MML solution.
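The context-matrix idea can be illustrated with ordinary EM on a 1-D Gaussian mixture. This is a sketch of the modelling idea only, assuming a discrete context variable and standard maximum-likelihood EM rather than the paper's MML message-length scoring: each data item carries a context label, and the usual global mixing vector is replaced by a row per context, `pi[c][j] = P(class j | context c)`, so the context helps explain which class generated each item without itself being modelled by the classes.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def em_context_mixture(data, contexts, k, n_contexts, iters=50):
    """EM for a 1-D Gaussian mixture whose mixing proportions form a
    context matrix pi[c][j] instead of a single vector. Class parameters
    are shared across contexts; only the mixing differs."""
    n = len(data)
    lo, hi = min(data), max(data)
    mus = [lo + (j + 0.5) * (hi - lo) / k for j in range(k)]
    sigmas = [max((hi - lo) / k, 1e-3)] * k
    pi = [[1.0 / k] * k for _ in range(n_contexts)]
    for _ in range(iters):
        # E-step: responsibilities use the mixing row of each item's context
        resp = []
        for x, c in zip(data, contexts):
            w = [pi[c][j] * normal_pdf(x, mus[j], sigmas[j]) + 1e-300
                 for j in range(k)]
            s = sum(w)
            resp.append([wi / s for wi in w])
        # M-step: shared class parameters, then one mixing row per context
        for j in range(k):
            rj = sum(r[j] for r in resp)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / rj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / rj
            sigmas[j] = max(math.sqrt(var), 1e-3)
        for c in range(n_contexts):
            idx = [i for i in range(n) if contexts[i] == c]
            for j in range(k):
                pi[c][j] = sum(resp[i][j] for i in idx) / len(idx)
    return mus, sigmas, pi
```

With two well-separated clusters whose membership tracks the context (e.g. early vs. late observations in a pressure record), the learned matrix rows diverge, which is the kind of evidence for a context-dependent trend the abstract describes.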
Simplicity Versus Likelihood Principles in Perception
Abstract
Discussions of the foundations of perceptual inference have often centered on two governing principles, the likelihood principle and the simplicity principle. Historically, these principles have usually been seen as opposed, but contemporary statistical (e.g., Bayesian) theory tends to see them as consistent, because for a variety of reasons simpler models (i.e., those with fewer dimensions or free parameters) make better predictors than more complex ones. In perception, many interpretation spaces are naturally hierarchical, meaning that they consist of a set of mutually embedded model classes of various levels of complexity, including simpler (lower-dimensional) classes that are special cases of more complex ones. This article shows how such spaces can be regarded as algebraic structures, for example as partial orders or lattices, with interpretations ordered in terms of dimensionality. The natural inference rule in such a space is a kind of simplicity rule: among all interpretations qualitatively consistent with the image, draw the one that is lowest in the partial order, called the maximum-depth interpretation. This interpretation also maximizes the Bayesian posterior under certain simplifying assumptions, consistent with a unification of the simplicity and likelihood principles. Moreover, the algebraic approach brings out the compositional structure inherent in such spaces, showing how perceptual interpretations are composed from a lexicon of primitive perceptual descriptors.
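The maximum-depth rule lends itself to a small sketch. The example below is an illustration under stated assumptions, not the article's formalism: interpretations are represented as hypothetical named classes with a frozenset of free parameters, the partial order is subset inclusion on those sets, and ties among incomparable minimal elements are broken by raw dimensionality.

```python
def maximum_depth_interpretation(candidates, fits_image):
    """Sketch of the maximum-depth rule: among candidate interpretations
    qualitatively consistent with the image (per fits_image), keep those
    minimal in the subset partial order on free-parameter sets, then
    return a lowest-dimensional one. Under the article's simplifying
    assumptions this choice also maximises the Bayesian posterior."""
    ok = [c for c in candidates if fits_image(c)]
    # minimal elements: no other consistent class is a strict special case
    minimal = [c for c in ok
               if not any(o["params"] < c["params"] for o in ok)]
    return min(minimal, key=lambda c: len(c["params"]))
```

For instance, with mutually embedded shape classes square ⊂ rectangle ⊂ quadrilateral (free parameters {size} ⁄ {width, height} ⁄ {width, height, skew}), an image consistent with all three draws the square interpretation; ruling the square out by the image evidence moves the choice up to the rectangle, never directly to the most general class.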