Results 1–10 of 10
Independent Component Filters Of Natural Images Compared With Simple Cells In Primary Visual Cortex
, 1998
Abstract

Cited by 274 (0 self)
In this article we investigate to what extent the statistical properties of natural images can be used to understand the variation of receptive field properties of simple cells in the mammalian primary visual cortex. The receptive fields of simple cells have been studied extensively (e.g., Hubel & Wiesel 1968, DeValois et al. 1982a, DeAngelis et al. 1993): they are localised in space and time, have bandpass characteristics in the spatial and temporal frequency domains, are oriented, and are often sensitive to the direction of motion of a stimulus. Here we will concentrate on the spatial properties of simple cells. Several hypotheses as to the function of these cells have been proposed. As the cells preferentially respond to oriented edges or lines, they can be viewed as edge or line detectors. Their joint localisation in both the spatial domain and the spatial frequency domain has led to the suggestion that they mimic Gabor filters, minimising uncertainty in both domains (Daugman 1980, Marcelja 1980). More recently, the match between the operations performed by simple cells and the wavelet transform has attracted attention (e.g., Field 1993). The approaches based on Gabor filters and wavelets basically consider processing by the visual cortex as a general image processing strategy, relatively independent of detailed assumptions about image statistics. On the other hand, the edge and line detector hypothesis is based on the intuitive notion that edges and lines are both abundant and important in images. This theme of relating simple cell properties with the statistics of natural images was explored extensively by Field (1987, 1994). He proposed that the cells are optimized specifically for coding natural images. He argued that one possibility for such a code, sparse coding...
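The independent-component approach this abstract describes can be illustrated with a toy simulation. The sketch below is an illustration of the technique, not the paper's pipeline: it assumes scikit-learn's FastICA and uses synthetic sparse (Laplacian) sources as a stand-in for natural image patches, since sparseness is the statistical property the paper attributes to natural images.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Synthetic stand-in for natural-image patch data: sparse (Laplacian)
# sources mixed linearly.  (Hypothetical data, not real images.)
n_samples, n_sources = 20000, 4
S = rng.laplace(size=(n_samples, n_sources))     # sparse sources
A = rng.standard_normal((n_sources, n_sources))  # mixing matrix
X = S @ A.T                                      # observed "patches"

ica = FastICA(n_components=n_sources, whiten="unit-variance",
              random_state=0, max_iter=1000)
S_hat = ica.fit_transform(X)   # estimated sources, up to permutation and sign
filters = ica.components_      # rows play the role of receptive-field filters
print(filters.shape)
```

Run on actual image patches rather than this synthetic mixture, the rows of `filters` are the localised, oriented, bandpass filters the paper compares with simple-cell receptive fields.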
Non Linear Neurons in the Low Noise Limit: A Factorial Code Maximizes Information Transfer
, 1994
Abstract

Cited by 140 (18 self)
We investigate the consequences of maximizing information transfer in a simple neural network (one input layer, one output layer), focussing on the case of non linear transfer functions. We assume that both receptive fields (synaptic efficacies) and transfer functions can be adapted to the environment. The main result is that, for bounded and invertible transfer functions, in the case of a vanishing additive output noise, and no input noise, maximization of information (Linsker's infomax principle) leads to a factorial code, hence to the same solution as required by the redundancy reduction principle of Barlow. We show also that this result is valid for linear, more generally unbounded, transfer functions, provided optimization is performed under an additive constraint, that is, one which can be written as a sum of terms, each one being specific to one output neuron. Finally we study the effect of a non zero input noise. We find that, at first order in the input noise, assumed to be small ...
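The single-neuron case of this result is easy to check numerically: with no noise and a bounded invertible transfer function, infomax selects the transfer function equal to the input's cumulative distribution, so the output is uniform on [0, 1] (maximum entropy on a bounded range) and, for independent inputs, the resulting code is factorial. A minimal sketch, using a rank transform as the empirical CDF (illustrative input distribution and sample size):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.laplace(size=100_000)          # non-Gaussian input distribution

# Infomax transfer function = empirical CDF of the input: each sample is
# mapped to its (normalized) rank, so the output is uniform on [0, 1].
ranks = np.argsort(np.argsort(x))      # rank of each sample
y = (ranks + 0.5) / x.size             # empirical-CDF transfer function

print(round(float(y.mean()), 2), round(float(y.var()), 3))
```

A uniform output has mean 0.5 and variance 1/12, which is what the printed statistics approach; any other monotone transfer function would give a lower output entropy at the same bound.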
Unsupervised Neural Network Learning Procedures . . .
, 1996
Abstract

Cited by 24 (1 self)
In this article, we review unsupervised neural network learning procedures which can be applied to the task of preprocessing raw data to extract useful features for subsequent classification. The learning algorithms reviewed here are grouped into three sections: information-preserving methods, density estimation methods, and feature extraction methods. Each of these major sections concludes with a discussion of successful applications of the methods to real-world problems.
Processing of natural time series of intensities by the visual system of the blowfly
 Vision Research
, 1997
Abstract

Cited by 22 (5 self)
A major problem a visual system faces is how to fit the large intensity variation of natural image streams into the limited dynamic range of its neurons. One of the means to accomplish this is through the use of gain control. In order to investigate this, natural time series of intensities were measured, as well as the responses of blowfly photoreceptors and Large Monopolar Cells (LMCs) to these time series. Time series representative of what each photoreceptor of a real visual system would normally receive were measured with an optical system measuring the light intensity of a spot comparable to the field of view of single human foveal cones. This system was worn on a headband by a freely walking person. Resulting time series have rms contrasts ranging from an average of 0.45 for 1 second segments to 1.39 for 100 second segments (both when limited to frequencies up to 100 Hz). Power spectra behave approximately as 1/f (f: temporal frequency). Measured time series were subsequently presented to fly photoreceptors and LMCs by playing them back on an LED. The results show that fast gain controls indeed keep the response within the dynamic range of the cells and that a large part of this range is actually used for packing the information in natural time series.
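The growth of rms contrast with segment length reported above follows from the approximately 1/f power spectrum: longer segments admit more low-frequency power. A small synthetic check, assuming log-normal intensities driven by a 1/f Gaussian signal (a stand-in for the measured series, not the authors' data; sample rate, duration, and contrast scale are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
fs, T = 200, 1000                       # sample rate (Hz), duration (s)
n = fs * T

# Gaussian signal with ~1/f power spectrum (amplitude ~ 1/sqrt(f)).
f = np.fft.rfftfreq(n, d=1 / fs)
spec = rng.standard_normal(f.size) + 1j * rng.standard_normal(f.size)
spec[0] = 0.0
spec[1:] /= np.sqrt(f[1:])
s = np.fft.irfft(spec, n)
s *= 0.5 / s.std()
intensity = np.exp(s)                   # positive "light intensity" trace

def mean_rms_contrast(x, seg_seconds):
    """Average std/mean over non-overlapping segments of the given length."""
    seg = int(seg_seconds * fs)
    m = x[: (x.size // seg) * seg].reshape(-1, seg)
    return float((m.std(axis=1) / m.mean(axis=1)).mean())

c_short = mean_rms_contrast(intensity, 1)
c_long = mean_rms_contrast(intensity, 100)
print(c_short < c_long)                 # longer segments: larger contrast
```

With a 1/f spectrum the variance in a band grows like the log of its width, so 100 s windows (which admit frequencies down to 0.01 Hz) see systematically larger contrast than 1 s windows, mirroring the 0.45 vs. 1.39 trend in the measurements.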
Maximization of Mutual Information in a Linear Noisy Network: a Detailed Study
Abstract

Cited by 10 (5 self)
We consider a linear, one-layer feedforward neural network performing a coding task. The goal of the network is to provide a statistical neural representation that conveys as much information as possible about the input stimuli in noisy conditions. We determine the family of synaptic couplings that maximizes the mutual information between input and output distribution. Optimization is performed under different constraints on the synaptic efficacies. We analyze the dependence of the solutions on input and output noises. This work goes beyond previous studies of the same problem in that: (i) we perform a detailed stability analysis in order to find the global maxima of the mutual information; (ii) we examine the properties of the optimal synaptic configurations under different constraints; (iii) we do not assume translational invariance of the input data, as is usually done when inputs are assumed to be visual stimuli.
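For this linear-Gaussian setting the quantity being maximized has a closed form, which makes the constrained optimization easy to reproduce. A sketch with illustrative dimensions, covariances, and noise level (not the paper's): it compares, at equal total synaptic power, a coupling aligned with the high-variance input directions against a random one.

```python
import numpy as np

rng = np.random.default_rng(3)
N, p, sigma2 = 6, 2, 0.1
C = np.diag([5.0, 3.0, 1.0, 0.5, 0.2, 0.1])   # input covariance (illustrative)

def mutual_info(W):
    """I(x; Wx + n) in nats, for Gaussian x ~ N(0, C) and n ~ N(0, sigma2*I)."""
    M = np.eye(p) + W @ C @ W.T / sigma2
    return 0.5 * np.linalg.slogdet(M)[1]

# Two couplings with the same total synaptic power tr(W W^T) = p:
W_aligned = np.zeros((p, N))
W_aligned[0, 0] = 1.0                  # row 1 reads the top-variance input
W_aligned[1, 1] = 1.0                  # row 2 reads the next one
W_random = rng.standard_normal((p, N))
W_random *= np.sqrt(p / np.trace(W_random @ W_random.T))

print(mutual_info(W_aligned) > mutual_info(W_random))
```

Under the power constraint, couplings concentrated on the high-variance, decorrelated input directions transmit more information than generic ones; the paper's stability analysis characterizes the full family of such optima.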
Natural Image Statistics and Visual Processing
, 1998
Abstract

Cited by 10 (0 self)
This thesis focuses on the statistics of natural images. The first question to be answered is: what are natural images, and why do we study them? We start with our definition, and then discuss the properties and uses of natural images. An image is a projection of an environment, and natural images are those that are taken from a natural environment, i.e., an environment that is commonly encountered by a particular organism. This means that these images represent the natural visual input (natural stimulus) of an eye. In general, images may include optical information extending over space, time (time-varying images), as well as wavelength (colour images). In this thesis, however, we restrict ourselves to images of light intensity (black and white images) that either extend exclusively over space (still images) or exclusively over time (time series).

The motivation for investigating natural images is to gain a better understanding of neural processing in visual systems. Natural images and visual processing in biological systems are linked by the hypothesis that evolution has optimised visual systems to process natural stimuli. The analysis of the optimal performance of biological visual systems may inspire the building of artificial visual systems.
Information Processing by a Noisy Binary Channel
Abstract

Cited by 2 (0 self)
We study the information processing properties of a binary channel receiving data from a Gaussian source. A systematic comparison with linear processing is done. A remarkable property of the binary system is that, as the ratio α between the number of output and input units increases, binary processing becomes equivalent to linear processing with a quantization output noise that depends on α. In this regime, which holds up to O(α⁻⁴), information processing occurs as if populations of α binary units cooperate to represent one α-bit output unit. Unsupervised learning of a noisy environment by optimization of the parameters of the binary channel is also considered.
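The equivalence between a large binary population and a noisy linear unit can be seen in a toy simulation. The sketch below (illustrative noise model and sample sizes, not the paper's setup) sums α binary units, each thresholding the same Gaussian input plus private noise, and shows the population readout tracking the input ever more closely as α grows:

```python
import numpy as np

rng = np.random.default_rng(4)
x = rng.standard_normal(20000)          # Gaussian source (single input)

def population_readout(alpha, noise_std=1.0):
    """Average of alpha binary units thresholding the input plus private noise."""
    thresh_noise = noise_std * rng.standard_normal((alpha, x.size))
    spikes = np.sign(x[None, :] + thresh_noise)
    return spikes.mean(axis=0)          # population average, in [-1, 1]

# As alpha grows, the binary population behaves like a linear unit whose
# quantization noise shrinks, so its correlation with the source increases.
rs = [np.corrcoef(x, population_readout(a))[0, 1] for a in (1, 16, 256)]
print([round(r, 2) for r in rs])
```

The averaging over private threshold noise is what converts many one-bit units into an effectively graded, linear-like channel, which is the regime the abstract describes.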
Information Transmission By Networks Of Non Linear Neurons
Abstract

Cited by 1 (0 self)
In this paper we considered the problem of maximizing information transfer with a network of neurons made of N inputs and p outputs, focussing on the case of non linear transfer functions and arbitrary input distributions. We assumed that both the transfer functions and the synaptic efficacies could be adapted to the environment. The main consequence of our analysis is that, in the limit of small additive output noise (and an even smaller input noise), the infomax principle of Linsker implies the redundancy reduction principle of Barlow.
Auxiliary Variational Information Maximization for Dimensionality Reduction
 In PASCAL: Subspace, Latent Structure and Feature Selection Techniques: Statistical and Optimisation Perspectives Workshop
, 2005
Abstract

Cited by 1 (1 self)
Mutual Information (MI) is a long-studied measure of information content, and many attempts to apply it to feature extraction and stochastic coding have been made. However, MI is in general computationally intractable, and most previous studies redefine the criterion in terms of approximations. Recently we described properties of a simple lower bound on MI [2], and discussed its links to some of the popular dimensionality reduction techniques. Here we introduce a richer family of auxiliary variational bounds on MI, which generalize our previous approximations. Our specific focus is on applying the bound to extracting informative lower-dimensional orthonormal projections in the presence of irreducible Gaussian noise. We show that our method produces significantly tighter bounds than the as-if Gaussian approximations [7] of MI. We also show that learning projections to multinomial auxiliary spaces may facilitate reconstructions of the sources from noisy lower-dimensional representations.
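The kind of variational lower bound discussed here, I(x; y) ≥ H(x) + E[log q(x|y)] for any decoder q, can be checked in a linear-Gaussian model where the exact MI has a closed form. A sketch with illustrative dimensions and noise level (the scaled-decoder formula below is a closed form for a Gaussian decoder in this model, not a result quoted from the paper):

```python
import numpy as np

rng = np.random.default_rng(5)
d, k, s2 = 5, 2, 0.5                    # source dim, projection dim, noise var
W = rng.standard_normal((k, d))         # projection: y = W x + n

# Exact I(x; y) in nats for x ~ N(0, I), n ~ N(0, s2 * I).
Sy = W @ W.T + s2 * np.eye(k)
exact = 0.5 * np.linalg.slogdet(Sy)[1] - 0.5 * k * np.log(s2)

# Lower bound I >= H(x) + E[log q(x | y)] with a Gaussian decoder q.
# Using the true posterior as the decoder makes the bound tight:
Sx_post = np.eye(d) - W.T @ np.linalg.inv(Sy) @ W   # Cov(x | y)
tight = -0.5 * np.linalg.slogdet(Sx_post)[1]

# Any suboptimal decoder loosens it; doubling the decoder covariance lowers
# the bound by 0.5 * d * (log 2 - 1/2) nats (closed form for this model).
loose = tight - 0.5 * d * (np.log(2.0) - 0.5)

print(abs(tight - exact) < 1e-9, loose < exact)
```

The gap between `loose` and `exact` is exactly the KL divergence between the true posterior and the mismatched decoder, which is why richer (e.g. auxiliary-variable) decoder families can only tighten the bound.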