Results 1–10 of 1,907,595
A Simple Proof of the Restricted Isometry Property for Random Matrices
Constr. Approx., 2008
"... We give a simple technique for verifying the Restricted Isometry Property (as introduced by Candès and Tao) for random matrices that underlies Compressed Sensing. Our approach has two main ingredients: (i) concentration inequalities for random inner products that have recently provided algorithmical ..."
Cited by 620 (63 self)
… we obtain simple and direct proofs of Kashin’s theorems on widths of finite balls in Euclidean space (and their improvements due to Gluskin) and proofs of the existence of optimal Compressed Sensing measurement matrices. In the process, we also prove that these measurements have a certain …
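The concentration phenomenon behind this proof technique is easy to see numerically. A minimal Monte-Carlo sketch (the dimensions, sparsity, and trial count are invented for illustration, not taken from the paper): for a Gaussian matrix A with N(0, 1/m) entries, ‖Ax‖² concentrates around ‖x‖² for any fixed sparse x.

```python
import numpy as np

# Illustrative sketch of the concentration behind the RIP: for a random
# Gaussian A with entries N(0, 1/m), the squared norm of A x stays close
# to the squared norm of a fixed sparse x.  All sizes below are arbitrary.
rng = np.random.default_rng(0)
m, n, k, trials = 200, 1000, 5, 200

ratios = []
for _ in range(trials):
    A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
    x = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)  # random k-sparse support
    x[support] = rng.normal(size=k)
    ratios.append(np.linalg.norm(A @ x) ** 2 / np.linalg.norm(x) ** 2)

ratios = np.array(ratios)
print(ratios.mean(), ratios.std())  # mean near 1, small spread
```

The spread shrinks like 1/√m, which is the concentration the proof exploits.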
Very simple classification rules perform well on most commonly used datasets
Machine Learning, 1993
"... The classification rules induced by machine learning systems are judged by two criteria: their classification accuracy on an independent test set (henceforth "accuracy"), and their complexity. The relationship between these two criteria is, of course, of keen interest to the machin ..."
Cited by 539 (5 self)
… to the machine learning community. There are in the literature some indications that very simple rules may achieve surprisingly high accuracy on many datasets. For example, Rendell occasionally remarks that many real-world datasets have "few peaks (often just one)" and so are …
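A one-rule (1R) classifier of the kind the paper evaluates can be sketched in a few lines: for each attribute, map each attribute value to its majority class, then keep the single attribute whose rule makes the fewest training errors. The toy dataset below is invented for illustration.

```python
from collections import Counter

def one_r(rows, labels):
    """1R: pick the single attribute whose value->majority-class rule
    makes the fewest errors on the training data."""
    best = None  # (errors, attribute index, value -> class mapping)
    for a in range(len(rows[0])):
        counts = {}
        for row, y in zip(rows, labels):
            counts.setdefault(row[a], Counter())[y] += 1
        rule = {v: c.most_common(1)[0][0] for v, c in counts.items()}
        errors = sum(rule[row[a]] != y for row, y in zip(rows, labels))
        if best is None or errors < best[0]:
            best = (errors, a, rule)
    return best

rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "cool")]
labels = ["no", "no", "yes", "yes"]
errors, attr, rule = one_r(rows, labels)
print(attr, rule, errors)
```

On this toy data the first attribute alone classifies perfectly, which is exactly the "few peaks" situation the paper describes.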
A simple distributed autonomous power control algorithm and its convergence
IEEE Transactions on Vehicular Technology, 1993
"... For wireless cellular communication systems, one seeks a simple effective means of power control of signals associated with randomly dispersed users that are reusing a single channel in different cells. By effecting the lowest interference environment, in meeting a required minimum signal-to-interf ..."
Cited by 467 (3 self)
… power level of each of the signals, using only local measurements, so that eventually all users meet the SIR requirement. The local per-channel power measurements include that of the intended signal as well as the undesired interference from other users (plus receiver noise). For a certain simple …
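A distributed power-control iteration of the kind described, where each user rescales its power by the ratio of the target SIR to its locally measured SIR, can be sketched as follows. The link gains, noise level, and SIR target below are invented for illustration.

```python
import numpy as np

# Sketch of a distributed power-control iteration: each user scales its
# own power by (target SIR / measured SIR), using only its local
# measurement.  Gains, noise, and the target are illustrative values.
G = np.array([[1.0, 0.1, 0.1],
              [0.2, 1.0, 0.1],
              [0.1, 0.2, 1.0]])   # G[i, j]: gain from user j to receiver i
noise = 0.01
target = 2.0                      # required minimum SIR
p = np.ones(3)                    # initial power levels

def sir(p):
    """Per-user signal-to-interference ratio (plus receiver noise)."""
    signal = np.diag(G) * p
    interference = G @ p - signal + noise
    return signal / interference

for _ in range(100):
    p = p * target / sir(p)       # purely local update for each user

print(sir(p))  # every user's SIR converges to the target
```

When a power assignment meeting the target exists at all, this iteration converges to the minimal such assignment, which is the convergence result the paper establishes.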
Some simple effective approximations to the 2-Poisson model for probabilistic weighted retrieval
In Proceedings of SIGIR’94, 1994
"... The 2-Poisson model for term frequencies is used to suggest ways of incorporating certain variables in probabilistic models for information retrieval. The variables concerned are within-document term frequency, document length, and within-query term frequency. Simple weighting functions are develope ..."
Cited by 452 (14 self)
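The weighting functions this line of work developed are the ancestors of BM25. A minimal sketch of a saturating within-document term-frequency weight with document-length normalization (the parameter values are conventional defaults, not taken from the paper):

```python
import math

def tf_weight(tf, dl, avdl, k1=1.2, b=0.75):
    """Saturating term-frequency weight with document-length
    normalization (BM25-style; k1 and b are conventional defaults)."""
    return tf * (k1 + 1) / (tf + k1 * (1 - b + b * dl / avdl))

def idf(N, df):
    """Inverse document frequency component for a term occurring in
    df of N documents."""
    return math.log((N - df + 0.5) / (df + 0.5))

# The weight grows with term frequency but saturates: repeated
# occurrences of a term add less and less evidence.
print(tf_weight(1, 100, 100), tf_weight(10, 100, 100))
```

The saturation is the key approximation to the 2-Poisson model: a full eliteness mixture is replaced by a cheap monotone, bounded function of tf.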
Learnability and the Vapnik-Chervonenkis dimension
1989
"... Valiant’s learnability model is extended to learning classes of concepts defined by regions in Euclidean space E^n. The methods in this paper lead to a unified treatment of some of Valiant’s results, along with previous results on distribution-free convergence of certain pattern recognition algorith ..."
Cited by 716 (22 self)
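The Vapnik-Chervonenkis dimension at the heart of these results can be made concrete with a brute-force shattering check (an illustrative sketch, not from the paper): the class of 1-D threshold concepts {x ≥ t} shatters any single point but no pair of points, so its VC dimension is 1.

```python
# Brute-force shattering check for 1-D threshold concepts {x >= t}.
# A point set is shattered when the class realizes every labeling of it.
def shatters(points, thresholds):
    """Can threshold concepts realize every +/- labeling of `points`?"""
    realizable = {tuple(x >= t for x in points) for t in thresholds}
    return len(realizable) == 2 ** len(points)

# Cutoffs below, between, and above the sample points are enough to
# exhibit every behavior thresholds can produce on them.
thresholds = [-10, 1.5, 10]
print(shatters([1.0], thresholds))        # a single point is shattered
print(shatters([1.0, 2.0], thresholds))   # (+ on 1.0, - on 2.0) impossible
```

The impossible labeling is the monotonicity of thresholds: if the smaller point is classified positive, the larger one must be too.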
The Simple Economics of Basic Scientific Research
Journal of Political Economy, 1959
"... I begin this essay by reflecting on my early paper (Nelson, 1959), and Ken’s (Arrow, 1962), as period pieces. These papers certainly have been influential in shaping the discussion of science and technology policy over the last forty years, at least among economists, but at the time they were writte ..."
Cited by 423 (5 self)
Hierarchical Models of Object Recognition in Cortex
1999
"... The classical model of visual processing in cortex is a hierarchy of increasingly sophisticated representations, extending in a natural way the model of simple to complex cells of Hubel and Wiesel. Somewhat surprisingly, little quantitative modeling has been done in the last 15 years to explore th ..."
Cited by 809 (83 self)
Simple statistical gradient-following algorithms for connectionist reinforcement learning
Machine Learning, 1992
"... Abstract. This article presents a general class of associative reinforcement learning algorithms for connectionist networks containing stochastic units. These algorithms, called REINFORCE algorithms, are shown to make weight adjustments in a direction that lies along the gradient of expected reinfor ..."
Cited by 442 (0 self)
… reinforcement in both immediate-reinforcement tasks and certain limited forms of delayed-reinforcement tasks, and they do this without explicitly computing gradient estimates or even storing information from which such estimates could be computed. Specific examples of such algorithms are presented, some …
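The simplest member of the REINFORCE family, the update Δw = α·r·(a − p)·x for a single Bernoulli-logistic unit, can be sketched on a trivial bandit task (the task and constants are invented for illustration):

```python
import numpy as np

# Minimal REINFORCE sketch: a single stochastic Bernoulli-logistic unit
# learns to emit the rewarded action.  The update dw = alpha*r*(a - p)*x
# follows the gradient of expected reward without ever computing it.
rng = np.random.default_rng(0)
w, alpha, x = 0.0, 0.5, 1.0

for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-w * x))   # probability of emitting a = 1
    a = 1.0 if rng.random() < p else 0.0
    r = 1.0 if a == 1.0 else 0.0       # reward only for action 1
    w += alpha * r * (a - p) * x       # REINFORCE weight update

print(p)  # approaches 1: the unit learns the rewarded action
```

Note that the update uses only the sampled action, the unit's own firing probability, and the scalar reward, which is the "no explicit gradient estimate" property the abstract emphasizes.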
Factor Graphs and the Sum-Product Algorithm
IEEE Transactions on Information Theory, 1998
"... A factor graph is a bipartite graph that expresses how a "global" function of many variables factors into a product of "local" functions. Factor graphs subsume many other graphical models including Bayesian networks, Markov random fields, and Tanner graphs. Following one simple c ..."
Cited by 1761 (69 self)
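The sum-product algorithm can be illustrated on the smallest interesting factor graph, the global function g(x1, x2) = f1(x1)·f2(x1, x2) with binary variables (the factor tables below are invented for illustration). On a cycle-free graph like this, message passing recovers the exact marginal, which we verify against brute force:

```python
import numpy as np

# Sum-product message passing on g(x1, x2) = f1(x1) * f2(x1, x2),
# both variables binary.  Factor tables are illustrative values.
f1 = np.array([0.4, 0.6])                 # "local" function of x1
f2 = np.array([[0.9, 0.1],
               [0.2, 0.8]])               # "local" function of (x1, x2)

# Messages along the chain: f1 -> x1 -> f2 -> x2.
m_f1_x1 = f1                              # factor-to-variable message
m_x1_f2 = m_f1_x1                         # x1 has no other neighbors
m_f2_x2 = f2.T @ m_x1_f2                  # sum over x1 of f2 * incoming

marginal_x2 = m_f2_x2 / m_f2_x2.sum()

# Brute-force marginal of x2 from the "global" function.
g = f1[:, None] * f2
brute = g.sum(axis=0) / g.sum()
print(marginal_x2, brute)                 # identical on a cycle-free graph
```

The same local message rule, run on graphs with cycles, yields the iterative decoding algorithms the paper unifies.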