Results 1–10 of 168
Rough sets: some extensions
Information Sciences, 2007
Cited by 87 (6 self)
Abstract: In this article, we present some extensions of the rough set approach and outline a challenge for rough-set-based research.
The art of granular computing
Proceedings of the International Conference on Rough Sets and Emerging Intelligent Systems Paradigms, 2007
Cited by 74 (20 self)
Abstract: This paper has two purposes. One is to present a critical examination of the rise of granular computing, and the other is to suggest a triarchic theory of granular computing. By examining the reasons, justifications, and motivations for the rise of granular computing, we may be able to fully appreciate its scope, goals, and potential value. The results enable us to formulate a triarchic theory in the light of research results from many disciplines. The three components of the theory are labeled the philosophy, the methodology, and the computation. The integration of the three offers a unified view of granular computing as a way of structured thinking, a method of structured problem solving, and a paradigm of structured information processing, focusing on hierarchical granular structures. The triarchic theory is an important effort in synthesizing the various theories and models of granular computing. Keywords: triarchic theory of granular computing; systems theory; structured thinking, problem solving, and information processing.

Although granular computing, as a separate field of study, started a decade ago [1], its basic philosophy, ideas, principles, methodologies, theories, and tools have, in fact, long been used either explicitly or implicitly across many branches of the natural and social sciences. The answers, at least partial answers, to these questions may be obtained by drawing on and synthesizing results from well-established disciplines, including philosophy, psychology, neuroscience, cognitive science, education, artificial intelligence, computer programming, and many more. Previously, I argued that granular computing represents an idea converged from many branches of the natural and social sciences. Research on understanding the human brain and natural intelligence is closely related to the fields of artificial intelligence (AI) and information technology (IT). The results have led to a computational view for explaining how the mind works.
Príncipe, “The kernel least-mean-square algorithm”
IEEE Transactions on Signal Processing, 2008
Cited by 50 (9 self)
Abstract: The combination of the famed kernel trick and the least-mean-square (LMS) algorithm provides an interesting sample-by-sample update for an adaptive filter in reproducing kernel Hilbert spaces (RKHS), which is named in this paper the KLMS. Unlike the accepted view in kernel methods, this paper shows that in the finite-training-data case, the KLMS algorithm is well posed in RKHS without the addition of an extra regularization term to penalize solution norms, as was suggested by …
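The sample-by-sample update the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's reference implementation: the Gaussian kernel, its width, and the step size are hypothetical choices made here for the demo.

```python
import numpy as np

def gaussian_kernel(a, b, width=0.3):
    """Gaussian kernel between one point a and an array of points b."""
    return np.exp(-np.sum((b - a) ** 2, axis=1) / (2.0 * width ** 2))

def klms(X, y, step=0.5, width=0.3):
    """Kernel LMS sketch: one new center and coefficient per sample.

    The current estimate is f(x) = sum_j coeffs[j] * k(centers[j], x);
    each incoming sample is appended as a center weighted by step * error.
    """
    centers, coeffs, errors = [], [], []
    for x_n, y_n in zip(X, y):
        if centers:
            pred = float(np.dot(coeffs, gaussian_kernel(x_n, np.array(centers), width)))
        else:
            pred = 0.0
        err = y_n - pred          # a priori prediction error
        centers.append(x_n)       # grow the network by one unit
        coeffs.append(step * err)
        errors.append(err)
    return np.array(centers), np.array(coeffs), np.array(errors)

# Toy demo: learn a smooth nonlinear map online.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])
_, _, errors = klms(X, y)
```

Note that, consistent with the abstract's well-posedness claim, no explicit regularization term appears in the update; the prediction errors shrink as training proceeds.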
Efficient Kernel Machines Using the Improved Fast Gauss Transform
Advances in Neural Information Processing Systems 17, 2004
Cited by 49 (7 self)
Abstract: The computation required for kernel machines with N training samples is O(N²). Such computational complexity is significant even for moderate-size problems and is prohibitive for large datasets. We present an approximation technique based on the improved fast Gauss transform to reduce the computation to O(N). We also give an error bound for the approximation and provide experimental results on the UCI datasets.
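The O(N²) cost referred to above comes from evaluating the discrete Gauss transform directly. The sketch below computes that sum naively (the bandwidth h and the test sizes are illustrative); the improved fast Gauss transform approximates exactly this quantity in roughly linear time using truncated series expansions.

```python
import numpy as np

def direct_gauss_transform(sources, targets, weights, h=1.0):
    """Direct evaluation of G(y_j) = sum_i q_i * exp(-||y_j - x_i||^2 / h^2).

    Cost is O(M * N) for M targets and N sources; this is the kernel sum
    whose complexity the improved fast Gauss transform reduces.
    """
    # Pairwise squared distances: targets along rows, sources along columns.
    d2 = np.sum((targets[:, None, :] - sources[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / h ** 2) @ weights

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))   # source points x_i
Y = rng.normal(size=(50, 2))    # target points y_j
q = rng.normal(size=100)        # source weights q_i
G = direct_gauss_transform(X, Y, q)
```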
Shannon sampling and function reconstruction from point values
Bulletin of the American Mathematical Society, 2004
Optimal rates for the regularized least-squares algorithm
Foundations of Computational Mathematics
Cited by 37 (9 self)
Abstract: We develop a theoretical analysis of the generalization performance of regularized least-squares on reproducing kernel Hilbert spaces for supervised learning. We show that the concept of effective dimension of an integral operator plays a central role in defining a criterion for the choice of the regularization parameter as a function of the number of samples. In fact, a minimax analysis is performed which shows the asymptotic optimality of this criterion.
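As background for the criterion mentioned above, the effective dimension of the kernel's integral operator $T$ at regularization level $\lambda$ is commonly defined as follows (a standard formulation in this literature, stated here for orientation rather than as the paper's exact notation):

```latex
\mathcal{N}(\lambda) \;=\; \operatorname{Tr}\!\left[(T + \lambda I)^{-1}\, T\right],
\qquad \lambda > 0 .
```

Balancing $\mathcal{N}(\lambda)$ against the number of samples $n$ yields a choice $\lambda = \lambda(n)$ whose asymptotic optimality the minimax analysis establishes.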
Kernel Techniques: From Machine Learning to Meshless Methods
2006
Cited by 35 (11 self)
Abstract: Kernels are valuable tools in various fields of Numerical Analysis, including approximation, interpolation, meshless methods for solving partial differential equations, neural networks, and Machine Learning. This contribution explains why and how kernels are applied in these disciplines. It uncovers the links between them, as far as they are related to kernel techniques. It addresses non-expert readers and focuses on practical guidelines for using kernels in applications.
Persistent Robotic Tasks: Monitoring and Sweeping in Changing Environments
2011
Cited by 34 (10 self)
Abstract: We present controllers that enable mobile robots to persistently monitor or sweep a changing environment. The changing environment is modeled as a field which grows in locations that are not within range of a robot, and decreases in locations that are within range of a robot. We assume that the robots travel on given closed paths. The speed of each robot along its path is controlled to prevent the field from growing unbounded at any location. We consider the space of speed controllers that can be parametrized by a finite set of basis functions. For a single robot, we develop a linear program that is guaranteed to compute a speed controller in this space to keep the field bounded, if such a controller exists. Another linear program is then derived whose solution is the speed controller that minimizes the maximum field value over the environment. We extend our linear program formulation to develop a multi-robot controller that keeps the field bounded. The multi-robot controller has the unique feature that it does not require communication among the robots. Simulation studies demonstrate the robustness of the controllers to modeling errors and to stochasticity in the environment.
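Minimizing the maximum field value, as the abstract describes, reduces to a linear program via the standard epigraph reformulation, sketched generically below (the symbol $F_i$ is illustrative notation introduced here, not the paper's):

```latex
\min_{\beta,\; t} \; t
\qquad \text{subject to} \qquad
F_i(\beta) \le t \quad \text{for all sampled locations } i ,
```

where $\beta$ collects the basis-function coefficients of the speed controller and $F_i(\beta)$ is the resulting field value at location $i$; when each $F_i$ is affine in $\beta$, the problem is a linear program in $(\beta, t)$.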
Some properties of regularized kernel methods
Journal of Machine Learning Research, 2004
Cited by 33 (4 self)
Abstract: In regularized kernel methods, the solution of a learning problem is found by minimizing functionals consisting of the sum of a data term and a complexity term. In this paper we investigate some properties of a more general form of the above functionals in which the data term corresponds to the expected risk. First, we prove a quantitative version of the representer theorem holding for both regression and classification, for both differentiable and non-differentiable loss functions, and for arbitrary offset terms. Second, we show that the case in which the offset space is nontrivial corresponds to solving a standard problem of regularization in a reproducing kernel Hilbert space in which the penalty term is given by a seminorm. Finally, we discuss the issues of existence and uniqueness of the solution. From the specialization of our analysis to the discrete setting, it is immediate to establish a connection between the solution properties of sparsity and coefficient boundedness and some properties of the loss function. For the case of Support Vector Machines for classification, we also obtain a complete characterization of the whole method in terms of the Kuhn-Tucker conditions, with no need to introduce the dual formulation.
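For the special case of the squared loss, the representer theorem discussed above reduces the infinite-dimensional problem to a linear system, as in kernel ridge regression. A minimal sketch (Gaussian kernel, width, and regularization constant are illustrative choices, not the paper's setting):

```python
import numpy as np

def gram(X, width=1.0):
    """Gaussian-kernel Gram matrix K[i, j] = k(x_i, x_j)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_krr(X, y, lam=1e-3):
    """Minimize (1/n) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.

    By the representer theorem the minimizer is f = sum_i alpha_i k(x_i, .),
    with alpha solving the finite linear system (K + lam * n * I) alpha = y.
    """
    n = len(X)
    K = gram(X)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return alpha, K

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.cos(2 * X[:, 0])
alpha, K = fit_krr(X, y)
train_pred = K @ alpha   # fitted values at the training points
```

The finite expansion in the training points is exactly the "solution properties" setting the abstract analyzes; with a non-differentiable loss such as the hinge loss, the same theorem yields the sparse expansions of Support Vector Machines.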