
CiteSeerX

Results 1 - 10 of 20,562

Good Parameters for Differential Evolution

by Magnus Erik Hvass Pedersen, Hvass Laboratories
"... The general purpose optimization method known as Differential Evolution (DE) has a number of parameters that determine its behaviour and efficacy in optimizing a given problem. This paper gives a list of good choices of parameters for various optimization scenarios which should help the practitioner ..."
Abstract - Cited by 1 (0 self)
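To make the tuned parameters concrete, the following is a minimal sketch of the classic DE/rand/1/bin scheme whose behaviour those parameters govern. The values used here (population size 20, F = 0.5, CR = 0.9) are common illustrative defaults, not the recommendations tabulated in the paper.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.5, CR=0.9, iters=200, seed=0):
    """Minimal DE/rand/1/bin sketch. F (differential weight) and CR
    (crossover probability) are the behavioural parameters being tuned;
    the defaults here are illustrative, not the paper's recommendations."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            # pick three distinct agents other than i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # force at least one mutated coordinate
            trial = [
                pop[a][k] + F * (pop[b][k] - pop[c][k])
                if (rng.random() < CR or k == jrand) else pop[i][k]
                for k in range(dim)
            ]
            # clip the trial vector back into the search bounds
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= fit[i]:  # greedy replacement
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

# usage: minimize the sphere function on [-5, 5]^2
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5)] * 2)
```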

The Good Parameters Problem

by unknown authors , 2010
"... Use of timing parameters (unknown constants) ..."
Abstract

A class of linear codes with good parameters

by Chaoping Xing, San Ling - IEEE Trans. on Inform. Theory
"... A class of linear codes with good parameters ..."
Abstract - Cited by 5 (1 self)

Good Parameters for Particle Swarm Optimization

by Magnus Erik Hvass Pedersen, Hvass Laboratories
"... The general purpose optimization method known as Particle Swarm Optimization (PSO) has a number of parameters that determine its behaviour and efficacy in optimizing a given problem. This paper gives a list of good choices of parameters for various optimization scenarios which should help the practi ..."
Abstract - Cited by 3 (0 self)
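As with the DE paper above, the parameters in question are the handful of constants in the PSO update rule. A minimal global-best PSO sketch follows; the inertia weight w and acceleration coefficients c1/c2 used here are common illustrative defaults, not the values recommended in the paper.

```python
import random

def pso(f, bounds, n=20, w=0.7, c1=1.5, c2=1.5, iters=200, seed=0):
    """Minimal global-best PSO sketch. Inertia w and acceleration
    coefficients c1 (cognitive) and c2 (social) are the behavioural
    parameters being tuned; the defaults here are illustrative only."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]          # each particle's best position
    pfit = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]   # swarm's best position
    for _ in range(iters):
        for i in range(n):
            for k in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][k] = (w * vel[i][k]
                             + c1 * r1 * (pbest[i][k] - pos[i][k])
                             + c2 * r2 * (gbest[k] - pos[i][k]))
                lo, hi = bounds[k]
                pos[i][k] = min(max(pos[i][k] + vel[i][k], lo), hi)
            ft = f(pos[i])
            if ft < pfit[i]:
                pbest[i], pfit[i] = pos[i][:], ft
                if ft < gfit:
                    gbest, gfit = pos[i][:], ft
    return gbest, gfit

# usage: minimize the sphere function on [-5, 5]^2
g_pos, g_fit = pso(lambda v: sum(t * t for t in v), [(-5, 5)] * 2)
```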

A Model of Investor Sentiment

by Nicholas Barberis, Andrei Shleifer, Robert Vishny - Journal of Financial Economics , 1998
"... Recent empirical research in finance has uncovered two families of pervasive regularities: underreaction of stock prices to news such as earnings announcements, and overreaction of stock prices to a series of good or bad news. In this paper, we present a parsimonious model of investor sentiment, or ..."
Abstract - Cited by 777 (32 self)

A training algorithm for optimal margin classifiers

by Bernhard E. Boser, et al. - PROCEEDINGS OF THE 5TH ANNUAL ACM WORKSHOP ON COMPUTATIONAL LEARNING THEORY , 1992
"... A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjust ..."
Abstract - Cited by 1865 (43 self)

Learning low-level vision

by William T. Freeman, Egon C. Pasztor - International Journal of Computer Vision , 2000
"... We show a learning-based method for low-level vision problems. We set up a Markov network of patches of the image and the underlying scene. A factorization approximation allows us to easily learn the parameters of the Markov network from synthetic examples of image/scene pairs, and to efficiently prop ..."
Abstract - Cited by 579 (30 self)

Answering the Skeptics: Yes, Standard Volatility Models Do Provide Accurate Forecasts

by Torben G. Andersen, Tim Bollerslev
"... Volatility permeates modern financial theories and decision making processes. As such, accurate measures and good forecasts of future volatility are critical for the implementation and evaluation of asset and derivative pricing theories as well as trading and hedging strategies. In response to this, ..."
Abstract - Cited by 561 (45 self)

A Study of Cross-Validation and Bootstrap for Accuracy Estimation and Model Selection

by Ron Kohavi - INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE , 1995
"... We review accuracy estimation methods and compare the two most common methods: cross-validation and bootstrap. Recent experimental results on artificial data and theoretical results in restricted settings have shown that for selecting a good classifier from a set of classifiers (model selection), te ..."
Abstract - Cited by 1283 (11 self)
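The cross-validation procedure the abstract refers to can be sketched in a few lines. This is a generic k-fold scheme over caller-supplied `train` and `evaluate` callbacks, not code from the paper; the majority-class "model" in the usage example is purely illustrative.

```python
import random

def k_fold_cv(xs, ys, train, evaluate, k=10, seed=0):
    """Plain k-fold cross-validation: shuffle the indices, split them into
    k folds, and average the held-out score over the k train/test splits."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    scores = []
    for i in range(k):
        held_out = set(folds[i])
        tr = [j for j in idx if j not in held_out]
        model = train([xs[j] for j in tr], [ys[j] for j in tr])
        scores.append(evaluate(model,
                               [xs[j] for j in folds[i]],
                               [ys[j] for j in folds[i]]))
    return sum(scores) / k

# usage with a trivial majority-class "model" (illustrative only):
# 70 examples of class 0 and 30 of class 1, so the majority predictor
# scores 0.7 on average across folds
xs = list(range(100))
ys = [0] * 70 + [1] * 30
train = lambda X, Y: max(set(Y), key=Y.count)
evaluate = lambda m, X, Y: sum(y == m for y in Y) / len(Y)
acc = k_fold_cv(xs, ys, train, evaluate)
```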

The Ant System: Optimization by a colony of cooperating agents

by Marco Dorigo, Vittorio Maniezzo, Alberto Colorni - IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS-PART B , 1996
"... An analogy with the way ant colonies function has suggested the definition of a new computational paradigm, which we call Ant System. We propose it as a viable new approach to stochastic combinatorial optimization. The main characteristics of this model are positive feedback, distributed computation, and the use of a constructive greedy heuristic. Positive feedback accounts for rapid discovery of good solutions, distributed computation avoids premature convergence, and the greedy heuristic helps find acceptable solutions in the early stages of the search process. We apply the proposed ..."
Abstract - Cited by 1300 (46 self)
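The positive-feedback mechanism the abstract describes is pheromone reinforcement: trails used by short tours accumulate pheromone, biasing later ants toward them, while evaporation prevents early paths from locking in. A minimal Ant System sketch for a small TSP instance follows; parameter names (alpha, beta, rho, Q) follow common Ant System conventions, and the defaults are illustrative.

```python
import random

def ant_system_tsp(dist, ants=10, iters=50, alpha=1.0, beta=2.0,
                   rho=0.5, Q=1.0, seed=0):
    """Minimal Ant System sketch: ants build tours with pheromone- and
    distance-biased choices; pheromone evaporates (rho) and is reinforced
    in proportion to tour quality (positive feedback)."""
    rng = random.Random(seed)
    n = len(dist)
    tau = [[1.0] * n for _ in range(n)]  # pheromone on each edge
    best_tour, best_len = None, float("inf")
    for _ in range(iters):
        tours = []
        for _ in range(ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                # weight each candidate city by pheromone and inverse distance
                w = [(j, (tau[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta))
                     for j in unvisited]
                total = sum(v for _, v in w)
                r, acc = rng.random() * total, 0.0
                for j, v in w:  # roulette-wheel selection
                    acc += v
                    if acc >= r:
                        tour.append(j)
                        unvisited.discard(j)
                        break
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # evaporate, then deposit pheromone inversely proportional to tour length
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += Q / length
                tau[j][i] += Q / length
    return best_tour, best_len

# usage: 4 cities on a unit square; the optimal tour is the perimeter, length 4
pts = [(0, 0), (1, 0), (1, 1), (0, 1)]
dist = [[((x1 - x2) ** 2 + (y1 - y2) ** 2) ** 0.5 for x2, y2 in pts]
        for x1, y1 in pts]
tour, length = ant_system_tsp(dist)
```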

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University