
CiteSeerX
Results 1 - 10 of 869

Competition for consciousness among visual events: the psychophysics of reentrant visual processes

by Vincent Di Lollo, James T. Enns, Ronald A. Rensink - Journal of Experimental Psychology: General, 2000
"... Advances in neuroscience implicate reentrant signaling as the predominant form of communication between brain areas. This principle was used in a series of masking experiments that defy explanation by feed-forward theories. The masking occurs when a brief display of target plus mask is continued wit ..."
Abstract - Cited by 184 (18 self)
between the reentrant visual representation and the ongoing lower level activity. Iterative reentrant processing was formalized in a computational model that provides an excellent fit to the data. The model provides a more comprehensive account of all forms of visual masking than do the long-held feed-forward

Creating Full View Panoramic Image Mosaics and Environment Maps

by Richard Szeliski, Heung-Yeung Shum, 1997
"... This paper presents a novel approach to creating full view panoramic mosaics from image sequences. Unlike current panoramic stitching methods, which usually require pure horizontal camera panning, our system does not require any controlled motions or constraints on how the images are taken (as long ..."
Abstract - Cited by 340 (29 self)

Cultivating competence, self-efficacy, and intrinsic interest through proximal self-motivation.

by Albert Bandura, Dale H. Schunk - Journal of Personality and Social Psychology, 1981
"... Abstract: The present experiment tested the hypothesis that self-motivation through proximal goal setting serves as an effective mechanism for cultivating competencies, self-percepts of efficacy, and intrinsic interest. Children who exhibited gross deficits and disinterest in mathematical tasks pur ..."
Abstract - Cited by 295 (6 self)
, achieved substantial mastery of mathematical operations, and developed a sense of personal efficacy and intrinsic interest in arithmetic activities that initially held little attraction for them. Distal goals had no demonstrable effects. In addition to its other benefits, goal proximity fostered veridical

Challenging Long-Held Notions about Sexual Abuse by Adolescents

by David S. Prescott, Steven Bengis, Joan Tabachnick, 2008
"... We want to expand our readership, and we need your help. If you have found these newsletters helpful, would you consider forwarding this issue to a friend or colleague? Colleagues can sign up to receive future issues of the newsletter at www.neari.com/mailing.html. Feel free to view the previous iss ..."
Abstract

Feed-Forward Neural Networks and Topographic Mappings for Exploratory Data Analysis

by David Lowe, Michael Tipping - Neural Computing and Applications, 1996
"... A recent novel approach to the visualisation and analysis of datasets, and one which is particularly applicable to those of a high dimension, is discussed in the context of real applications. A feed-forward neural network is utilised to effect a topographic, structure-preserving, dimension-reducing ..."
Abstract - Cited by 49 (2 self)

Feed-forward: Future questions, future maps

by Peggy W. Penn - Family Process, 1985
"... "Feed-forward" is a technique that encourages families to imagine the pattern of their relationships at some future point in time. Questions about the future, in conjunction with positive connotation, put families in a metaposition to their own dilemmas and thus facilitate change by openi ..."
Abstract - Cited by 10 (0 self)
of an outcomeif this or that event obtainedgives the family a sense of their own potential to imagine new solutions. At that moment I would say the family are in the process of feed-forward. In considering how things could turn out if, you are addressing a basic descriptor of the system: its capacity

Benchmarking Feed-Forward Neural Networks: Models and Measures

by Leonard Hamey, 1992
"... Existing metrics for the learning performance of feed-forward neural networks do not provide a satisfactory basis for comparison because the choice of the training epoch limit can determine the results of the comparison. I propose new metrics which have the desirable property of being independent of ..."
Abstract - Cited by 2 (0 self)

Maximizing the Margin with Feed-forward Neural Networks

by Enrique Romero, Rene Alquezar, 2002
"... Feed-forward Neural Networks (FNNs) and Support Vector Machines (SVMs) are two machine learning frameworks developed from very different starting points of view. In this work a new learning model for FNNs is proposed such that, in the linearly separable case, it tends to obtain the same solution that S ..."
Abstract - Cited by 2 (1 self)

Boltzmann Learning in a Feed-Forward Neural Network

by J. Wroldsen, 1995
"... We show how a feed-forward neural network can be successfully trained by using a simulated annealing (or Monte Carlo) technique. The network is initialized randomly. Then the configurations (weights of the network) are generated according to a Boltzmann distribution. By lowering the temperature of th ..."
Abstract
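This entry and the Metropolis entry below describe the same general procedure: sample network weights from a Boltzmann distribution via Metropolis-style accept/reject steps while lowering the temperature. A minimal Python sketch of that idea follows; the tiny two-layer network, the XOR task, and all hyperparameters are illustrative assumptions, not details taken from the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny feed-forward net: 2 inputs -> 3 tanh hidden units -> 1 linear output.
def forward(w, X):
    W1, b1, W2, b2 = w
    return np.tanh(X @ W1 + b1) @ W2 + b2

def loss(w, X, y):
    return float(np.mean((forward(w, X).ravel() - y) ** 2))

def init_weights():
    return [rng.normal(0.0, 0.5, (2, 3)), np.zeros(3),
            rng.normal(0.0, 0.5, (3, 1)), np.zeros(1)]

def anneal(X, y, T=1.0, cooling=0.999, steps=20000, step_size=0.1):
    """Train by simulated annealing: propose random weight perturbations
    and accept them with the Metropolis/Boltzmann probability."""
    w = init_weights()
    E = loss(w, X, y)
    best_w, best_E = w, E
    for _ in range(steps):
        # Propose a random perturbation of every weight array.
        w_new = [p + rng.normal(0.0, step_size, p.shape) for p in w]
        E_new = loss(w_new, X, y)
        # Metropolis rule: always accept improvements; accept worse
        # configurations with probability exp(-dE / T).
        if E_new < E or rng.random() < np.exp(-(E_new - E) / T):
            w, E = w_new, E_new
        if E < best_E:
            best_w, best_E = w, E
        T *= cooling  # gradually lower the temperature
    return best_w, best_E

# XOR: a classic task that a linear model cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
w, E = anneal(X, y)
```

Because no gradients are used, the same loop works for non-differentiable networks; the price is far more loss evaluations than backpropagation would need.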

Metropolis Learning in a Feed-Forward Neural Network

by J. Wroldsen
"... We show how a feed-forward neural network can be successfully trained by using a simulated annealing (or Monte Carlo) technique. The network weights are initialized randomly. Then the configurations (weights of the network) are generated according to a Boltzmann distribution using the Metropolis algo ..."
Abstract

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University