
Results 1 - 10 of 315

Learning to detect unseen object classes by between-class attribute transfer

by Christoph H. Lampert, Hannes Nickisch, Stefan Harmeling - In CVPR, 2009
Abstract - Cited by 363 (5 self)
We study the problem of object classification when training and test classes are disjoint, i.e. no training examples of the target classes are available. This setup has hardly been studied in computer vision research, but it is the rule rather than the exception, because the world contains tens of t ...

The adaptive nature of human categorization

by John R. Anderson - Psychological Review, 1991
Abstract - Cited by 344 (2 self)
A rational model of human categorization behavior is presented that assumes that categorization reflects the derivation of optimal estimates of the probability of unseen features of objects. A Bayesian analysis is performed of what optimal estimations would be if categories formed a disjoint partiti ...

Mining multi-label data

by Grigorios Tsoumakas, Ioannis Katakis, Ioannis Vlahavas - In Data Mining and Knowledge Discovery Handbook, 2010
Abstract - Cited by 92 (9 self)
A large body of research in supervised learning deals with the analysis of single-label data, where training examples are associated with a single label λ from a set of disjoint labels L. However, training examples in several application domains are often associated with a set of labels Y ⊆ L. Such d ...

Analyzing the Effectiveness and Applicability of Co-training

by Kamal Nigam, Rayid Ghani, 2000
Abstract - Cited by 263 (7 self)
Recently there has been significant interest in supervised learning algorithms that combine labeled and unlabeled data for text learning tasks. The co-training setting [1] applies to datasets that have a natural separation of their features into two disjoint sets. We demonstrate that when learning f ...

Using classifier ensembles to label spatially disjoint data

by Larry Shoemaker, Robert E. Banfield, Lawrence O. Hall, Kevin W. Bowyer, W. Philip Kegelmeyer, 2007
Abstract
We describe an ensemble approach to learning from arbitrarily partitioned data. The partitioning comes from the distributed processing requirements of a large scale simulation. The volume of the data is such that classifiers can train only on data local to a given partition. As a result of the partition reflecting the needs of the simulation, the class statistics can vary from partition to partition. Some classes will likely be missing from some partitions. We combine a fast ensemble learning algorithm with probabilistic majority voting in order to learn an accurate classifier from such data. Results from simulations of an impactor bar crushing a storage canister and from facial feature recognition show that regions of interest are successfully identified in spite of the class imbalance in the individual training sets. © 2007 Elsevier B.V. All rights reserved.
Keywords: Random forest; Saliency; Probabilistic voting; Out-of-partition; k-Nearest centroids; Imbalanced training data

Boundary labeling: Models and efficient algorithms for rectangular maps

by Michael A. Bekos, Michael Kaufmann, Antonios Symvonis, Alexander Wolff, 2004
Abstract - Cited by 36 (11 self)
In this paper, we present boundary labeling, a new approach for labeling point sets with large labels. We first place disjoint labels around an axis-parallel rectangle that contains the points. Then we connect each label to its point such that no two connections intersect. Such an approach is commo ...

Non-zero disjoint cycles in highly connected group labelled graphs

by Ken-ichi Kawarabayashi, et al.
Abstract - Cited by 8 (4 self)
Abstract not found

Variation of Graceful Labeling on Disjoint Union of two Subdivided Shell Graphs

by J. Jeba Jesintha, K. Ezhilarasi Hilda, 2014
Abstract not found

Edge-disjoint Spanning Trees in Triangulated Graphs on Surfaces and application to node labeling

by Arnaud Labourel, 2006
Abstract not found

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University