Results 1-10 of 69,907
Near-Optimal Non-approximability Results for Some NPO PB-Complete Problems
Information Processing Letters, 1998
Abstract: "We show that a number of NPO PB-complete problems, including Min Ones and Max Ones, are hard to approximate within n^(1-ε) for arbitrary ε > 0. Keywords: Approximation, Computational complexity, Combinatorial problems. NPO PB is the class of NP optimization problems whose objective function is bounded by some polynomial in the size of the input. It is well-known that many problems that are complete for NPO PB are notoriously hard to approximate, and near-optimal lower bounds on the approximability of NPO PB-complete problems such as Min # Sat and Min PB 0-1 Programming have ..."
Cited by 3 (1 self)
Strong lower bounds on the approximability of some NPO PB-complete maximization problems
1995
Abstract: "The approximability of several NP maximization problems is investigated and strong lower bounds for the studied problems are proved. For some of the problems the bounds are the best that can be achieved, unless P = NP. For example we investigate the approximability of Max PB 0-1 Programming ..."
Cited by 10 (2 self)
Tight Lower Bounds on the Approximability of Some NPO PB-Complete Problems
1997
Abstract: "We investigate the approximability of the NPO PB-complete problems Min Ones, Min Dones, Max Ones, Max Dones and Max PB 0/1 Programming. We show that, unless P = NP, these problems are not approximable within n^(1-ε) for any ε > 0, where n is the number of variables in the input ..."
Centrality in social networks: conceptual clarification
Social Networks, 1978
Abstract: "The intuitive background for measures of structural centrality in social networks is reviewed and existing measures are evaluated in terms of their consistency with intuitions and their interpretability. Three distinct intuitive conceptions of centrality are uncovered and existing measures are refi ..." "... of small groups is examined. The idea of centrality as applied to human communication was introduced by Bavelas in 1948. He was specifically concerned with communication in small groups and he hypothesized a relationship between structural centrality and influence in group ..."
Cited by 1035 (2 self)
Wrapper Induction for Information Extraction
1997
Abstract: "The Internet presents numerous sources of useful information: telephone directories, product catalogs, stock quotes, weather forecasts, etc. Recently, many systems have been built that automatically gather and manipulate such information on a user's behalf. However, these resources are usually ..." "... introduce wrapper induction, a technique for automatically constructing wrappers. Our techniques can be described in terms of three main contributions. First, we pose the problem of wrapper construction as one of inductive learn ..."
Cited by 612 (30 self)
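The abstract above describes learning wrappers automatically. To show what such a wrapper does once constructed, here is a hand-written sketch of a simple delimiter-based extractor of the kind these techniques induce; the HTML fragment and the (left, right) delimiter pairs are invented for illustration:

```python
# Illustrative delimiter-based wrapper: scan the page left to right,
# extracting one field per (left, right) delimiter pair, one tuple per pass.
# The page content and delimiters below are made up, not from the paper.

def lr_wrapper(page, delimiters):
    """Extract tuples of fields delimited by the given (left, right) pairs."""
    results, pos = [], 0
    while True:
        row = []
        for left, right in delimiters:
            start = page.find(left, pos)
            if start == -1:          # no more records: done
                return results
            start += len(left)
            end = page.find(right, start)
            row.append(page[start:end])
            pos = end + len(right)
        results.append(tuple(row))

page = "<b>Acme</b> <i>555-1234</i> <b>Bolt Co</b> <i>555-9876</i>"
pairs = [("<b>", "</b>"), ("<i>", "</i>")]
print(lr_wrapper(page, pairs))  # [('Acme', '555-1234'), ('Bolt Co', '555-9876')]
```

Wrapper induction, as the abstract frames it, is the inductive-learning problem of inferring delimiter pairs like `pairs` from labeled example pages rather than writing them by hand.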
Bagging Predictors
Machine Learning, 1996
Abstract: "Bagging predictors is a method for generating multiple versions of a predictor and using these to get an aggregated predictor. The aggregation averages over the versions when predicting a numerical outcome and does a plurality vote when predicting a class. The multiple versions are formed by making bootstrap replicates of the learning set and using these as new learning sets. Tests on real and simulated data sets using classification and regression trees and subset selection in linear regression show that bagging can give substantial gains in accuracy. The vital element is the instability of the prediction method. If perturbing the learning set can cause significant changes in the predictor constructed, then bagging can improve accuracy. A learning set L consists of data {(y_n, x_n), n = 1, ..., N} where the y's are either class labels or a numerical response. We have a procedure for using this learning set to form a predictor φ(x, L): if the input is x we ..."
Cited by 3574 (1 self)
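The bootstrap-and-aggregate loop the abstract describes can be sketched as follows for the numerical (averaging) case. The base learner here is a deliberately unstable 1-nearest-neighbor rule chosen for illustration; the paper's experiments use trees and subset regression, and all names below are our own:

```python
# Minimal sketch of bagging for regression: train many versions of a base
# predictor on bootstrap replicates of the learning set, then average.
# Learning-set entries follow the abstract's (y_n, x_n) convention.
import random

def one_nn(learning_set):
    """Unstable base predictor: 1-nearest-neighbor on scalar inputs."""
    def predict(x):
        y, _ = min(((y, abs(x - xn)) for y, xn in learning_set),
                   key=lambda pair: pair[1])
        return y
    return predict

def bagged(learning_set, n_versions=25, seed=0):
    """Fit n_versions predictors on bootstrap replicates; average their outputs."""
    rng = random.Random(seed)
    versions = [
        one_nn([rng.choice(learning_set) for _ in learning_set])
        for _ in range(n_versions)
    ]
    return lambda x: sum(p(x) for p in versions) / n_versions

data = [(2 * x, x) for x in range(10)]   # noiseless samples of y = 2x
predictor = bagged(data)
print(predictor(4.3))                     # roughly 8-10: average of the versions
```

Each bootstrap replicate sees a perturbed learning set, so the unstable base predictors disagree; averaging them is exactly the variance-reducing aggregation the abstract credits for bagging's accuracy gains.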
Transfer of Cognitive Skill
1989
Abstract: "A framework for skill acquisition is proposed that includes two major stages in the development of a cognitive skill: a declarative stage in which facts about the skill domain are interpreted and a procedural stage in which the domain knowledge is directly embodied in procedures for performing the skill. This general framework has been instantiated in the ACT system in which facts are encoded in a propositional network and procedures are encoded as productions. Knowledge compilation is the process by which the skill transits from the declarative stage to the procedural stage. It consists of the subprocesses of composition, which collapses sequences of productions into single productions, and proceduralization, which embeds factual knowledge into productions. Once proceduralized, further learning processes operate on the skill to make the productions more selective in their range of applications. These processes include generalization, discrimination, and strengthening of productions. Comparisons are made to similar concepts from past learning theories. How these learning mechanisms apply to produce the power law speedup in processing time with practice is discussed. It requires at least 100 hours of learning and practice to acquire any significant cognitive skill to a reasonable degree of proficiency. For instance, after 100 hours a student learning to program a computer has achieved only a very modest facility in the skill. Learning one's primary language takes tens of thousands of hours. The psychology of human learning has been very thin in ideas about what happens to skills under the impact of this amount of learning, and for obvious reasons. This article presents a theory about the changes in the nature of a skill over such large time scales and about the basic learning processes that are responsible."
Cited by 869 (21 self)
Wireless Communications
2005
Abstract: "Copyright © 2005 by Cambridge University Press. This material is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University ..."
Cited by 1129 (32 self)
Monitoring and reputation: The choice between bank loans and directly placed debt
Journal of Political Economy, 1991
"... ..."