Results 1 - 10 of 501
Conventional wisdom on measurement: A structural equation perspective
- Psychological Bulletin, 1991
"... ..."
(Show Context)
From frequency to meaning : Vector space models of semantics
- Journal of Artificial Intelligence Research, 2010
"... Computers understand very little of the meaning of human language. This profoundly limits our ability to give instructions to computers, the ability of computers to explain their actions to us, and the ability of computers to analyse and process text. Vector space models (VSMs) of semantics are begi ..."
Abstract
-
Cited by 322 (3 self)
Computers understand very little of the meaning of human language. This profoundly limits our ability to give instructions to computers, the ability of computers to explain their actions to us, and the ability of computers to analyse and process text. Vector space models (VSMs) of semantics are beginning to address these limits. This paper surveys the use of VSMs for semantic processing of text. We organize the literature on VSMs according to the structure of the matrix in a VSM. There are currently three broad classes of VSMs, based on term–document, word–context, and pair–pattern matrices, yielding three classes of applications. We survey a broad range of applications in these three categories and we take a detailed look at a specific open source project in each category. Our goal in this survey is to show the breadth of applications of VSMs for semantics, to provide a new perspective on VSMs for those who are already familiar with the area, and to provide pointers into the literature for those who are less familiar with the field.
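As a rough illustration of the term–document matrices this survey organizes the literature around, here is a minimal Python sketch; the toy corpus and helper functions are invented for illustration and are not code from the paper. Each document becomes a column of raw term counts, and documents are compared by cosine similarity.

import math
from collections import Counter

# Toy corpus: three tiny "documents" (illustrative only).
docs = {
    "d1": "cats chase mice",
    "d2": "dogs chase cats",
    "d3": "mice eat cheese",
}

# Term-document matrix: rows are vocabulary terms, columns are documents.
vocab = sorted({w for text in docs.values() for w in text.split()})
counts = {d: Counter(text.split()) for d, text in docs.items()}

def doc_vector(doc_id):
    # One column of the matrix: raw term frequencies for this document.
    c = counts[doc_id]
    return [c[t] for t in vocab]

def cosine(u, v):
    # Cosine similarity, the standard comparison in term-document VSMs.
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Documents sharing more terms score higher.
print(cosine(doc_vector("d1"), doc_vector("d2")))  # shares "cats" and "chase"
print(cosine(doc_vector("d1"), doc_vector("d3")))  # shares only "mice"

Real systems would typically apply a weighting such as tf-idf and perhaps dimensionality reduction before comparing columns; the sketch shows only the bare matrix structure.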
The Economics and Psychology of Personality Traits
"... Any opinions expressed here are those of the author(s) and not those of IZA. Research published in this series may include views on policy, but the institute itself takes no institutional policy positions. The Institute for the Study of Labor (IZA) in Bonn is a local and virtual international resear ..."
Abstract
-
Cited by 111 (13 self)
Any opinions expressed here are those of the author(s) and not those of IZA. Research published in this series may include views on policy, but the institute itself takes no institutional policy positions. The Institute for the Study of Labor (IZA) in Bonn is a local and virtual international research center and a place of communication between science, politics and business. IZA is an independent nonprofit organization supported by Deutsche Post World Net. The center is associated with the University of Bonn and offers a stimulating research environment through its international network, workshops and conferences, data service, project support, research visits and doctoral program. IZA engages in (i) original and internationally competitive research in all fields of labor economics, (ii) development of policy concepts, and (iii) dissemination of research results and concepts to the interested public. IZA Discussion Papers often represent preliminary work and are circulated to encourage discussion. Citation of such a paper should account for its provisional character. A revised version may be available directly from the author. IZA Discussion Paper No. 3333
Stochastic Variational Inference
- Journal of Machine Learning Research, 2013 (in press)
"... We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet proce ..."
Abstract
-
Cited by 99 (23 self)
We develop stochastic variational inference, a scalable algorithm for approximating posterior distributions. We develop this technique for a large class of probabilistic models and we demonstrate it with two probabilistic topic models, latent Dirichlet allocation and the hierarchical Dirichlet process topic model. Using stochastic variational inference, we analyze several large collections of documents: 300K articles from Nature, 1.8M articles from The New York Times, and 3.8M articles from Wikipedia. Stochastic inference can easily handle data sets of this size and outperforms traditional variational inference, which can only handle a smaller subset. (We also show that the Bayesian nonparametric topic model outperforms its parametric counterpart.) Stochastic variational inference lets us apply complex Bayesian models to massive data sets.
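Below is a minimal sketch of the stochastic variational inference loop for LDA described in this abstract, assuming the usual mean-field updates: sample one document, iterate its local variational parameters, then blend a noisy estimate of the global topic parameters in with a decaying step size. The toy corpus, hyperparameter values, and helper names are illustrative assumptions, not settings taken from the paper.

import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)

K, V, D = 3, 50, 200            # topics, vocabulary size, number of documents
alpha, eta = 0.1, 0.01          # Dirichlet priors (document-topic, topic-word); illustrative
tau, kappa = 1.0, 0.7           # step-size schedule rho_t = (t + tau) ** (-kappa); illustrative

# Toy corpus: each document is a length-V vector of word counts (made up here).
corpus = rng.poisson(0.2, size=(D, V))

# Global variational parameter: lam[k, w], Dirichlet parameters over words for each topic.
lam = rng.gamma(1.0, 1.0, size=(K, V))

def expect_log_dirichlet(params, axis):
    # E[log x] under Dirichlet(params), taken along the given axis.
    return digamma(params) - digamma(params.sum(axis=axis, keepdims=True))

for t in range(1000):
    d = rng.integers(D)                                    # 1. sample one document
    counts = corpus[d]

    # 2. local step: iterate phi / gamma updates for this document only.
    gamma = np.ones(K)
    Elog_beta = expect_log_dirichlet(lam, axis=1)          # shape (K, V)
    for _ in range(50):
        Elog_theta = expect_log_dirichlet(gamma, axis=0)   # shape (K,)
        log_phi = Elog_theta[:, None] + Elog_beta          # unnormalized, shape (K, V)
        phi = np.exp(log_phi - log_phi.max(axis=0))
        phi /= phi.sum(axis=0, keepdims=True)              # normalize over topics per word
        gamma_new = alpha + phi @ counts
        if np.abs(gamma_new - gamma).mean() < 1e-4:
            gamma = gamma_new
            break
        gamma = gamma_new

    # 3. global step: noisy natural-gradient update, as if the corpus were
    #    D copies of this document, blended in with a decaying step size.
    lam_hat = eta + D * phi * counts[None, :]
    rho = (t + tau) ** (-kappa)
    lam = (1.0 - rho) * lam + rho * lam_hat

The per-iteration cost depends only on the sampled document, which is what lets the approach scale to corpora of millions of articles.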
The Theoretical Status of Latent Variables
- Psychological Review, 2003
"... This article examines the theoretical status of latent variables as used in modern test theory models. First, it is argued that a consistent interpretation of such models requires a realist ontology for latent variables. Second, the relation between latent variables and their indicators is discussed ..."
Abstract
-
Cited by 91 (7 self)
This article examines the theoretical status of latent variables as used in modern test theory models. First, it is argued that a consistent interpretation of such models requires a realist ontology for latent variables. Second, the relation between latent variables and their indicators is discussed. It is maintained that this relation can be interpreted as a causal one but that in measurement models for interindividual differences the relation does not apply to the level of the individual person. To substantiate intraindividual causal conclusions, one must explicitly represent individual level processes in the measurement model. Several research strategies that may be useful in this respect are discussed, and a typology of constructs is proposed on the basis of this analysis. The need to link individual processes to latent variable models for interindividual differences is emphasized. Consider the following sentence: “Einstein would not have been able to come up with his e = mc² had he not possessed such an extraordinary intelligence.” What does this sentence express? It relates observable behavior (Einstein’s writing e = mc²) to an unobservable attribute (his extraordinary intelligence), and it does so by assigning to the unobservable attribute a causal role in
General mental ability in the world of work: Occupational attainment and job performance
- Journal of Personality and Social Psychology, 2004
"... The psychological construct of general mental ability (GMA), introduced by C. Spearman (1904) nearly 100 years ago, has enjoyed a resurgence of interest and attention in recent decades. This article presents the research evidence that GMA predicts both occupational level attained and performance wit ..."
Abstract
-
Cited by 91 (0 self)
The psychological construct of general mental ability (GMA), introduced by C. Spearman (1904) nearly 100 years ago, has enjoyed a resurgence of interest and attention in recent decades. This article presents the research evidence that GMA predicts both occupational level attained and performance within one’s chosen occupation and does so better than any other ability, trait, or disposition and better than job experience. The sizes of these relationships with GMA are also larger than most found in psychological research. Evidence is presented that weighted combinations of specific aptitudes tailored to individual jobs do not predict job performance better than GMA alone, disconfirming specific aptitude theory. A theory of job performance is described that explicates the central role of GMA in the world of work. These findings support Spearman’s proposition that GMA is of critical importance in human affairs. During the 1960s when we were graduate students, we frequently heard predictions from experimental psychologists and experimental social psychologists that in 20 or so years differential psychology would be a dead field, because experimental research would explain all individual differences as effects of past or present (environmental) treatment conditions. Obviously, this has
The C-OAR-SE procedure for scale development in marketing
- International Journal of Research in Marketing, 2002
"... Construct definition, Object classification, Attribute classification, Rater identification, Scale formation, and Enumeration and reporting (C-OAR-SE) is proposed as a new procedure for the development of scales to measure marketing constructs. C-OAR-SE is based on content validity, established by e ..."
Abstract
-
Cited by 82 (3 self)
Construct definition, Object classification, Attribute classification, Rater identification, Scale formation, and Enumeration and reporting (C-OAR-SE) is proposed as a new procedure for the development of scales to measure marketing constructs. C-OAR-SE is based on content validity, established by expert agreement after pre-interviews with target raters. In C-OAR-SE, constructs are defined in terms of Object, Attribute, and Rater Entity. The Object classification and Attribute classification steps in C-OAR-SE produce a framework (six types of scales) indicating when to use single-item vs. multiple-item scales and, for multiple-item scales, when to use an index of essential items rather than selecting unidimensional items with a high coefficient alpha. The Rater Entity type largely determines reliability, which is a precision-of-score estimate for a particular application of the scale.
Working memory and intelligence: the same or different constructs?
- Psychological Bulletin, 2005
"... All in-text references underlined in blue are linked to publications on ResearchGate, letting you access and read them immediately. ..."
Abstract
-
Cited by 79 (2 self)
A beginner's guide to partial least squares analysis
- Understanding Statistics: Statistical Issues in Psychology and Social Sciences, Volume 3, 2006
"... Since the introduction of covariance-based structural equation modeling (SEM) by Jöreskog in 1973, this technique has been received with considerable interest among empirical researchers. However, the predominance of LISREL, certainly the most well-known tool to perform this kind of analysis, has le ..."
Abstract
-
Cited by 71 (0 self)
Since the introduction of covariance-based structural equation modeling (SEM) by Jöreskog in 1973, this technique has been received with considerable interest among empirical researchers. However, the predominance of LISREL, certainly the most well-known tool to perform this kind of analysis, has led to the fact that not all researchers are aware of alternative techniques for SEM, such as partial least squares (PLS) analysis. Therefore, the objective of this article is to provide an easily comprehensible introduction to this technique, which is particularly suited to situations in which constructs are measured by a very large number of indicators and where maximum likelihood covariance-based SEM tools reach their limit. Because this article is intended as a general introduction, it avoids mathematical details as far as possible and instead focuses on a presentation of PLS, which can be understood without an in-depth knowledge of SEM. Keywords: partial least squares, structural equation modeling, PLS, LISREL, SEM. First-generation techniques, such as regression-based approaches (e.g., multiple regression analysis, discriminant analysis, logistic regression, analysis of variance) and factor or cluster analysis, belong to the core set of statistical instruments which can be used to either identify or confirm theoretical hypotheses based on the analysis of empirical data. Many researchers in various disciplines have applied one of these
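To give a concrete feel for the composite-based style of analysis the article introduces, the sketch below runs PLS regression with scikit-learn on simulated data with many correlated indicators. Note that this is plain PLS regression, a relative of the PLS path modeling (PLS-SEM) procedure discussed in the article rather than that procedure itself, and the simulated data and parameter choices are illustrative assumptions.

import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

n, p = 200, 40                          # many indicators (p) relative to sample size (n)
latent = rng.normal(size=(n, 2))        # two underlying "constructs" (simulated)
loadings = rng.normal(size=(2, p))
X = latent @ loadings + 0.5 * rng.normal(size=(n, p))   # observed indicators
y = latent @ np.array([1.0, -0.5]) + 0.1 * rng.normal(size=n)

# Two PLS components compress the 40 indicators into composite scores.
pls = PLSRegression(n_components=2)
pls.fit(X, y)

# Component scores act like estimated construct scores; the weights show how
# each indicator contributes to them.
print("R^2 on training data:", pls.score(X, y))
print("indicator weights, component 1:", np.round(pls.x_weights_[:, 0], 2))

scikit-learn is used here only because it ships a ready-made PLS implementation; dedicated PLS-SEM packages implement the full path-modeling algorithm with a network of latent variables.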
The impact of childhood intelligence on later life: Following up the Scottish Mental Surveys of 1932 and 1947
- Journal of Personality and Social Psychology, 2004
"... 70,805). These surveys are described. This research, using the surveys ’ data, examined (a) the stability of intelligence differences across the life span, (b) the determinants of cognitive change from childhood to old age, and (c) the impact of childhood intelligence on survival and health in old a ..."
Abstract
-
Cited by 68 (10 self)
70,805). These surveys are described. This research, using the surveys’ data, examined (a) the stability of intelligence differences across the life span, (b) the determinants of cognitive change from childhood to old age, and (c) the impact of childhood intelligence on survival and health in old age. Surviving participants of the Scottish Mental Surveys were tested, and the surveys’ data were linked with public and health records. Novel findings on the stability of IQ scores from age 11 to age 80; sex differences in cognitive aging; the dedifferentiation hypothesis of cognitive aging; and the effect of childhood IQ on all-cause and specific mortality, morbidity, and frailty in old age are presented. Scotland’s Mental Survey Committee gathered in 1931 to plan a study to describe the mental ability of the Scottish nation’s children (Scottish Council for Research in Education, 1933). They found no practicable method of obtaining a representative sample. They decided to test the whole nation. On Wednesday, June 1, 1932, practically every child attending school in Scotland and born in 1921 took the same mental test with the same time limit after hearing the same instructions. Thus Scotland was and remains the only nation with mental test data for almost an entire birth cohort. What is more, they repeated the exercise in 1947, testing almost all