Results 1 - 10 of 874
Transfer of Cognitive Skill, 1989
"... A framework for skill acquisition is proposed that includes two major stages in the development of a cognitive skill: a declarative stage in which facts about the skill domain are interpreted and a procedural stage in which the domain knowledge is directly embodied in procedures for performing the s ..."
Abstract
-
Cited by 894 (22 self)
- Add to MetaCart
A framework for skill acquisition is proposed that includes two major stages in the development of a cognitive skill: a declarative stage in which facts about the skill domain are interpreted and a procedural stage in which the domain knowledge is directly embodied in procedures for performing the skill. This general framework has been instantiated in the ACT system in which facts are encoded in a propositional network and procedures are encoded as productions. Knowledge compilation is the process by which the skill transits from the declarative stage to the procedural stage. It consists of the subprocesses of composition, which collapses sequences of productions into single productions, and proceduralization, which embeds factual knowledge into productions. Once proceduralized, further learning processes operate on the skill to make the productions more selective in their range of applications. These processes include generalization, discrimination, and strengthening of productions. Comparisons are made to similar concepts from past learning theories. How these learning mechanisms apply to produce the power law speedup in processing time with practice is discussed. It requires at least 100 hours of learning and practice to acquire any significant cognitive skill to a reasonable degree of proficiency. For instance, after 100 hours a student learning to program a computer has achieved only a very modest facility in the skill. Learning one's primary language takes tens of thousands of hours. The psychology of human learning has been very thin in ideas about what happens to skills under the impact of this amount of learning—and for obvious reasons. This article presents a theory about the changes in the nature of a skill over such large time scales and about the basic learning processes that are responsible.
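The learning mechanisms above are offered as an account of the power law speedup in processing time with practice. As a rough illustration only, the sketch below shows the shape of that power function, T(N) = A + B*N^(-alpha); the parameter values are hypothetical and are not taken from the article.

# Minimal sketch of the power law of practice that the article's learning
# mechanisms are meant to produce. Parameter values are hypothetical,
# chosen only for illustration.

def power_law_rt(trial: int, asymptote: float = 0.4,
                 gain: float = 2.0, rate: float = 0.5) -> float:
    """Predicted processing time T(N) = A + B * N**(-alpha) after N practice trials."""
    return asymptote + gain * trial ** (-rate)

if __name__ == "__main__":
    for n in (1, 10, 100, 1000):
        print(f"trial {n:5d}: predicted time {power_law_rt(n):.3f} s")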
Controlled and automatic human information processing: II. Perceptual learning, automatic attending and a general theory - Psychological Review, 1977
"... The two-process theory of detection, search, and attention presented by Schneider and Shiffrin is tested and extended in a series of experiments. The studies demonstrate the qualitative difference between two modes of information processing: automatic detection and controlled search. They trace the ..."
Abstract
-
Cited by 845 (12 self)
- Add to MetaCart
The two-process theory of detection, search, and attention presented by Schneider and Shiffrin is tested and extended in a series of experiments. The studies demonstrate the qualitative difference between two modes of information processing: automatic detection and controlled search. They trace the course of the learning of automatic detection, of categories, and of automatic attention responses. They show the dependence of automatic detection on attending responses and demonstrate how such responses interrupt controlled processing and interfere with the focusing of attention. The learning of categories is shown to improve controlled search performance. A general framework for human information processing is proposed; the framework emphasizes the roles of automatic and controlled processing. The theory is compared to and contrasted with extant models of search and attention.
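To make the contrast between the two modes concrete, here is a toy sketch, not the authors' model: controlled search is simulated as a serial, load-dependent comparison process, while automatic detection behaves like a load-independent lookup. All timing constants are hypothetical.

# Toy contrast (not Schneider & Shiffrin's model): controlled search as serial,
# load-dependent comparison; automatic detection as a load-independent lookup.
# All timing constants are hypothetical and purely illustrative.
import random

COMPARISON_MS = 40      # hypothetical cost of one controlled comparison
BASE_MS = 400           # hypothetical residual (encoding + response) time

def controlled_search_rt(memory_set_size: int) -> float:
    """Serial comparisons: reaction time grows with the number of items held in memory."""
    return BASE_MS + COMPARISON_MS * memory_set_size + random.gauss(0, 20)

def automatic_detection_rt(memory_set_size: int) -> float:
    """After consistent practice, detection acts like a direct lookup: flat over load."""
    return BASE_MS + random.gauss(0, 20)

if __name__ == "__main__":
    for size in (1, 2, 4):
        print(size, round(controlled_search_rt(size)), round(automatic_detection_rt(size)))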
A theory of memory retrieval - Psychological Review, 1978
"... A theory of memory retrieval is developed and is shown to apply over a range of experimental paradigms. Access to memory traces is viewed in terms of a resonance metaphor. The probe item evokes the search set on the basis of probe-memory item relatedness, just as a ringing tuning fork evokes sympath ..."
Abstract
-
Cited by 769 (83 self)
- Add to MetaCart
A theory of memory retrieval is developed and is shown to apply over a range of experimental paradigms. Access to memory traces is viewed in terms of a resonance metaphor. The probe item evokes the search set on the basis of probe-memory item relatedness, just as a ringing tuning fork evokes sympathetic vibrations in other tuning forks. Evidence is accumulated in parallel from each probe-memory item comparison, and each comparison is modeled by a continuous random walk process. In item recognition, the decision process is self-terminating on matching comparisons and exhaustive on nonmatching comparisons. The mathematical model produces predictions about accuracy, mean reaction time, error latency, and reaction time distributions that are in good accord with experimental data. The theory is applied to four item recognition paradigms (Sternberg, prememorized list, study-test, and continuous) and to speed-accuracy paradigms; results are found to provide a basis for comparison of these paradigms. It is noted that neural network models can be interfaced to the retrieval theory with little difficulty and that semantic memory models may benefit from such a retrieval scheme.
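The accumulation process for a single probe-to-memory-item comparison can be sketched as a discretized random walk between a match boundary and a nonmatch boundary, with drift set by probe-item relatedness. This is only an illustrative simplification of the retrieval theory; the boundary, step size, and noise values are assumptions.

# Illustrative sketch of one probe-to-memory-item comparison as a random walk toward
# a "match" or "nonmatch" boundary; drift is set by probe-item relatedness.
# A simplified discretization, not Ratcliff's full model; boundary, step size,
# and noise values are hypothetical.
import random

def random_walk_comparison(relatedness: float, boundary: float = 1.0,
                           dt: float = 0.001, noise: float = 1.0) -> tuple[str, float]:
    """Accumulate evidence until it crosses +boundary (match) or -boundary (nonmatch).
    Returns the outcome and the finishing time in arbitrary units."""
    evidence, t = 0.0, 0.0
    while abs(evidence) < boundary:
        evidence += relatedness * dt + random.gauss(0, noise) * dt ** 0.5
        t += dt
    return ("match" if evidence > 0 else "nonmatch"), t

if __name__ == "__main__":
    # Higher relatedness should reach the match boundary more often and faster.
    for r in (1.5, 0.2):
        outcomes = [random_walk_comparison(r) for _ in range(200)]
        hits = sum(o == "match" for o, _ in outcomes)
        mean_t = sum(t for _, t in outcomes) / len(outcomes)
        print(f"relatedness {r}: {hits}/200 reach match boundary, mean time {mean_t:.3f}")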
A capacity theory of comprehension: Individual differences in working memory - Psychological Review, 1992
"... A theory of the way working memory capacity constrains comprehension is proposed. The theory proposes that both processing and storage are mediated by activation and that the total amount of activation available in working memory varies among individuals. Individual differences in working memory cap ..."
Abstract
-
Cited by 700 (21 self)
- Add to MetaCart
A theory of the way working memory capacity constrains comprehension is proposed. The theory proposes that both processing and storage are mediated by activation and that the total amount of activation available in working memory varies among individuals. Individual differences in working memory capacity for language can account for qualitative and quantitative differences among college-age adults in several aspects of language comprehension. One aspect is syntactic modularity: The larger capacity of some individuals permits interaction among syntactic and pragmatic information, so that their syntactic processes are not informationally encapsulated. Another aspect is syntactic ambiguity: The larger capacity of some individuals permits them to maintain multiple interpretations. The theory is instantiated as a production system model in which the amount of activation available to the model affects how it adapts to the transient computational and storage demands that occur in comprehension. Working memory plays a central role in all forms of complex thinking, such as reasoning, problem solving, and language comprehension. However, its function in language comprehension is especially evident because comprehension entails processing
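One simple way to express the capacity constraint is a shared activation pool: processing and storage both draw on it, and when their joint demand exceeds an individual's cap, all allocations are scaled back. The scaling scheme and the numbers below are illustrative assumptions, not the published production system model.

# Minimal sketch of the capacity constraint: processing and storage draw on a shared
# pool of activation; when joint demand exceeds an individual's cap, all demands are
# scaled back proportionally (so processing slows and stored items decay).
# The scaling scheme and the numbers are illustrative assumptions.

def allocate_activation(demands: dict[str, float], capacity: float) -> dict[str, float]:
    """Grant each demand in full if the total fits, otherwise shrink all demands
    by the same factor so the total equals the capacity."""
    total = sum(demands.values())
    if total <= capacity:
        return dict(demands)
    scale = capacity / total
    return {name: amount * scale for name, amount in demands.items()}

if __name__ == "__main__":
    demands = {"parse_clause": 3.0, "hold_antecedent": 2.0, "hold_theme": 1.0}
    for cap in (7.0, 4.0):   # a higher- and a lower-capacity reader
        print(cap, allocate_activation(demands, cap))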
The role of deliberate practice in the acquisition of expert performance - Psychological Review, 1993
"... The theoretical framework presented in this article explains expert performance as the end result of individuals ' prolonged efforts to improve performance while negotiating motivational and external constraints. In most domains of expertise, individuals begin in their childhood a regimen of ef ..."
Abstract
-
Cited by 690 (15 self)
- Add to MetaCart
(Show Context)
The theoretical framework presented in this article explains expert performance as the end result of individuals' prolonged efforts to improve performance while negotiating motivational and external constraints. In most domains of expertise, individuals begin in their childhood a regimen of effortful activities (deliberate practice) designed to optimize improvement. Individual differences, even among elite performers, are closely related to assessed amounts of deliberate practice. Many characteristics once believed to reflect innate talent are actually the result of intense practice extended for a minimum of 10 years. Analysis of expert performance provides unique evidence on the potential and limits of extreme environmental adaptation and learning. Our civilization has always recognized exceptional individuals, whose performance in sports, the arts, and science is vastly superior to that of the rest of the population. Speculations on the causes of these individuals' extraordinary abilities and performance are as old as the first records of their achievements. Early accounts commonly attribute these individuals' outstanding performance to divine intervention, such as the
The empirical case for two systems of reasoning, 1996
"... Distinctions have been proposed between systems of reasoning for centuries. This article distills properties shared by many of these distinctions and characterizes the resulting systems in light of recent findings and theoretical developments. One system is associative because its computations ref ..."
Abstract
-
Cited by 669 (4 self)
- Add to MetaCart
(Show Context)
Distinctions have been proposed between systems of reasoning for centuries. This article distills properties shared by many of these distinctions and characterizes the resulting systems in light of recent findings and theoretical developments. One system is associative because its computations reflect similarity structure and relations of temporal contiguity. The other is “rule based” because it operates on symbolic structures that have logical content and variables and because its computations have the properties that are normally assigned to rules. The systems serve complementary functions and can simultaneously generate different solutions to a reasoning problem. The rule-based system can suppress the associative system but not completely inhibit it. The article reviews evidence in favor of the distinction and its characterization.
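A toy contrast, with hypothetical features and items not drawn from the article, shows how the two systems can return different answers to the same question: is this creature a fish?

# Toy illustration (hypothetical features and items) of how an associative,
# similarity-based system and a rule-based system can disagree about one question.

FISH_PROTOTYPE = ("lives_in_water", "fin_shaped", "streamlined_body", "has_gills", "lays_eggs")

def associative_judgment(creature: dict[str, int]) -> bool:
    """Similarity system: count feature overlap with a stored fish prototype."""
    overlap = sum(creature.get(f, 0) for f in FISH_PROTOTYPE)
    return overlap >= 3

def rule_based_judgment(creature: dict[str, int]) -> bool:
    """Rule system: apply a defining rule, here that a fish must breathe through gills."""
    return creature.get("has_gills", 0) == 1

if __name__ == "__main__":
    whale = {"lives_in_water": 1, "fin_shaped": 1, "streamlined_body": 1,
             "has_gills": 0, "lays_eggs": 0}
    print("associative says fish:", associative_judgment(whale))   # True: looks like a fish
    print("rule-based says fish:", rule_based_judgment(whale))     # False: no gills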
Half a century of research on the Stroop effect: An integrative review - Psychological Bulletin, 1991
"... The literature on interference in the Stroop Color-Word Task, covering over 50 years and some 400 studies, is organized and reviewed. In so doing, a set ofl 8 reliable empirical findings is isolated that must be captured by any successful theory of the Stroop effect. Existing theoretical positions a ..."
Abstract
-
Cited by 666 (14 self)
- Add to MetaCart
(Show Context)
The literature on interference in the Stroop Color-Word Task, covering over 50 years and some 400 studies, is organized and reviewed. In so doing, a set of 18 reliable empirical findings is isolated that must be captured by any successful theory of the Stroop effect. Existing theoretical positions are summarized and evaluated in view of this critical evidence, and the 2 major candidate theories, relative speed of processing and automaticity of reading, are found to be wanting. It is concluded that recent theories placing the explanatory weight on parallel processing of the irrelevant and the relevant dimensions are likely to be more successful than are earlier theories attempting to locate a single bottleneck in attention. In 1935, J. R. Stroop published his landmark article on attention and interference, an article more influential now than it was then. Why has the Stroop task continued to fascinate us? Perhaps the task is seen as tapping into the primitive operations of cognition, offering clues to the fundamental process of attention. Perhaps the robustness of the phenomenon provides a special challenge to decipher. Together these are powerful attractions
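The parallel-processing idea the review favors can be caricatured as two dimensions feeding a single response accumulator at the same time. The pathway strengths and threshold below are hypothetical and do not correspond to any specific published Stroop model; the point is only that an irrelevant word processed in parallel slows the ink-color response on incongruent trials.

# Schematic sketch of parallel processing of the relevant (ink-color) and irrelevant
# (word) dimensions. Strengths and threshold are hypothetical, purely illustrative.

COLOR_STRENGTH = 1.0   # evidence per step contributed by the ink color (relevant dimension)
WORD_STRENGTH = 0.5    # evidence per step contributed by the word (irrelevant dimension)
THRESHOLD = 100.0

def steps_to_respond(congruent: bool) -> int:
    """Accumulate evidence for the ink-color response; the word dimension is processed
    in parallel and either adds to (congruent) or subtracts from (incongruent) it."""
    evidence, steps = 0.0, 0
    while evidence < THRESHOLD:
        evidence += COLOR_STRENGTH + (WORD_STRENGTH if congruent else -WORD_STRENGTH)
        steps += 1
    return steps

if __name__ == "__main__":
    print("congruent trial  :", steps_to_respond(True), "steps")
    print("incongruent trial:", steps_to_respond(False), "steps")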
Toward an instance theory of automatization - Psychological Review, 1988
"... This article presents a theory in which automatization is construed as the acquisition of a domain-specific knowledge base, formed of separate representations, instances, of each exposure to the task. Processing is considered automatic if it relies on retrieval of stored instances, which will occur ..."
Abstract
-
Cited by 647 (38 self)
- Add to MetaCart
(Show Context)
This article presents a theory in which automatization is construed as the acquisition of a domain-specific knowledge base, formed of separate representations, instances, of each exposure to the task. Processing is considered automatic if it relies on retrieval of stored instances, which will occur only after practice in a consistent environment. Practice is important because it increases the amount retrieved and the speed of retrieval; consistency is important because it ensures that the retrieved instances will be useful. The theory accounts quantitatively for the power-function speed-up and predicts a power-function reduction in the standard deviation that is constrained to have the same exponent as the power function for the speed-up. The theory accounts for qualitative properties as well, explaining how some may disappear and others appear with practice. More generally, it provides an alternative to the modal view of automaticity, arguing that novice performance is limited by a lack of knowledge rather than a scarcity of resources. The focus on learning avoids many problems with the modal view that stem from its focus on resource limitations. Automaticity is an important phenomenon in everyday mental life. Most of us recognize that we perform routine activities quickly and effortlessly, with little thought and conscious awareness--in short, automatically (James, 1890). As a result, we often perform those activities on "automatic pilot" and turn our minds to other things. For example, we can drive to dinner while conversing in depth with a visiting scholar, or we can make coffee while planning dessert. However, these benefits may be offset by costs. The automatic pilot can lead us astray, causing errors and sometimes catastrophes (Reason & Mycielska, 1982). If the conversation is deep enough, we may find ourselves and the scholar arriving at the office rather than the restaurant, or we may discover that we aren't sure whether we put two or three scoops of coffee into the pot. Automaticity is also an important phenomenon in skill acquisition (e.g., Bryan & Harter, 1899). Skills are thought to consist largely of collections of automatic processes and procedures
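The retrieval race is easy to simulate: each encounter with an item stores one instance, and a trial's time is the fastest retrieval among the stored instances. The exponential retrieval-time distribution below is an assumption chosen for simplicity; it is enough to show the mean and the standard deviation shrinking together as instances accumulate.

# Illustrative simulation of the instance-theory race: performance time is the minimum
# of the stored instances' retrieval times. The exponential distribution and its mean
# are assumptions for illustration; the theory's prediction is a power-function drop
# in both the mean and the standard deviation with practice.
import random
import statistics

def trial_rt(n_instances: int, mean_retrieval: float = 1.0) -> float:
    """Time for one trial: the minimum retrieval time over all stored instances."""
    return min(random.expovariate(1.0 / mean_retrieval) for _ in range(n_instances))

if __name__ == "__main__":
    for n in (1, 2, 4, 8, 16, 32):
        times = [trial_rt(n) for _ in range(5000)]
        print(f"{n:2d} instances: mean {statistics.mean(times):.3f}, "
              f"sd {statistics.pstdev(times):.3f}")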
Verbal reports as data - Psychological Review, 1980
"... The central proposal of this article is that verbal reports are data. Accounting for verbal reports, as for other kinds of data, requires explication of the mech-anisms by which the reports are generated, and the ways in which they are sensitive to experimental factors (instructions, tasks, etc.). W ..."
Abstract
-
Cited by 513 (3 self)
- Add to MetaCart
The central proposal of this article is that verbal reports are data. Accounting for verbal reports, as for other kinds of data, requires explication of the mechanisms by which the reports are generated, and the ways in which they are sensitive to experimental factors (instructions, tasks, etc.). Within the theoretical framework of human information processing, we discuss different types of processes underlying verbalization and present a model of how subjects, in response to an instruction to think aloud, verbalize information that they are attending to in short-term memory (STM). Verbalizing information is shown to affect cognitive processes only if the instructions require verbalization of information that would not otherwise be attended to. From an analysis of what would be in STM at the time of report, the model predicts what can reliably be reported. The inaccurate reports found by other research are shown to result from requesting information that was never directly heeded, thus forcing subjects to infer rather than remember their mental processes. After a long period of time during which stimulus-response relations were at the focus of attention, research in psychology is now seeking to understand in detail the mechanisms and internal structure of cognitive processes that produce these relations. In the limiting case, we would like to have process models so explicit that they could actually produce the predicted behavior from the information in the stimulus.