Results 1 - 5 of 5
Automatic Detection of Learning-Centered Affective States
"... Affect detection is a key component in developing intelligent educational interfaces that are capable of responding to the affective needs of students. In this paper, computer vision and machine learning techniques were used to detect students ’ affect as they used an educational game designed to te ..."
Abstract
-
Cited by 3 (3 self)
- Add to MetaCart
(Show Context)
Affect detection is a key component in developing intelligent educational interfaces that are capable of responding to the affective needs of students. In this paper, computer vision and machine learning techniques were used to detect students' affect as they used an educational game designed to teach fundamental principles of Newtonian physics. Data were collected in the real-world environment of a school computer lab, which provides unique challenges for detection of affect from facial expressions (primary channel) and gross body movements (secondary channel): up to thirty students at a time participated in the class, moving around, gesturing, and talking to each other. Results were cross-validated at the student level to ensure generalization to new students. Classification was successful at levels above chance for off-task behavior (area under the receiver operating characteristic curve, AUC = .816) and each affective state, including boredom (AUC = .610), confusion (.649), delight (.867), engagement (.679), and frustration (.631), as well as a five-way overall classification of affect (.655), despite the noisy nature of the data. Implications and prospects for affect-sensitive interfaces for educational software in classroom environments are discussed.
Author Keywords: Affect detection; naturalistic facial expressions; classroom data; in the wild.
ACM Classification Keywords: H.5.m. Information interfaces and presentation (e.g., HCI).
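The student-level cross-validation reported above is the key methodological safeguard: folds are split by student, so a detector is always scored on students it has never seen. A minimal sketch of that protocol, assuming scikit-learn; the feature matrix, binary affect labels, and student IDs below are synthetic stand-ins for the paper's facial-expression and body-movement features.

    # Student-level cross-validation: folds are grouped by student ID so the
    # classifier is always evaluated on students absent from the training set.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import GroupKFold

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 20))             # facial/body features (synthetic)
    y = rng.integers(0, 2, size=300)           # binary affect label, e.g. boredom
    students = rng.integers(0, 30, size=300)   # one ID per student

    aucs = []
    for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups=students):
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        scores = clf.predict_proba(X[test_idx])[:, 1]
        aucs.append(roc_auc_score(y[test_idx], scores))
    print(f"mean AUC over student-level folds: {np.mean(aucs):.3f}")

Grouping by student matters because frames from one student are strongly correlated; a per-frame random split would leak identity cues into the test folds and inflate the AUCs.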
The Additive Value of Multimodal Features for Predicting Engagement, Frustration, and Learning During Tutoring
Proceedings of the 16th International Conference on Multimodal Interaction, ACM, 2014
"... ABSTRACT Detecting learning-centered affective states is difficult, yet crucial for adapting most effectively to users. Within tutoring in particular, the combined context of student task actions and tutorial dialogue shape the student's affective experience. As we move toward detecting affect ..."
Abstract
-
Cited by 2 (0 self)
- Add to MetaCart
(Show Context)
Detecting learning-centered affective states is difficult, yet crucial for adapting most effectively to users. Within tutoring in particular, the combined context of student task actions and tutorial dialogue shapes the student's affective experience. As we move toward detecting affect, we may also supplement the task and dialogue streams with rich sensor data. In a study of introductory computer programming tutoring, human tutors communicated with students through a text-based interface. Automated approaches were leveraged to annotate dialogue, task actions, facial movements, postural positions, and hand-to-face gestures. These dialogue, nonverbal behavior, and task action input streams were then used to predict retrospective student self-reports of engagement and frustration, as well as pretest/posttest learning gains. The results show that the combined set of multimodal features is the most predictive, indicating an additive effect. Additionally, the findings demonstrate that the role of nonverbal behavior may depend on the dialogue and task context in which it occurs. This line of research identifies contextual and behavioral cues that may be leveraged in future adaptive multimodal systems.
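The "additive effect" claim can be read as a feature-ablation comparison: each modality is evaluated alone and then concatenated with the others. A minimal sketch under that reading, assuming scikit-learn; the three synthetic streams stand in for the dialogue, task-action, and nonverbal features and are not the authors' actual feature sets.

    # Compare each modality alone against the concatenated multimodal set.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n = 120
    y = rng.integers(0, 2, size=n)                      # e.g. high/low frustration
    dialogue = rng.normal(size=(n, 5)) + y[:, None] * 0.3
    task = rng.normal(size=(n, 5)) + y[:, None] * 0.3
    nonverbal = rng.normal(size=(n, 5)) + y[:, None] * 0.3

    streams = {"dialogue": dialogue, "task": task, "nonverbal": nonverbal,
               "combined": np.hstack([dialogue, task, nonverbal])}
    for name, X in streams.items():
        acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
        print(f"{name:>9}: mean CV accuracy = {acc:.3f}")

If the combined row outperforms every single stream, the modalities carry complementary signal, which is the additive effect the abstract describes.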
Predicting Learning and Affect from Multimodal Data Streams in Task-Oriented Tutorial Dialogue
"... Learners experience a wide array of cognitive and affective states during tutoring. Detecting and responding to these states is a core problem of adaptive learning environments that aim to foster motivation and increase learning. Recognizing learner affect through nonverbal behavior is particularly ..."
Abstract
-
Cited by 2 (1 self)
- Add to MetaCart
(Show Context)
Learners experience a wide array of cognitive and affective states during tutoring. Detecting and responding to these states is a core problem for adaptive learning environments that aim to foster motivation and increase learning. Recognizing learner affect through nonverbal behavior is particularly challenging, as students display affect across numerous modalities. This study utilizes an automatically extracted set of multimodal nonverbal behaviors and task actions to predict learning and affect in a data set of sixty-three computer-mediated human tutoring sessions. Predictive models of post-session self-reported engagement, frustration, and learning were evaluated with leave-one-out cross-validation. Nonverbal behaviors conditioned on task events and typing were found to be more predictive than incoming student self-efficacy and pretest score. Face and gesture were predictive of engagement and frustration, while face and posture were predictive of learning. The nonverbal model features captured moments when students were most active on the task, such as writing and testing the Java program. These results provide initial evidence linking affect, moment-by-moment multimodal nonverbal behavior, and task performance during tutoring. They improve understanding of learner affect and enable automated tutorial interventions that adapt to student states as a highly effective human tutor would.
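The leave-one-out evaluation mentioned here is straightforward to sketch: with sixty-three sessions, each model trains on sixty-two and predicts the held-out one. A minimal sketch assuming scikit-learn; the per-session features and the engagement target below are synthetic placeholders, not the study's data.

    # Leave-one-session-out evaluation of a session-level regression model.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(2)
    X = rng.normal(size=(63, 8))    # per-session multimodal features (synthetic)
    y = X[:, 0] * 0.5 + rng.normal(scale=0.5, size=63)   # e.g. engagement rating

    # Each of the 63 folds trains on 62 sessions and predicts the held-out one.
    preds = cross_val_predict(Ridge(alpha=1.0), X, y, cv=LeaveOneOut())
    r = np.corrcoef(y, preds)[0, 1]
    print(f"held-out correlation with self-reports: r = {r:.2f}")

Leave-one-out is a common choice at this scale: with only sixty-three sessions, larger held-out folds would leave too little training data per model.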
Predicting Learning and Engagement in Tutorial Dialogue: A Personality-Based Model
"... ABSTRACT A variety of studies have established that users with different personality profiles exhibit different patterns of behavior when interacting with a system. Although patterns of behavior have been successfully used to predict cognitive and affective outcomes of an interaction, little work h ..."
Abstract
- Add to MetaCart
(Show Context)
A variety of studies have established that users with different personality profiles exhibit different patterns of behavior when interacting with a system. Although patterns of behavior have been successfully used to predict cognitive and affective outcomes of an interaction, little work has been done to identify the variations in these patterns based on user personality profile. In this paper, we model sequences of facial expressions, postural shifts, hand-to-face gestures, system interaction events, and textual dialogue messages of a user interacting with a human tutor in a computer-mediated tutorial session. We use these models to predict the user's learning gain, frustration, and engagement at the end of the session. In particular, we examine the behavior of users based on their Extraversion trait score from a Big Five Factor personality survey. The analysis reveals a variety of personality-specific sequences of behavior that are significantly indicative of cognitive and affective outcomes. These results could impact user experience design of future interactive systems.
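One plausible reading of the personality-based sequence analysis, sketched below: split users on their Extraversion score and mine short behavior sequences within each group. The event vocabulary, median split, and bigram counting are illustrative assumptions, not the authors' exact sequence-modeling procedure.

    # Mine behavior bigrams separately for low- and high-Extraversion users.
    from collections import Counter
    import numpy as np

    rng = np.random.default_rng(3)
    events = ["smile", "frown", "posture_shift", "hand_to_face", "message"]
    users = [{"extraversion": rng.uniform(1, 5),
              "sequence": list(rng.choice(events, size=30))} for _ in range(40)]

    median = np.median([u["extraversion"] for u in users])
    groups = [("low", [u for u in users if u["extraversion"] <= median]),
              ("high", [u for u in users if u["extraversion"] > median])]
    for label, group in groups:
        bigrams = Counter()
        for u in group:
            seq = u["sequence"]
            bigrams.update(zip(seq, seq[1:]))   # consecutive event pairs
        print(f"{label}-Extraversion top bigrams: {bigrams.most_common(3)}")

Bigrams whose frequencies differ between the two groups would then be tested as predictors of learning gain, frustration, and engagement.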
Learner Characteristics and Dialogue: Recognising Effective and Student-Adaptive Tutorial Strategies