Model of Engagement for Educational Agents Based on Mouse and Keyboard Events
LaVonda Brown, Georgia Institute of Technology
LearnLab Workshop 2012, Carnegie Mellon University





Int’l Conference on Robotics and Automation 2010, Anchorage, Alaska. HumAnS Lab. Presentation

Outline

Purpose
Motivation
Background
Design
Hypotheses
Results
Conclusion
Future Work


Purpose

To develop a reliable, non-invasive method of monitoring academic engagement within the domain of computer-based education (CBE)


Motivation

Socially Interactive Robot Tutor (SIRT)
◦ Real-time monitoring of actions to determine level of engagement and understanding
◦ Adapt to educational needs through instructional scaffolding
◦ Engage children by forming a personal relationship, which is essential for longevity

[Figure: Platform]


Background

Teachers and tutors can observe a student's engagement in real time and employ strategies to reengage the student.

They determine engagement by following behavioral cues from students.

This improves attention, involvement, and motivation to learn.

Behavioral engagement is often related to a student's academic achievement.


Related Work

Scales/Surveys
◦ Used to evaluate motivation once the student has completed a system

Electroencephalography (EEG) signal measurements
◦ Able to identify subtle shifts in alertness, attention, and workload in real time

Eye Gaze and Head Pose
◦ Able to determine six user states in an e-learning environment: attentive, full of interest, frustrated/struggling to read, distracted, tired/sleepy, and not paying attention


Design

Eye gaze and head pose will serve as the baseline, which will be used to develop a novel model of student engagement based on mouse and keyboard events.

Two tests of high and low difficulty.

Three event processes will be monitored:
◦ Total Time: slow, average, or fast
◦ Response Validity: correct or incorrect
◦ Proper Function Execution: on-task or off-task
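The three monitored event processes can be sketched as a small data record. This is a minimal sketch: the time thresholds and field names are illustrative assumptions, not values from the study.

```python
from dataclasses import dataclass

# Illustrative time buckets; the actual slow/fast cutoffs used in the
# study are not given here, so these thresholds are assumptions.
FAST_THRESHOLD = 10.0   # seconds
SLOW_THRESHOLD = 30.0   # seconds

def classify_time(seconds: float) -> str:
    """Bucket Total Time into 'fast', 'average', or 'slow'."""
    if seconds <= FAST_THRESHOLD:
        return "fast"
    if seconds >= SLOW_THRESHOLD:
        return "slow"
    return "average"

@dataclass
class QuestionEvent:
    """One monitored question: the three event processes from the design."""
    total_time: float   # Total Time, in seconds
    correct: bool       # Response Validity
    on_task: bool       # Proper Function Execution

    @property
    def speed(self) -> str:
        return classify_time(self.total_time)
```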


Hypothesis 1

The student is engaged if his or her series of events (or combination of events) is classified as:
◦ On-task and correct (regardless of speed)
◦ On-task, slow or average, and incorrect
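The rule in Hypothesis 1 can be written directly as a predicate. A sketch of the stated classification, with the three event-process values passed as plain arguments (names are assumed):

```python
def is_engaged(on_task: bool, speed: str, correct: bool) -> bool:
    """Hypothesis 1 as a predicate: engaged iff on-task and correct
    (any speed), or on-task, slow/average, and incorrect."""
    if not on_task:
        return False                        # off-task is never engaged
    if correct:
        return True                         # OCS, OCA, OCF
    return speed in ("slow", "average")     # OIS, OIA engaged; OIF not
```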


Hypothesis 2

Eye gaze and head pose will not be an accurate measure of user state/engagement for the high-difficulty test.

The use of pencil and paper will create false negatives, since eye gaze will be directed toward the paper instead of the computer screen.


Hypothesis 3

Various combinations of the event processes can determine the following about the engaged student:
◦ If the student is on-task and has a series of fast responses with a series of correct answers, the student needs questions of higher difficulty.
◦ If the student is on-task and has a series of slow and/or average responses with a series of correct answers, the student does a great deal of thinking and understands the material.
◦ If the student is on-task and has a series of slow and/or average responses with a series of incorrect answers, the student lacks understanding and needs questions of lesser difficulty.
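The branches above map naturally to tutoring actions. A minimal sketch, assuming the same slow/average/fast and correct/incorrect labels; the action strings are hypothetical, not terms from the study:

```python
def recommend_action(on_task: bool, speed: str, correct: bool) -> str:
    """Map Hypothesis 3's event combinations to a tutoring action;
    action names are illustrative, not from the study."""
    if not on_task:
        return "reengage"               # off-task: outside Hypothesis 3
    if correct:
        if speed == "fast":
            return "raise difficulty"   # OCF: needs harder questions
        return "continue"               # OCS / OCA: understands material
    if speed in ("slow", "average"):
        return "lower difficulty"       # OIS / OIA: lacks understanding
    return "reengage"                   # OIF: disengaged per Hypothesis 1
```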


Results: Question Time




Results: Test Time


Results: Test Responses


Results: Event Combinations


This chart shows how often each combination of events occurred across both tests (S = slow, A = average, F = fast, C = correct, I = incorrect, O = on-task, O' = off-task). The combinations OIF, O'CS, O'CA, O'CF, O'IS, O'IA, and O'IF did not occur during this study.
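The combination codes can be enumerated mechanically. A short sketch that generates all twelve task/validity/speed codes and subtracts the seven reported as never occurring:

```python
from itertools import product

# O = on-task, O' = off-task; C = correct, I = incorrect;
# S = slow, A = average, F = fast
all_combos = {"".join(parts) for parts in product(("O", "O'"), "CI", "SAF")}

# Combinations the study reports as never occurring
never_observed = {"OIF", "O'CS", "O'CA", "O'CF", "O'IS", "O'IA", "O'IF"}

observed = sorted(all_combos - never_observed)
print(observed)   # the five combinations actually seen in the study
```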


Results: Eye Gaze


Conclusion

For a student to be engaged, he or she must be on-task, i.e., choose events that successfully execute the functions needed to navigate through the assessment.

However, if a student is on-task but answers incorrectly at a fast pace (OIF), he or she is classified as disengaged.

If a student is classified as off-task, we can automatically classify that student as disengaged (regardless of speed and/or response).


Future Work

Add a survey to the end of the experiment to better determine understanding, engagement, and difficulty.
◦ If the student is on-task and has a series of fast responses with a series of correct answers (OCF), the student may need questions of higher difficulty.
◦ If the student is on-task and has a series of slow responses with a series of correct answers (OCS), the student may understand the material and require more time to think.
◦ If the student is on-task and has a series of slow responses with a series of incorrect answers (OIS), the student may lack understanding and need questions of lesser difficulty.

This additional information will be used in the future to better integrate instructional scaffolding and adaptation with the device.


Questions?