U.S. Department of Energy
Office of Scientific and Technical Information

Using after-action review based on automated performance assessment to enhance training effectiveness.

Conference
OSTI ID:1026982

Training simulators have become increasingly popular tools for instructing humans on performance in complex environments. However, how to provide individualized, scenario-specific assessment and feedback to students remains largely an open question. In this work, we follow up on previous evaluations of the Automated Expert Modeling and Automated Student Evaluation (AEMASE) system, which automatically assesses student performance based on observed examples of good and bad performance in a given domain. The current study provides a rigorous empirical evaluation of the enhanced training effectiveness achievable with this technology. In particular, we found that students given feedback via the AEMASE-based debrief tool performed significantly better than students given only instructor feedback on two of three domain-specific performance metrics.
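The abstract describes AEMASE as assessing a student against observed examples of good and bad performance. One plausible way to realize that idea, sketched below purely as an illustration (the actual AEMASE algorithm is not specified here), is nearest-neighbor classification of a student's behavior trace against expert-labeled example traces; the feature vectors, distance metric, and choice of k are all assumptions for this sketch.

```python
import math

def distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def assess(student_trace, labeled_examples, k=3):
    """Score a student trace as 'good' or 'bad' by majority vote
    among the k nearest expert-labeled example traces.

    labeled_examples: list of (feature_vector, label) pairs,
    where label is 'good' or 'bad'.
    """
    nearest = sorted(labeled_examples,
                     key=lambda ex: distance(student_trace, ex[0]))[:k]
    votes = sum(1 if label == "good" else -1 for _, label in nearest)
    return "good" if votes > 0 else "bad"

# Hypothetical features, e.g. (normalized accuracy, timing, resource use).
examples = [
    ((0.9, 0.95, 0.2), "good"),
    ((0.8, 0.90, 0.3), "good"),
    ((0.3, 0.40, 0.9), "bad"),
    ((0.2, 0.35, 0.8), "bad"),
]
print(assess((0.85, 0.92, 0.25), examples))  # trace near the good cluster
```

In a debrief tool of the kind the study evaluated, such per-trace scores could then be surfaced to the student alongside the matched examples, giving scenario-specific feedback without requiring a hand-built expert model.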

Research Organization:
Sandia National Laboratories (SNL), Albuquerque, NM, and Livermore, CA (United States)
Sponsoring Organization:
USDOE
DOE Contract Number:
AC04-94AL85000
Report Number(s):
SAND2010-6514C; TRN: US201121%%193
Resource Relation:
Conference: Proposed for presentation at the Human Factors and Ergonomics Society Meetings held September 27-October 1, 2010 in San Francisco, CA.
Country of Publication:
United States
Language:
English