DOE PAGES: U.S. Department of Energy
Office of Scientific and Technical Information

This content will become publicly available on February 15, 2020

Title: How to Host An Effective Data Competition: Statistical Advice for Competition Design and Analysis

Abstract

Data competitions rely on real-time leaderboards to rank competitor entries and stimulate algorithm improvement. While such competitions have become popular and prevalent, particularly in supervised learning formats, their implementations by the host vary widely. Without careful planning, a supervised learning competition is vulnerable to overfitting, where the winning solutions are so closely tuned to the particular set of provided data that they cannot generalize to the underlying problem of interest to the host. Based on our experience, this paper outlines important considerations for strategically designing relevant and informative data sets to maximize the learning outcome from hosting a competition. It also describes a postcompetition analysis that enables robust and efficient assessment of the strengths and weaknesses of solutions from different competitors, as well as greater understanding of the regions of the input space that are well solved. The postcompetition analysis, which complements the leaderboard, uses exploratory data analysis and generalized linear models (GLMs). The GLMs not only expand the range of results we can explore but also provide more detailed analysis of individual subquestions, including similarities and differences between algorithms across different types of scenarios, universally easy or hard regions of the input space, and different learning objectives. When coupled with a strategically planned data generation approach, these methods provide richer and more informative summaries that enhance the interpretation of results beyond the leaderboard rankings. The methods are illustrated with a recently completed competition to evaluate algorithms capable of detecting, identifying, and locating radioactive materials in an urban environment.
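
To make the GLM-based postcompetition analysis concrete, the minimal sketch below (Python, using pandas and statsmodels; it is not the authors' code) fits a binomial (logistic) GLM to simulated per-run competition results, modeling the probability of a correct identification as a function of which algorithm produced the entry and hypothetical scenario factors. The column names and factors (source_type, standoff_m) are illustrative assumptions, not taken from the competition data.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated stand-in for per-run competition results: one row per
# (algorithm, scenario run), with a binary outcome indicating whether
# the source was correctly identified. The scenario descriptors
# (source_type, standoff_m) are hypothetical.
rng = np.random.default_rng(0)
n = 300
results = pd.DataFrame({
    "algorithm":   rng.choice(["A", "B", "C"], size=n),
    "source_type": rng.choice(["type1", "type2", "type3"], size=n),
    "standoff_m":  rng.choice([1.0, 2.5, 5.0], size=n),
})
# Make correct identifications less likely at larger standoff distances.
results["correct"] = rng.binomial(1, 0.9 - 0.1 * results["standoff_m"])

# Binomial GLM: algorithm and scenario effects on the probability of a
# correct identification, giving per-factor comparisons rather than a
# single aggregate leaderboard score.
fit = smf.glm(
    "correct ~ C(algorithm) + C(source_type) + standoff_m",
    data=results,
    family=sm.families.Binomial(),
).fit()
print(fit.summary())

The coefficient table from a fit of this kind is what supports comparisons of algorithms across scenario types and identification of universally easy or hard regions of the input space, as described in the abstract.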

Authors:
Anderson-Cook, Christine Michaela [1]; Myers, Kary Lynn [1]; Lu, Lu [2]; Fugate, Michael Lynn [1]; Quinlan, Kevin [3]; Pawley, Norma Helen [1]
  1. Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
  2. Univ. of South Florida, Tampa, FL (United States)
  3. Pennsylvania State Univ., University Park, PA (United States)
Publication Date:
February 2019
Research Org.:
Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
Sponsoring Org.:
U.S. Department of Homeland Security; USDOE
OSTI Identifier:
1526960
Report Number(s):
LA-UR-18-31113
Journal ID: ISSN 1932-1864
Grant/Contract Number:  
89233218CNA000001
Resource Type:
Accepted Manuscript
Journal Name:
Statistical Analysis and Data Mining
Additional Journal Information:
Journal Name: Statistical Analysis and Data Mining; Journal ID: ISSN 1932-1864
Publisher:
Wiley
Country of Publication:
United States
Language:
English
Subject:
97 MATHEMATICS AND COMPUTING

Citation Formats

Anderson-Cook, Christine Michaela, Myers, Kary Lynn, Lu, Lu, Fugate, Michael Lynn, Quinlan, Kevin, and Pawley, Norma Helen. How to Host An Effective Data Competition: Statistical Advice for Competition Design and Analysis. United States: N. p., 2019. Web. doi:10.1002/sam.11404.
Anderson-Cook, Christine Michaela, Myers, Kary Lynn, Lu, Lu, Fugate, Michael Lynn, Quinlan, Kevin, & Pawley, Norma Helen. How to Host An Effective Data Competition: Statistical Advice for Competition Design and Analysis. United States. doi:10.1002/sam.11404.
Anderson-Cook, Christine Michaela, Myers, Kary Lynn, Lu, Lu, Fugate, Michael Lynn, Quinlan, Kevin, and Pawley, Norma Helen. 2019. "How to Host An Effective Data Competition: Statistical Advice for Competition Design and Analysis". United States. doi:10.1002/sam.11404.
@article{osti_1526960,
title = {How to Host An Effective Data Competition: Statistical Advice for Competition Design and Analysis},
author = {Anderson-Cook, Christine Michaela and Myers, Kary Lynn and Lu, Lu and Fugate, Michael Lynn and Quinlan, Kevin and Pawley, Norma Helen},
abstractNote = {Data competitions rely on real-time leaderboards to rank competitor entries and stimulate algorithm improvement. While such competitions have become quite popular and prevalent, particularly in supervised learning formats, their implementations by the host are highly variable. Without careful planning, a supervised learning competition is vulnerable to overfitting, where the winning solutions are so closely tuned to the particular set of provided data that they cannot generalize to the underlying problem of interest to the host. This paper outlines some important considerations for strategically designing relevant and informative data sets to maximize the learning outcome from hosting a competition based on our experience. It also describes a postcompetition analysis that enables robust and efficient assessment of the strengths and weaknesses of solutions from different competitors, as well as greater understanding of the regions of the input space that are well-solved. The postcompetition analysis, which complements the leaderboard, uses exploratory data analysis and generalized linear models (GLMs). The GLMs not only expand the range of results we can explore, they also provide more detailed analysis of individual subquestions including similarities and differences between algorithms across different types of scenarios, universally easy or hard regions of the input space, and different learning objectives. When coupled with a strategically planned data generation approach, the methods provide richer and more informative summaries to enhance the interpretation of results beyond just the rankings on the leaderboard. The methods are illustrated with a recently completed competition to evaluate algorithms capable of detecting, identifying, and locating radioactive materials in an urban environment.},
doi = {10.1002/sam.11404},
journal = {Statistical Analysis and Data Mining},
number = {},
volume = {},
place = {United States},
year = {2019},
month = {2}
}
