OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Jet-images — deep learning edition

Abstract

Building on the notion of a particle physics detector as a camera and of the collimated streams of high energy particles, or jets, that it measures as images, we investigate the potential of machine learning techniques based on deep learning architectures to identify highly boosted W bosons. Modern deep learning algorithms trained on jet images can outperform standard physically motivated, feature-driven approaches to jet tagging. We develop techniques for visualizing how these features are learned by the network and what additional information is used to improve performance. Finally, this interplay between physically motivated, feature-driven tools and supervised learning algorithms is general and can be used to significantly increase the sensitivity to discover new particles and new forces, and to gain a deeper understanding of the physics within jets.
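
As a rough illustration of the approach summarized above, the sketch below builds a small convolutional network that classifies single-channel jet images as boosted W boson signal versus background. It is a minimal example under stated assumptions: the 25x25 pixel grid, the Keras/TensorFlow toolkit, the layer sizes, and the placeholder random data are illustrative choices and do not reproduce the architecture, preprocessing, or training setup studied in the paper.

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Assumed input: a 25x25 eta-phi grid of calorimeter transverse energies,
# one channel per image. Real inputs would come from simulated jet samples.
IMG_SHAPE = (25, 25, 1)

def build_jet_image_classifier():
    """Small CNN returning P(jet is a boosted W) for one jet image."""
    return keras.Sequential([
        layers.Conv2D(32, (5, 5), activation="relu", padding="same",
                      input_shape=IMG_SHAPE),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),
    ])

if __name__ == "__main__":
    model = build_jet_image_classifier()
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=[keras.metrics.AUC()])
    # Random placeholder arrays standing in for preprocessed jet images
    # and labels (signal = 1, background = 0).
    x = np.random.rand(128, *IMG_SHAPE).astype("float32")
    y = np.random.randint(0, 2, size=(128, 1)).astype("float32")
    model.fit(x, y, epochs=1, batch_size=32, verbose=0)
    print(model.predict(x[:2], verbose=0))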

Authors:
de Oliveira, Luke [1]; Kagan, Michael [2]; Mackey, Lester [3]; Nachman, Benjamin [2]; Schwartzman, Ariel [2]
  1. Stanford Univ., CA (United States). Inst. for Computational and Mathematical Engineering
  2. SLAC National Accelerator Lab., Menlo Park, CA (United States)
  3. Stanford Univ., CA (United States). Dept. of Statistics
Publication Date:
2016-07-13
Research Org.:
SLAC National Accelerator Lab., Menlo Park, CA (United States)
Sponsoring Org.:
USDOE Office of Science (SC)
Contributing Org.:
Stanford Univ., CA (United States)
OSTI Identifier:
1271300
Grant/Contract Number:
AC02-76SF00515
Resource Type:
Journal Article: Accepted Manuscript
Journal Name:
Journal of High Energy Physics (Online)
Additional Journal Information:
Journal Name: Journal of High Energy Physics (Online); Journal Volume: 2016; Journal Issue: 7; Journal ID: ISSN 1029-8479
Publisher:
Springer Berlin
Country of Publication:
United States
Language:
English
Subject:
72 PHYSICS OF ELEMENTARY PARTICLES AND FIELDS; 97 MATHEMATICS AND COMPUTING; jet substructure; hadron-hadron scattering (experiments)

Citation Formats

de Oliveira, Luke, Kagan, Michael, Mackey, Lester, Nachman, Benjamin, and Schwartzman, Ariel. Jet-images — deep learning edition. United States: N. p., 2016. Web. doi:10.1007/JHEP07(2016)069.
de Oliveira, Luke, Kagan, Michael, Mackey, Lester, Nachman, Benjamin, & Schwartzman, Ariel. Jet-images — deep learning edition. United States. doi:10.1007/JHEP07(2016)069.
de Oliveira, Luke, Kagan, Michael, Mackey, Lester, Nachman, Benjamin, and Schwartzman, Ariel. Wed Jul 13, 2016. "Jet-images — deep learning edition". United States. doi:10.1007/JHEP07(2016)069. https://www.osti.gov/servlets/purl/1271300.
@article{osti_1271300,
title = {Jet-images — deep learning edition},
author = {de Oliveira, Luke and Kagan, Michael and Mackey, Lester and Nachman, Benjamin and Schwartzman, Ariel},
abstractNote = {Building on the notion of a particle physics detector as a camera and of the collimated streams of high energy particles, or jets, that it measures as images, we investigate the potential of machine learning techniques based on deep learning architectures to identify highly boosted W bosons. Modern deep learning algorithms trained on jet images can outperform standard physically motivated, feature-driven approaches to jet tagging. We develop techniques for visualizing how these features are learned by the network and what additional information is used to improve performance. Finally, this interplay between physically motivated, feature-driven tools and supervised learning algorithms is general and can be used to significantly increase the sensitivity to discover new particles and new forces, and to gain a deeper understanding of the physics within jets.},
doi = {10.1007/JHEP07(2016)069},
journal = {Journal of High Energy Physics (Online)},
number = 7,
volume = 2016,
place = {United States},
year = {2016},
month = {jul}
}

Journal Article:
Free Publicly Available Full Text
Publisher's Version of Record

Citation Metrics:
Cited by: 13 works
Citation information provided by
Web of Science
