DOE Patents | U.S. Department of Energy
Office of Scientific and Technical Information

Title: Preemptive cache management policies for processing units

Abstract

A processing system includes at least one central processing unit (CPU) core, at least one graphics processing unit (GPU) core, a main memory, and a coherence directory for maintaining cache coherence. The at least one CPU core receives a CPU cache flush command to flush cache lines stored in cache memory of the at least one CPU core prior to launching a GPU kernel. The coherence directory transfers data associated with a memory access request by the at least one GPU core from the main memory without issuing coherence probes to caches of the at least one CPU core.
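The sequence the abstract describes can be sketched from the host side. The C++ fragment below is only an illustrative sketch under stated assumptions, not the patented implementation: the x86 _mm_clflush/_mm_sfence intrinsics stand in for the CPU cache flush command, launch_gpu_kernel() is a hypothetical placeholder for whatever kernel-launch API the platform provides, and the coherence-directory behavior is hardware-level, so it appears only in comments.

#include <immintrin.h>  // _mm_clflush, _mm_sfence (x86 intrinsics)
#include <cstddef>
#include <cstdint>

// Hypothetical placeholder for the platform's GPU kernel-launch API.
static void launch_gpu_kernel(void* /*shared_buf*/, std::size_t /*bytes*/) {
    // A real runtime would enqueue the kernel on the GPU core here.
}

// Flush every cache line backing [buf, buf + bytes) out of the CPU caches,
// mirroring the CPU cache flush command the abstract describes issuing
// before the GPU kernel launch.
static void flush_cpu_caches(const void* buf, std::size_t bytes) {
    constexpr std::uintptr_t kLineSize = 64;  // typical x86 cache-line size
    std::uintptr_t addr = reinterpret_cast<std::uintptr_t>(buf) & ~(kLineSize - 1);
    const std::uintptr_t end = reinterpret_cast<std::uintptr_t>(buf) + bytes;
    for (; addr < end; addr += kLineSize) {
        _mm_clflush(reinterpret_cast<const void*>(addr));
    }
    _mm_sfence();  // ensure the flushed lines reach memory before the launch
}

void run_kernel_with_preemptive_flush(void* shared_buf, std::size_t bytes) {
    // 1. Preemptively flush the CPU-side copies of the shared data.
    flush_cpu_caches(shared_buf, bytes);

    // 2. Launch the GPU kernel. With the CPU caches clean, the coherence
    //    directory can serve the GPU core's memory requests directly from
    //    main memory without issuing coherence probes to the CPU caches.
    launch_gpu_kernel(shared_buf, bytes);
}

Without such a preemptive flush, the directory would have to probe the CPU caches on GPU accesses to potentially shared data; the policy in the abstract avoids that probe traffic by cleaning the CPU caches before the kernel starts.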

Inventors:
Kayiran, Onur; Loh, Gabriel H.; Eckert, Yasuko
Issue Date:
05/28/2019
Research Org.:
Lawrence Livermore National Laboratory (LLNL), Livermore, CA (United States)
Sponsoring Org.:
USDOE
OSTI Identifier:
1568401
Patent Number(s):
10303602
Application Number:
15/475,435
Assignee:
Advanced Micro Devices, Inc. (Santa Clara, CA)
Patent Classifications (CPCs):
G - PHYSICS G06 - COMPUTING G06F - ELECTRIC DIGITAL DATA PROCESSING
DOE Contract Number:  
AC52-07NA27344; B609201
Resource Type:
Patent
Resource Relation:
Patent File Date: 03/31/2017
Country of Publication:
United States
Language:
English

Citation Formats

Kayiran, Onur, Loh, Gabriel H., and Eckert, Yasuko. Preemptive cache management policies for processing units. United States: N. p., 2019. Web.
Kayiran, Onur, Loh, Gabriel H., & Eckert, Yasuko. Preemptive cache management policies for processing units. United States.
Kayiran, Onur, Loh, Gabriel H., and Eckert, Yasuko. Tue May 28, 2019. "Preemptive cache management policies for processing units". United States. https://www.osti.gov/servlets/purl/1568401.
@article{osti_1568401,
title = {Preemptive cache management policies for processing units},
author = {Kayiran, Onur and Loh, Gabriel H. and Eckert, Yasuko},
abstractNote = {A processing system includes at least one central processing unit (CPU) core, at least one graphics processing unit (GPU) core, a main memory, and a coherence directory for maintaining cache coherence. The at least one CPU core receives a CPU cache flush command to flush cache lines stored in cache memory of the at least one CPU core prior to launching a GPU kernel. The coherence directory transfers data associated with a memory access request by the at least one GPU core from the main memory without issuing coherence probes to caches of the at least one CPU core.},
doi = {},
place = {United States},
year = {2019},
month = {5}
}

Works referenced in this record:

System and Method for Simplifying Cache Coherence Using Multiple Write Policies
patent-application, September 2013

Method and System for Shutting Down Active Core Based Caches
patent-application, June 2014

Cache Coherency Using Die-Stacked Memory Device with Logic Die
patent-application, June 2014