
Degradation of nuclear plant temperature sensors

Technical Report
A program was established and initial tests were performed to evaluate the long-term performance of resistance temperature detectors (RTDs) of the type used in US nuclear power plants. The effort addressed the effect of aging on RTD calibration accuracy and response time. This Phase I effort included exposure of thirteen nuclear safety-system-grade RTD elements to simulated LWR temperatures. Full calibrations were performed initially and monthly, the sensors were monitored and cross-checked continuously during exposure, and response time tests were performed before and after exposure. Short-term calibration drifts of as much as 1.8°F (1°C) were observed. Response times were essentially unaffected by this testing. This program shows that there is sound reason for concern about the accuracy of temperature measurements in nuclear power plants. These limited tests should be expanded in a Phase II program involving more sensors and longer exposures to simulated LWR conditions in order to obtain statistically significant data. These data are needed to establish meaningful testing or replacement intervals for safety system RTDs. An important corollary benefit of the expanded program would be a better definition of achievable accuracies in RTD calibration. This report concludes a six-month Phase I project performed for the Nuclear Regulatory Commission under the SBIR program.
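
For illustration only, the continuous cross-checking described in the abstract can be approximated by comparing each RTD's reading against the average of the remaining sensors and flagging any deviation beyond a drift limit. The Python sketch below is a minimal example under assumed names (a readings list and a drift_limit_f parameter set to the 1.8°F drift reported above); it is not the program's actual analysis code.

import numpy as np

def cross_check(readings_f, drift_limit_f=1.8):
    """Flag RTDs whose reading deviates from the ensemble average.

    readings_f    -- simultaneous RTD readings (deg F)
    drift_limit_f -- allowed deviation before a sensor is flagged;
                     1.8 deg F here echoes the drift observed in the
                     Phase I tests (an assumption for this sketch)
    """
    readings_f = np.asarray(readings_f, dtype=float)
    flagged = []
    for i in range(readings_f.size):
        # Compare sensor i against the mean of all other sensors
        others = np.delete(readings_f, i)
        deviation = readings_f[i] - others.mean()
        if abs(deviation) > drift_limit_f:
            flagged.append((i, deviation))
    return flagged

# Example: thirteen sensors near 600 deg F, with sensor 9 drifting high
readings = [600.1, 599.9, 600.0, 600.2, 599.8, 600.0, 600.1,
            599.9, 600.0, 602.3, 600.1, 599.9, 600.0]
print(cross_check(readings))   # sensor 9 flagged, deviation ~2.3 deg F

A cross-check of this kind only detects a sensor drifting relative to its peers; establishing absolute accuracy still requires the full laboratory calibrations performed monthly in the Phase I tests.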
Research Organization:
Analysis and Measurement Services, Knoxville, TN (USA); Nuclear Regulatory Commission, Washington, DC (USA). Div. of Engineering
OSTI ID:
6443522
Report Number(s):
NUREG/CR-4928; ON: TI87900824
Country of Publication:
United States
Language:
English