Accelerating Science Discovery - Join the Discussion

Published by Peter Lincoln

DOE PAGES: the Department of Energy Public Access Gateway for Energy & Science

To help get the word out to researchers funded by the Department of Energy (DOE) at DOE national laboratories and research universities around the country, the DOE Office of Scientific and Technical Information (OSTI) and the DOE Oak Ridge National Laboratory (ORNL) have teamed up to produce a video about DOE PAGES, the DOE Public Access Gateway for Energy and Science.

DOE PAGES offers free public access to the best available full-text version of DOE-affiliated scholarly publications – either the peer-reviewed, accepted manuscript or the published scientific journal article – after an administrative interval of 12 months.  

Entitled “A Video Message about DOE PAGES for DOE-funded Authors of Scientific Publications,” the infographic video provides an introduction to the DOE portal to scholarly publications resulting from DOE research funding – and encourages DOE laboratory and grantee researchers to submit their accepted manuscripts to OSTI, which developed and maintains the repository for the Department. 

Published by Kathy Chambers

Harold C. Urey

Image credit: Energy.gov

To celebrate 70 years of advancing scientific knowledge, OSTI is featuring some of the leading scientists and works particularly relevant to the formation of DOE, OSTI, and their predecessor organizations and is highlighting Nobel laureates and other important research figures in DOE’s history.  Their accomplishments were key to the evolution of the Department of Energy, and OSTI’s collections include many of their publications. 

The pioneering work of American chemist and physicist Harold C. Urey on isotopes led to his discovery of deuterium in 1931 and earned him the 1934 Nobel Prize in Chemistry.  This discovery was one of his many contributions in several fields of science during his long and diverse career. 

By 1929, the theory of isotopes, or the idea that an individual element could consist of atoms with the same number of protons but with different masses, had been developed, and the less-abundant isotopes of carbon, nitrogen, and oxygen had been discovered.  Urey, who at the time was an associate professor at Columbia University, believed that isotopes of hydrogen could be more important, so he devised an experiment to look for them.

Published by Kathy Chambers

Ernest Orlando Lawrence
Ernest Orlando Lawrence.  Image credit: Energy.gov

To celebrate 70 years of advancing scientific knowledge, OSTI is featuring some of the leading scientists and works particularly relevant to the formation of DOE, OSTI, and their predecessor organizations and is highlighting Nobel laureates and other important research figures in DOE’s history.  Their accomplishments were key to the evolution of the Department of Energy, and OSTI’s collections include many of their publications. 

Ernest Orlando Lawrence’s love of science began at an early age and continued throughout his life.  His parents and grandparents were educators who encouraged hard work and curiosity.  While working on his Bachelor of Arts degree in chemistry at the University of South Dakota and considering a career in medicine, Lawrence was influenced by faculty mentors in physics and decided instead to pursue his graduate degree in physics at the University of Minnesota.  After completing his master’s degree, he studied for a year at the University of Chicago, where, in the words of a later observer, Lawrence “caught fire as a researcher.”  After Lawrence earned his Ph.D. in physics at Yale University in 1925, he stayed on for another three years as a National Research Fellow and an assistant professor of physics.  In 1928, Lawrence was recruited by the University of California, Berkeley, as an associate professor of physics.  Two years later, he became the youngest full professor at Berkeley.

Published by Kathy Chambers

Deep Learning Neural Networks - Mimicking the Human Brain
Image Credit: iStock.com/Henrik5000

If you have used your cell phone’s personal assistant to help you find your way, or taken advantage of its translation or speech-to-text programs, then you have benefited from a deep learning neural network architecture.  Inspired by the human brain’s ability to learn, deep learning neural networks are based on a class of machine learning algorithms that can learn to find patterns and closely represent those patterns at many levels.  As additional information is received, the network refines those patterns, gains experience, and improves the accuracy of its predictions, essentially learning from its mistakes.  This is called “deep learning” because the networks involved have a depth of more than just a few layers. 
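
To make that layered, learn-from-its-mistakes loop concrete, here is a minimal sketch of a tiny two-layer network trained by gradient descent on a toy problem.  The dataset, layer sizes, and learning rate are illustrative assumptions for this post, not code from any DOE project or specific research tool.

```python
# Minimal sketch of a small "deep" network learning from its mistakes.
# Toy XOR data, layer sizes, and learning rate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: XOR, a pattern no single-layer model can represent.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Two layers of weights and biases: input -> hidden -> output.
W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)

lr = 1.0
for step in range(10000):
    # Forward pass: each layer re-represents the input pattern.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Error signal: how far the current predictions are from the truth.
    err = out - y

    # Backward pass: push the error back through both layers so the
    # network adjusts its weights, i.e., learns from its mistakes.
    grad_out = err * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0)

# Predictions approach [0, 1, 1, 0] as training proceeds
# (exact values depend on the random starting weights).
print(np.round(out.ravel(), 2))
```

Each pass over the data nudges every weight a little, which is the "refines those patterns, gains experience" behavior described above; real deep networks simply stack many more such layers and train on far larger datasets.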

Basic deep learning concepts were developed many years ago; with today’s availability of high-performance computing environments and massive datasets, there has been a resurgence of deep learning neural network research throughout the science community.  Scalable tools are being developed to train these networks, and brain-inspired computing algorithms are achieving state-of-the-art results on tasks in areas such as visual object classification, speech and image recognition, bioinformatics, neuroscience, language modeling, and natural language understanding. 

Published by Kathy Chambers

X-ray imaging shows how memristors work at an atomic scale
X-ray imaging shows how memristors work at an atomic scale.  Image credit: SLAC National Accelerator Laboratory

A tiny device called a memristor holds great promise for a new era of electronics.  Unlike a conventional resistor, its resistance can be reset, and it remembers its resistance.  It functions in a way that is similar to synapses in the human brain, where neurons pass and receive information.  A memristor is a two-terminal device whose resistance depends on the voltages applied to it in the past.  When the voltage is turned off, the resistance stays where it was, in effect remembering its previous state.  This little device actually learns.  A commercially viable memristor could enable us to move away from flash memory and silicon-based computing to smart, energy-efficient computers that operate similarly to the human brain, with the capability to comprehend speech and images and with highly advanced memory retention.
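
As a rough sketch of that voltage-history dependence, the snippet below steps through a simplified linear ion-drift memristor model (in the spirit of the widely cited HP Labs formulation); every parameter value here is an illustrative assumption, not a measurement of any real device.

```python
# Minimal sketch of a memristor's memory effect using a simplified
# linear ion-drift model. All parameter values are illustrative assumptions.
import numpy as np

R_on, R_off = 100.0, 16_000.0   # fully-on / fully-off resistance (ohms), illustrative
mu = 1e-14                      # ion mobility term (m^2 / (s*V)), illustrative
D = 1e-8                        # device thickness (m), illustrative
w = 0.1                         # internal state in [0, 1]: fraction of the "on" region
dt = 1e-3                       # simulation time step (s)

def step(voltage, w):
    """Advance the internal state one time step under an applied voltage."""
    R = R_on * w + R_off * (1.0 - w)          # current resistance set by the state
    i = voltage / R                           # instantaneous current (Ohm's law)
    w = w + dt * (mu * R_on / D**2) * i       # state drifts with the charge that flows
    return min(max(w, 0.0), 1.0), R           # keep the state inside [0, 1]

# Apply a positive voltage: current flows and the resistance drops.
for _ in range(2000):
    w, R = step(1.0, w)
print(f"after +1 V pulse: R = {R:.0f} ohms")

# Turn the voltage off: the state, and therefore the resistance, stays put.
for _ in range(2000):
    w, R = step(0.0, w)
print(f"after 0 V rest:   R = {R:.0f} ohms (remembered)")

# Apply a negative voltage: the resistance is pushed back up (reset).
for _ in range(2000):
    w, R = step(-1.0, w)
print(f"after -1 V pulse: R = {R:.0f} ohms")
```

Under a positive voltage the resistance falls, with the voltage removed it holds its value, and a negative voltage resets it – the remember-and-reset behavior described above.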