Accelerating Science Discovery - Join the Discussion

Published by Peter Lincoln
DOE PAGES, the Department of Energy Public Access Gateway for Energy & Science

To help get the word out to researchers funded by the Department of Energy (DOE) at DOE national laboratories and research universities around the country, the DOE Office of Scientific and Technical Information (OSTI) and DOE's Oak Ridge National Laboratory (ORNL) have teamed up to produce a video about DOE PAGES, the DOE Public Access Gateway for Energy and Science.

DOE PAGES offers free public access to the best available full-text version of DOE-affiliated scholarly publications – either the peer-reviewed accepted manuscript or the published scientific journal article – after an administrative interval of 12 months.
Published by Kathy Chambers
Harold C. Urey.  Image credit: Energy.gov

To celebrate 70 years of advancing scientific knowledge, OSTI is featuring some of the leading scientists and works particularly relevant to the formation of DOE, OSTI, and their predecessor organizations and is highlighting Nobel laureates and other important research figures in DOE's history.  Their accomplishments were key to the evolution of the Department of Energy, and OSTI's collections include many of their publications. The pioneering work of American chemist and physicist Harold C.
Published by Kathy Chambers
Ernest Orlando Lawrence.  Image credit: Energy.gov

To celebrate 70 years of advancing scientific knowledge, OSTI is featuring some of the leading scientists and works particularly relevant to the formation of DOE, OSTI, and their predecessor organizations and is highlighting Nobel laureates and other important research figures in DOE's history.  Their accomplishments were key to the evolution of the Department of Energy, and OSTI's collections include many of their publications.

Ernest Orlando Lawrence's love of science began at an early age and continued throughout his life.  His parents and grandparents were educators and encouraged hard work and curiosity.  While working on his Bachelor of Arts degree in chemistry at the University of South Dakota and thinking of pursuing a career in medicine, Lawrence was influenced by faculty mentors in the field of physics and decided instead to pursue his graduate degree in physics at the University of Minnesota.  After completing his master's degree, he studied for a year at the University of Chicago, where Lawrence "caught fire as a researcher," in the words of a later observer.  After Lawrence earned his Ph.D.
Published by Kathy Chambers
Deep Learning Neural Networks – Mimicking the Human Brain.  Image credit: iStock.com/Henrik5000

If you have used your cell phone's personal assistant to help you find your way or taken advantage of its translation or speech-to-text programs, then you have benefitted from a deep learning neural network architecture.  Inspired by the human brain's ability to learn, deep learning neural networks are based on a class of machine learning algorithms that can learn to find patterns and closely represent those patterns at many levels.  As additional information is received, the network refines those patterns, gains experience, and improves its probabilities, essentially learning from its mistakes.  This is called "deep learning" because the networks involved have a depth of more than just a few layers.

Basic deep learning concepts were developed many years ago; with today's availability of high-performance computing environments and massive datasets, there has been a resurgence of deep learning neural network research throughout the science community.  Scalable tools are being developed to train these networks, and brain-inspired computing algorithms are achieving state-of-the-art results on tasks such as visual object classification, speech and image recognition, bioinformatics, neuroscience, language modeling, and natural language understanding.

Improvements in computational energy efficiency and throughput are being realized in neurosynaptic or cognitive neural network architectures.  A great example of this is the Lawrence Livermore National Laboratory (LLNL) and IBM Research collaboration to build a new brain-inspired supercomputer.  The hardware and software ecosystem is based on IBM's breakthrough neu
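As a rough illustration of the "learning from its mistakes" idea described above, here is a minimal sketch of a feedforward network trained by gradient descent. The task (XOR), layer sizes, learning rate, and every name in the code are illustrative choices for this sketch, not details from the article.

```python
import numpy as np

# A tiny feedforward network: patterns are represented layer by layer,
# and errors flowing backward refine the weights over many passes.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

# Two weight layers (2 inputs -> 4 hidden units -> 1 output) plus biases
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

lr, losses = 0.5, []
for _ in range(5000):
    # Forward pass: each layer re-represents the patterns in the data
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # Backward pass: the "mistakes" (errors) adjust every layer's weights
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

A network with real depth simply stacks more such hidden layers; the refine-on-error loop is the same.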
Published by Kathy Chambers
X-ray imaging shows how memristors work at an atomic scale.  Image credit: SLAC National Accelerator Laboratory

A tiny device called a memristor holds great promise for a new era of electronics.  Unlike a conventional resistor, its resistance can be reset, and it remembers its resistance.  It functions in a way that is similar to synapses in the human brain, where neurons pass and receive information.  A memristor is a two-terminal device whose resistance depends on the voltages applied to it in the past.  When the voltage is turned off, the resistance remains, or remembers, where it was previously.  This little device actually learns.  A commercially viable memristor could enable us to move away from flash memory and silicon-based computing to smart, energy-efficient computers that operate similarly to the human brain, with the capability to comprehend speech and images, and with highly advanced memory retention.

The memristor was first predicted theoretically by University of California, Berkeley professor Leon Chua in 1971 as the fourth basic electrical circuit element alongside the resistor, capacitor, and inductor.  He named his device a memristor – a contraction of the words "memory" and "resistor."  Chua's concept, as originally described, involved magnetic flux in the memristor's operation.  But in 2008, when Richard Stanley Williams and researchers at Hewlett-Packard engineered a non-magnetic device based on other long-known material properties, their description of it in terms of Chua's memristor concept rocked the electronics research community.

During the past decade, memristor designs, materials, and behavior have been explored by the Hewlett
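The "remembers its resistance" behavior can be sketched numerically with the linear ion-drift model often used to describe the 2008 Hewlett-Packard device: applying a voltage moves the boundary between doped and undoped regions, changing the resistance, and removing the voltage freezes that state. All parameter values below are illustrative assumptions, not figures from the article.

```python
# Toy linear ion-drift memristor model (illustrative parameters only).
R_ON, R_OFF = 100.0, 16000.0   # resistance when fully doped / undoped (ohms)
D = 10e-9                      # device thickness (m)
MU = 1e-14                     # dopant mobility (m^2 / (V*s))
DT = 1e-6                      # time step (s)

def memristance(w):
    """Resistance as a weighted mix of the doped and undoped regions."""
    return R_ON * (w / D) + R_OFF * (1.0 - w / D)

def step(w, v):
    """Advance the doped-region width w under applied voltage v."""
    i = v / memristance(w)            # Ohm's law gives the current
    w = w + MU * (R_ON / D) * i * DT  # dopant drift moves the boundary
    return min(max(w, 0.0), D)        # boundary stays inside the device

w = 0.1 * D
m_initial = memristance(w)

for _ in range(2000):                 # apply +1 V: resistance drops
    w = step(w, 1.0)
m_after_write = memristance(w)

for _ in range(2000):                 # voltage off: no drift, state retained
    w = step(w, 0.0)
m_after_rest = memristance(w)

print(f"{m_initial:.1f} -> {m_after_write:.1f} ohms "
      f"(after rest: {m_after_rest:.1f})")
```

With the voltage off the current is zero, so the boundary (and hence the resistance) stays exactly where it was – the model's version of the memory effect described above.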