Smarter Experiments for Faster Materials Discovery
August 30, 2019 | Brookhaven National Laboratory
A team of scientists from the U.S. Department of Energy's Brookhaven National Laboratory and Lawrence Berkeley National Laboratory designed, created, and successfully tested a new algorithm to make smarter scientific measurement decisions. The algorithm, a form of artificial intelligence (AI), can make autonomous decisions to define and perform the next step of an experiment. The team described the capabilities and flexibility of their new measurement tool in a paper published on August 14, 2019, in Scientific Reports.
From Galileo and Newton to the recent discovery of gravitational waves, performing scientific experiments to understand the world around us has been the driving force of our technological advancement for hundreds of years. Improving the way researchers do their experiments can have tremendous impact on how quickly those experiments yield applicable results for new technologies.
Over the past few decades, researchers have sped up their experiments through automation and an ever-growing assortment of fast measurement tools. However, some of the most interesting and important scientific challenges—such as creating improved battery materials for energy storage or novel quantum materials for new types of computers—still require very demanding and time-consuming experiments.
By creating a new decision-making algorithm as part of a fully automated experimental setup, the interdisciplinary team from two of Brookhaven's DOE Office of Science user facilities—the Center for Functional Nanomaterials (CFN) and the National Synchrotron Light Source II (NSLS-II)—and Berkeley Lab's Center for Advanced Mathematics for Energy Research Applications (CAMERA) makes it possible to study these challenges more efficiently.
The challenge of complexity
The goal of many experiments is to gain knowledge about the material that is studied, and scientists have a well-tested way to do this: They take a sample of the material and measure how it reacts to changes in its environment.
A standard approach for scientists at user facilities like NSLS-II and CFN is to manually scan through the measurements from a given experiment to determine the next area where they might want to run an experiment. But access to these facilities' high-end materials-characterization tools is limited, so measurement time is precious. A research team might have only a few days to measure their materials, so they need to make the most of each measurement.
“The key to achieving a minimum number of measurements and maximum quality of the resulting model is to go where uncertainties are large,” said Marcus Noack, a postdoctoral scholar at CAMERA and lead author of the study. “Performing measurements there will most effectively reduce the overall model uncertainty.”
As Kevin Yager, a co-author and CFN scientist, pointed out, “The final goal is not only to take data faster but also to improve the quality of the data we collect. I think of it as experimentalists switching from micromanaging their experiment to managing at a higher level. Instead of having to decide where to measure next on the sample, the scientists can instead think about the big picture, which is ultimately what we as scientists are trying to do.”
“This new approach is an applied example of artificial intelligence,” said co-author Masafumi Fukuto, a scientist at NSLS-II. “The decision-making algorithm is replacing the intuition of the human experimenter and can scan through the data and make smart decisions about how the experiment should proceed.”
More information for less?
In practice, before starting an experiment, the scientists define a set of goals they want to get out of the measurement. With these goals set, the algorithm looks at the previously measured data while the experiment is ongoing to determine the next measurement. In its search for the best next measurement, the algorithm creates a surrogate model of the data—an educated guess as to how the material will behave in the next possible steps—and calculates the uncertainty, basically how confident it is in its guess, for each possible next step. Based on this, it then selects the most uncertain option to measure next. The trick is that by picking the most uncertain step, the algorithm maximizes the amount of knowledge it gains from that measurement. The algorithm not only maximizes the information gain during the measurement; it also determines when to end the experiment by identifying the moment when any additional measurements would yield no further knowledge.
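The loop described above—fit a surrogate model to the data so far, find where the model is most uncertain, measure there, and stop once uncertainty everywhere is low—can be sketched in a few lines. The sketch below is not the team's actual code; it is a minimal illustration using a Gaussian-process surrogate (a common choice for this kind of autonomous experiment), with a simple sine function standing in for the real instrument. The function names, kernel, length scale, and stopping threshold are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential kernel between two sets of 1-D points."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

def next_measurement(x_measured, y_measured, x_candidates, noise=1e-6):
    """Fit a Gaussian-process surrogate to the data collected so far and
    return the candidate point with the largest posterior uncertainty."""
    K = rbf_kernel(x_measured, x_measured) + noise * np.eye(len(x_measured))
    K_star = rbf_kernel(x_candidates, x_measured)
    K_inv = np.linalg.inv(K)
    mean = K_star @ K_inv @ y_measured            # surrogate model prediction
    # Posterior variance: prior variance (1.0 for this kernel) minus the
    # reduction contributed by the measurements already taken.
    var = 1.0 - np.einsum('ij,jk,ik->i', K_star, K_inv, K_star)
    std = np.sqrt(np.clip(var, 0.0, None))
    best = np.argmax(std)                          # most uncertain location
    return x_candidates[best], mean, std

# Autonomous loop: measure, refit, pick the next point, and stop once the
# remaining uncertainty falls below a (hypothetical) threshold.
f = lambda x: np.sin(3 * x)                        # stand-in for the instrument
x_meas = np.array([0.0, 2.0])                      # two initial measurements
y_meas = f(x_meas)
grid = np.linspace(0.0, 2.0, 101)                  # candidate measurement points
for _ in range(10):
    x_next, _, std = next_measurement(x_meas, y_meas, grid)
    if std.max() < 0.05:                           # nothing left worth measuring
        break
    x_meas = np.append(x_meas, x_next)
    y_meas = np.append(y_meas, f(x_next))
```

Each pass through the loop plays the role of one measurement decision: the surrogate is refit to everything measured so far, and the next "beam time" is spent where the model is least sure of itself rather than on a uniform scan.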