Machine learning reduces hazards in nuclear power plants
Nuclear researcher Juliana Pacheco Duarte is part of a team using nearly $800,000 in funding from the U.S. Department of Energy to build new prediction models for hazards that could impact nuclear power plants.
According to Duarte, nuclear energy has the potential to rapidly expand its share of the world’s energy market, but advances in technology must come with new methods for risk management. The phrase “risk management in a nuclear power plant” may immediately call to mind the avoidance of a nuclear explosion or a radiation leak like the one at Chernobyl, but this risk management begins with a plantwide assessment to ensure that existing local hazards do not force an unsafe shutdown of a plant. Duarte’s group will create a general risk reduction methodology to more accurately predict local hazards that, if not managed, could lead to significant events.
Though they don’t usually spell environmental disaster, local hazards can still carry big costs. Aside from the radioactive elements, there are many combustible sources within a power plant. Control rooms, pump rooms, and turbine rooms are composed of complex networks of mechanical and electronic systems that demand their own safety standards. The loss of any of these systems could mean a major repair cost in addition to the loss of energy production.
Duarte’s team includes Virginia Tech Professor Brian Lattimer, University of Wisconsin-Madison faculty Jun Wang and Michael Corradini, and Convergent Science co-owner and vice president Kelly Senecal. The group will combine expertise in the mechanics of fire, engineering physics, and custom machine learning to build predictive models for probabilistic risk assessments.
The Department of Energy’s push to improve hazard prediction models follows a period of progress in understanding the properties of fire. The proposal indicates that, while new data and measurement methods have steadily advanced the understanding of fire, many of the applied risk management models in place for nuclear power plants are based on data collected before these improvements. Duarte’s team will identify what new data are needed and provide new tools to set the stage for decreased uncertainty.
To make those connections, the group will use Monte Carlo simulation, statistical analysis, and machine learning to reduce the uncertainty in describing a hazard event. This will be achieved by running a series of simulations that vary the hazard event conditions, performing statistical analysis on existing data and simulation results to identify the parameters that most significantly affect the hazard level, and using deep learning models and statistics to identify how to lower the hazard uncertainty. This will give a clearer picture of any gaps in the current data and inform a more complete approach to risk reduction.
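As a rough illustration of that workflow, the sketch below pairs a Monte Carlo sampling of varied hazard conditions with a simple statistical ranking of which inputs drive the outcome. The parameter names, distributions, and toy hazard metric are hypothetical placeholders for illustration only, not the team’s actual fire models or risk assessments.

```python
# Minimal sketch: Monte Carlo sampling of hazard event conditions plus a simple
# sensitivity ranking. All parameters and the toy hazard formula are hypothetical.
import numpy as np

rng = np.random.default_rng(seed=0)
n_runs = 10_000

# Sample uncertain hazard event conditions (assumed distributions, illustration only).
heat_release_kw  = rng.lognormal(mean=6.0, sigma=0.5, size=n_runs)    # fire intensity
ventilation_m3s  = rng.uniform(0.5, 5.0, size=n_runs)                 # room ventilation
cable_distance_m = rng.normal(3.0, 0.8, size=n_runs).clip(min=0.5)    # distance to cabling

# Toy stand-in for a hazard metric, e.g. margin before cable damage (placeholder physics).
hazard_metric = cable_distance_m**2 * ventilation_m3s / (heat_release_kw / 1000.0)

# Statistical analysis: rank which inputs most strongly affect the hazard metric.
inputs = {
    "heat_release_kw": heat_release_kw,
    "ventilation_m3s": ventilation_m3s,
    "cable_distance_m": cable_distance_m,
}
for name, values in inputs.items():
    corr = np.corrcoef(values, hazard_metric)[0, 1]
    print(f"{name:17s} correlation with hazard metric: {corr:+.2f}")

# The spread across runs quantifies the current uncertainty the team aims to reduce.
print(f"hazard metric: mean={hazard_metric.mean():.2f}, std={hazard_metric.std():.2f}")
```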
This application of machine learning with simulation data will also give field engineers new ways to more accurately assess the hazards for their specific plant. Field conditions are typically not the same as the conditions in an existing data set. Machine learning models developed in this research will be able to rapidly determine the appropriate hazard event conditions for the conditions actually observed in the field.
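A minimal sketch of that idea appears below: a model trained on simulation results is queried with plant-specific field conditions that fall between the simulated cases. The feature names, the synthetic training data, and the choice of a random-forest regressor are assumptions made for illustration, not the team’s actual models.

```python
# Minimal sketch: train a surrogate model on simulation results, then query it
# with field conditions. Data, features, and model choice are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(seed=1)

# Simulated conditions: columns = [heat release (kW), ventilation (m^3/s), distance (m)].
X_sim = rng.uniform([100.0, 0.5, 0.5], [2000.0, 5.0, 6.0], size=(5000, 3))
y_sim = X_sim[:, 2]**2 * X_sim[:, 1] / (X_sim[:, 0] / 1000.0)  # toy hazard metric

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_sim, y_sim)

# Field conditions rarely match the simulated cases exactly; the trained model
# provides a rapid hazard estimate for the conditions measured on site.
field_conditions = np.array([[850.0, 2.3, 4.1]])
print("estimated hazard metric for field conditions:", model.predict(field_conditions)[0])
```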
While the scope of the initial call from the Department of Energy was centered on fire modeling, the group’s techniques could also be used for other kinds of prediction. By adapting this methodology, the impact of additional events, such as hurricanes, earthquakes, and tsunamis, could also be anticipated.
Written by Alex Parrish