https://doi.org/10.5194/gmd-2021-402

Submitted as: development and technical paper | 16 Dec 2021

Review status: this preprint is currently under review for the journal GMD.

Conservation laws in a neural network architecture: Enforcing the atom balance of a Julia-based photochemical model (v0.2.0)

Patrick Obin Sturm1 and Anthony S. Wexler1,2
  • 1Air Quality Research Center, University of California, Davis, California 95616 USA
  • 2Departments of Mechanical and Aerospace Engineering, Civil and Environmental Engineering, and Land, Air and Water Resources, University of California, Davis, California 95616 USA

Abstract. Models of atmospheric phenomena provide insight into climate, air quality, and meteorology, and offer a mechanism for understanding the effects of future emissions scenarios. To represent atmospheric phenomena accurately, these models consume vast quantities of computational resources. Machine learning (ML) techniques such as neural networks have the potential to emulate compute-intensive components of these models and so reduce their computational burden. However, such ML surrogate models may produce nonphysical predictions that are difficult to uncover. Here we present a neural network architecture that enforces conservation laws. Instead of simply predicting the properties of interest, a physically interpretable hidden layer within the network predicts fluxes between properties, which are subsequently related to the properties of interest. As an example, we design a physics-constrained neural network surrogate model of photochemistry using this approach and find that it conserves atoms as they flow between molecules to machine precision, while outperforming a naïve neural network in terms of accuracy and non-negativity of concentrations.
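
The conservation mechanism summarized above can be made concrete with a small sketch: if a fixed stoichiometry-like matrix S (species × fluxes) is atom-balanced with respect to an atom-composition matrix A (atoms × species), so that A·S = 0, then any update of the form Δc = S·f conserves every atom total no matter what fluxes f the trained layers predict. The Julia snippet below is a minimal toy illustration of that idea only; the mechanism, matrices, weights, and function names are all made up for this sketch and are not taken from the paper's v0.2.0 code.

# Toy sketch of a conservation-enforcing output layer (illustrative only, not the paper's code).
using LinearAlgebra, Random

# Toy O3/O2/O system, tracking a single atom type (oxygen).
A = [3.0 2.0 1.0]               # atom-composition matrix: O atoms per molecule of O3, O2, O

# Stoichiometric matrix S (species × fluxes), balanced so that A * S == 0:
#   flux 1: O3 -> O2 + O        flux 2: O2 + O -> O3
S = [-1.0  1.0;
      1.0 -1.0;
      1.0 -1.0]
@assert all(A * S .== 0)        # the toy mechanism is atom-balanced

# Stand-in for the trained part of the network: concentrations -> fluxes.
Random.seed!(0)
W1, b1 = randn(8, 3), randn(8)
W2, b2 = randn(2, 8), randn(2)
predict_fluxes(c) = W2 * tanh.(W1 * c .+ b1) .+ b2

# Physics-constrained update: the predicted fluxes pass through the fixed,
# non-trainable matrix S, so atom totals cannot change regardless of the weights.
timestep(c) = c .+ S * predict_fluxes(c)

c0 = [1.0, 0.6, 0.2]            # arbitrary toy concentrations
c1 = timestep(c0)
println(A * c1 - A * c0)        # ≈ [0.0]: oxygen atoms conserved to machine precision

The same argument extends to any number of atom types: as long as every column of S is balanced (A·S = 0), errors in the learned fluxes affect the predicted concentrations but never the atom balance.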

Status: open (until 11 Feb 2022)

Comment types: AC – author | RC – referee | CC – community | EC – editor | CEC – chief editor
  • CC1: 'Model comparison and naming conventions', Oscar Jacquot, 14 Jan 2022
    • CC2: 'Reply on CC1', Oscar Jacquot, 19 Jan 2022
  • RC1: 'Comment on gmd-2021-402', Anonymous Referee #1, 18 Jan 2022

Viewed

Total article views: 368 (including HTML, PDF, and XML)
  • HTML: 278
  • PDF: 81
  • XML: 9
  • Total: 368
  • BibTeX: 1
  • EndNote: 0
Cumulative views and downloads (calculated since 16 Dec 2021)

Viewed (geographical distribution)

Total article views: 344 (including HTML, PDF, and XML), of which 344 have a defined geographical origin and 0 an unknown origin.
Latest update: 27 Jan 2022
Short summary
Large air quality and climate models require vast amounts of computational power. Machine learning tools like neural networks can be used to make these models more efficient, with the downside that their results might not make physical sense or be easy to interpret. This work develops a physically interpretable neural network that obeys scientific laws like conservation of mass, and models atmospheric composition more accurately than a traditional neural network.