Articles | Volume 15, issue 2
https://doi.org/10.5194/gmd-15-379-2022
© Author(s) 2022. This work is distributed under
the Creative Commons Attribution 4.0 License.
Evaluation and optimisation of the I/O scalability for the next generation of Earth system models: IFS CY43R3 and XIOS 2.0 integration as a case study
Xavier Yepes-Arbós
Barcelona Supercomputing Center – Centro Nacional de Supercomputación (BSC-CNS), Barcelona, Spain
Gijs van den Oord
Netherlands eScience Center (NLeSC), Amsterdam, the Netherlands
Mario C. Acosta
Barcelona Supercomputing Center – Centro Nacional de Supercomputación (BSC-CNS), Barcelona, Spain
Glenn D. Carver
European Centre for Medium-Range Weather Forecasts (ECMWF), Reading, United Kingdom
Related authors
Francisco J. Doblas-Reyes, Jenni Kontkanen, Irina Sandu, Mario Acosta, Mohammed Hussam Al Turjmam, Ivan Alsina-Ferrer, Miguel Andrés-Martínez, Leo Arriola, Marvin Axness, Marc Batlle Martín, Peter Bauer, Tobias Becker, Daniel Beltrán, Sebastian Beyer, Hendryk Bockelmann, Pierre-Antoine Bretonnière, Sebastien Cabaniols, Silvia Caprioli, Miguel Castrillo, Aparna Chandrasekar, Suvarchal Cheedela, Victor Correal, Emanuele Danovaro, Paolo Davini, Jussi Enkovaara, Claudia Frauen, Barbara Früh, Aina Gaya Àvila, Paolo Ghinassi, Rohit Ghosh, Supriyo Ghosh, Iker González, Katherine Grayson, Matthew Griffith, Ioan Hadade, Christopher Haine, Carl Hartick, Utz-Uwe Haus, Shane Hearne, Heikki Järvinen, Bernat Jiménez, Amal John, Marlin Juchem, Thomas Jung, Jessica Kegel, Matthias Kelbling, Kai Keller, Bruno Kinoshita, Theresa Kiszler, Daniel Klocke, Lukas Kluft, Nikolay Koldunov, Tobias Kölling, Joonas Kolstela, Luis Kornblueh, Sergey Kosukhin, Aleksander Lacima-Nadolnik, Jeisson Javier Leal Rojas, Jonni Lehtiranta, Tuomas Lunttila, Anna Luoma, Pekka Manninen, Alexey Medvedev, Sebastian Milinski, Ali Omar Abdelazim Mohammed, Sebastian Müller, Devaraju Naryanappa, Natalia Nazarova, Sami Niemelä, Bimochan Niraula, Henrik Nortamo, Aleksi Nummelin, Matteo Nurisso, Pablo Ortega, Stella Paronuzzi, Xabier Pedruzo Bagazgoitia, Charles Pelletier, Carlos Peña, Suraj Polade, Himansu Pradhan, Rommel Quintanilla, Tiago Quintino, Thomas Rackow, Jouni Räisänen, Maqsood Mubarak Rajput, René Redler, Balthasar Reuter, Nuno Rocha Monteiro, Francesc Roura-Adserias, Silva Ruppert, Susan Sayed, Reiner Schnur, Tanvi Sharma, Dmitry Sidorenko, Outi Sievi-Korte, Albert Soret, Christian Steger, Bjorn Stevens, Jan Streffing, Jaleena Sunny, Luiggi Tenorio, Stephan Thober, Ulf Tigerstedt, Oriol Tinto, Juha Tonttila, Heikki Tuomenvirta, Lauri Tuppi, Ginka Van Thielen, Emanuele Vitali, Jost von Hardenberg, Ingo Wagner, Nils Wedi, Jan Wehner, Sven Willner, Xavier Yepes-Arbós, Florian Ziemen, and Janos Zimmermann
EGUsphere, https://doi.org/10.5194/egusphere-2025-2198, 2025
This preprint is open for discussion and under review for Geoscientific Model Development (GMD).
Short summary
The Climate Change Adaptation Digital Twin (Climate DT) pioneers the operationalisation of climate projections. The system produces global simulations with local granularity for adaptation decision-making. Applications are embedded to generate tailored indicators. A unified workflow orchestrates all components across several supercomputers. Data management ensures consistency, and streaming enables real-time use. It is a complementary innovation to initiatives like CMIP, CORDEX, and climate services.
Ralf Döscher, Mario Acosta, Andrea Alessandri, Peter Anthoni, Thomas Arsouze, Tommi Bergman, Raffaele Bernardello, Souhail Boussetta, Louis-Philippe Caron, Glenn Carver, Miguel Castrillo, Franco Catalano, Ivana Cvijanovic, Paolo Davini, Evelien Dekker, Francisco J. Doblas-Reyes, David Docquier, Pablo Echevarria, Uwe Fladrich, Ramon Fuentes-Franco, Matthias Gröger, Jost v. Hardenberg, Jenny Hieronymus, M. Pasha Karami, Jukka-Pekka Keskinen, Torben Koenigk, Risto Makkonen, François Massonnet, Martin Ménégoz, Paul A. Miller, Eduardo Moreno-Chamarro, Lars Nieradzik, Twan van Noije, Paul Nolan, Declan O'Donnell, Pirkka Ollinaho, Gijs van den Oord, Pablo Ortega, Oriol Tintó Prims, Arthur Ramos, Thomas Reerink, Clement Rousset, Yohan Ruprich-Robert, Philippe Le Sager, Torben Schmith, Roland Schrödner, Federico Serva, Valentina Sicardi, Marianne Sloth Madsen, Benjamin Smith, Tian Tian, Etienne Tourigny, Petteri Uotila, Martin Vancoppenolle, Shiyu Wang, David Wårlind, Ulrika Willén, Klaus Wyser, Shuting Yang, Xavier Yepes-Arbós, and Qiong Zhang
Geosci. Model Dev., 15, 2973–3020, https://doi.org/10.5194/gmd-15-2973-2022, 2022
Short summary
The Earth system model EC-Earth3 is documented here. Key performance metrics show physical behavior and biases well within the frame known from recent models. With improved physical and dynamic features, new ESM components, community tools, and largely improved physical performance compared to the CMIP5 version, EC-Earth3 represents a clear step forward for the only European community ESM. We demonstrate here that EC-Earth3 is suited for a range of tasks in CMIP6 and beyond.
Sergi Palomas, Mario C. Acosta, Gladys Utrera, and Etienne Tourigny
Geosci. Model Dev., 18, 3661–3679, https://doi.org/10.5194/gmd-18-3661-2025, 2025
Short summary
We present an automatic tool that optimizes resource distribution in coupled climate models, enhancing speed and reducing computational costs without requiring expert knowledge. Users can set energy/time criteria or limit resource usage. Tested on various European Community Earth System Model (EC-Earth) configurations and high-performance computing (HPC) platforms, it achieved up to 34 % faster simulations with fewer resources.
Manuel G. Marciani, Miguel Castrillo, Gladys Utrera, Mario C. Acosta, Bruno P. Kinoshita, and Francisco Doblas-Reyes
EGUsphere, https://doi.org/10.5194/egusphere-2025-1104, 2025
Short summary
Earth system model simulations are executed with workflows on congested HPC resources. These workflows can comprise thousands of tasks which, if submitted naively, add overhead due to queueing for resources. In this paper we explore a technique that aggregates tasks into a single submission and relate it to a key factor used by the scheduling software. We find that this simple technique can reduce the time spent in the queue by up to 7 %.
Kai Rasmus Keller, Marta Alerany Solé, and Mario Acosta
EGUsphere, https://doi.org/10.5194/egusphere-2025-1367, 2025
Short summary
Can we be sure that different computing environments, which should not change the model climate, indeed leave the climate unaltered? In this article, we present a novel methodology that answers whether two model climates are statistically the same. Besides the new methodology, which detects significant differences between two model climates 60 % more accurately than a similar recent state-of-the-art method, we also provide an analysis of what actually constitutes a different climate.
Eduardo Moreno-Chamarro, Thomas Arsouze, Mario Acosta, Pierre-Antoine Bretonnière, Miguel Castrillo, Eric Ferrer, Amanda Frigola, Daria Kuznetsova, Eneko Martin-Martinez, Pablo Ortega, and Sergi Palomas
Geosci. Model Dev., 18, 461–482, https://doi.org/10.5194/gmd-18-461-2025, 2025
Short summary
We present the high-resolution model version of the EC-Earth global climate model to contribute to HighResMIP. The combined model resolution is about 10–15 km in both the ocean and atmosphere, which makes it one of the finest ever used to complete historical and scenario simulations. This model is compared with two lower-resolution versions, with a 100 km and a 25 km grid. The three models are compared with observations to study the improvements thanks to the increased resolution.
Colin G. Jones, Fanny Adloff, Ben B. B. Booth, Peter M. Cox, Veronika Eyring, Pierre Friedlingstein, Katja Frieler, Helene T. Hewitt, Hazel A. Jeffery, Sylvie Joussaume, Torben Koenigk, Bryan N. Lawrence, Eleanor O'Rourke, Malcolm J. Roberts, Benjamin M. Sanderson, Roland Séférian, Samuel Somot, Pier Luigi Vidale, Detlef van Vuuren, Mario Acosta, Mats Bentsen, Raffaele Bernardello, Richard Betts, Ed Blockley, Julien Boé, Tom Bracegirdle, Pascale Braconnot, Victor Brovkin, Carlo Buontempo, Francisco Doblas-Reyes, Markus Donat, Italo Epicoco, Pete Falloon, Sandro Fiore, Thomas Frölicher, Neven S. Fučkar, Matthew J. Gidden, Helge F. Goessling, Rune Grand Graversen, Silvio Gualdi, José M. Gutiérrez, Tatiana Ilyina, Daniela Jacob, Chris D. Jones, Martin Juckes, Elizabeth Kendon, Erik Kjellström, Reto Knutti, Jason Lowe, Matthew Mizielinski, Paola Nassisi, Michael Obersteiner, Pierre Regnier, Romain Roehrig, David Salas y Mélia, Carl-Friedrich Schleussner, Michael Schulz, Enrico Scoccimarro, Laurent Terray, Hannes Thiemann, Richard A. Wood, Shuting Yang, and Sönke Zaehle
Earth Syst. Dynam., 15, 1319–1351, https://doi.org/10.5194/esd-15-1319-2024, 2024
Short summary
We propose a number of priority areas for the international climate research community to address over the coming decade. Advances in these areas will both increase our understanding of past and future Earth system change, including the societal and environmental impacts of this change, and deliver significantly improved scientific support to international climate policy, such as future IPCC assessments and the UNFCCC Global Stocktake.
Mario C. Acosta, Sergi Palomas, Stella V. Paronuzzi Ticco, Gladys Utrera, Joachim Biercamp, Pierre-Antoine Bretonniere, Reinhard Budich, Miguel Castrillo, Arnaud Caubel, Francisco Doblas-Reyes, Italo Epicoco, Uwe Fladrich, Sylvie Joussaume, Alok Kumar Gupta, Bryan Lawrence, Philippe Le Sager, Grenville Lister, Marie-Pierre Moine, Jean-Christophe Rioual, Sophie Valcke, Niki Zadeh, and Venkatramani Balaji
Geosci. Model Dev., 17, 3081–3098, https://doi.org/10.5194/gmd-17-3081-2024, 2024
Short summary
We present a collection of performance metrics gathered during the Coupled Model Intercomparison Project Phase 6 (CMIP6), a worldwide initiative to study climate change. We analyse the metrics that resulted from collaboration efforts among many partners and models and describe our findings to demonstrate the utility of our study for the scientific community. The research contributes to understanding climate modelling performance on the current high-performance computing (HPC) architectures.
Vincent Huijnen, Philippe Le Sager, Marcus O. Köhler, Glenn Carver, Samuel Rémy, Johannes Flemming, Simon Chabrillat, Quentin Errera, and Twan van Noije
Geosci. Model Dev., 15, 6221–6241, https://doi.org/10.5194/gmd-15-6221-2022, 2022
Short summary
We report on the first implementation of atmospheric chemistry and aerosol as part of the OpenIFS model, based on the CAMS global model. We give an overview of the model and evaluate two reference model configurations, with and without the stratospheric chemistry extension, against a variety of observational datasets. This OpenIFS version with atmospheric composition components is open to the scientific user community under a standard OpenIFS license.
Marcus Falls, Raffaele Bernardello, Miguel Castrillo, Mario Acosta, Joan Llort, and Martí Galí
Geosci. Model Dev., 15, 5713–5737, https://doi.org/10.5194/gmd-15-5713-2022, 2022
Short summary
This paper describes and tests a method which uses a genetic algorithm (GA), a type of optimisation algorithm, on an ocean biogeochemical model. The aim is to produce a set of numerical parameters that best reflect the observed data of particulate organic carbon in a specific region of the ocean. We show that the GA can provide optimised model parameters in a robust and efficient manner and can also help detect model limitations, ultimately leading to a reduction in the model uncertainties.
Rolf Hut, Niels Drost, Nick van de Giesen, Ben van Werkhoven, Banafsheh Abdollahi, Jerom Aerts, Thomas Albers, Fakhereh Alidoost, Bouwe Andela, Jaro Camphuijsen, Yifat Dzigan, Ronald van Haren, Eric Hutton, Peter Kalverla, Maarten van Meersbergen, Gijs van den Oord, Inti Pelupessy, Stef Smeets, Stefan Verhoeven, Martine de Vos, and Berend Weel
Geosci. Model Dev., 15, 5371–5390, https://doi.org/10.5194/gmd-15-5371-2022, 2022
Short summary
With the eWaterCycle platform, we are providing the hydrological community with a platform to conduct their research that is fully compatible with the principles of both open science and FAIR science. The eWaterCycle platform gives easy access to well-known hydrological models, big datasets, and example experiments. Using eWaterCycle, hydrologists can easily compare the results from different models, couple models, and do more complex hydrological computational research.
Matthew L. Dawson, Christian Guzman, Jeffrey H. Curtis, Mario Acosta, Shupeng Zhu, Donald Dabdub, Andrew Conley, Matthew West, Nicole Riemer, and Oriol Jorba
Geosci. Model Dev., 15, 3663–3689, https://doi.org/10.5194/gmd-15-3663-2022, 2022
Short summary
Progress in identifying complex, mixed-phase physicochemical processes has resulted in an advanced understanding of the evolution of atmospheric systems but has also introduced a level of complexity that few atmospheric models were designed to handle. We present a flexible treatment for multiphase chemical processes for models of diverse scale, from box up to global models. This enables users to build a customized multiphase mechanism that is accessible to a much wider community.
Sarah Sparrow, Andrew Bowery, Glenn D. Carver, Marcus O. Köhler, Pirkka Ollinaho, Florian Pappenberger, David Wallom, and Antje Weisheimer
Geosci. Model Dev., 14, 3473–3486, https://doi.org/10.5194/gmd-14-3473-2021, 2021
Short summary
This paper describes how the research version of the European Centre for Medium-Range Weather Forecasts’ Integrated Forecast System is combined with climateprediction.net’s public volunteer computing resource to develop OpenIFS@home. Thousands of volunteer personal computers simulated slightly different realizations of Tropical Cyclone Karl to demonstrate the performance of the large-ensemble forecast. OpenIFS@Home offers researchers a new tool to study weather forecasts and related questions.
Pirkka Ollinaho, Glenn D. Carver, Simon T. K. Lang, Lauri Tuppi, Madeleine Ekblom, and Heikki Järvinen
Geosci. Model Dev., 14, 2143–2160, https://doi.org/10.5194/gmd-14-2143-2021, 2021
Short summary
OpenEnsemble 1.0 is a novel dataset that aims to open ensemble or probabilistic weather forecasting research up to the academic community. The dataset contains atmospheric states that are required for running model forecasts of atmospheric evolution. Our capacity to observe the atmosphere is limited; thus, a single reconstruction of the atmospheric state contains some errors. Our dataset provides sets of 50 slightly different atmospheric states so that these errors can be taken into account.
Short summary
Climate prediction models produce a large volume of simulated data that is sometimes not managed efficiently. In this paper we present an approach to address this issue by reducing the computing time and storage space. As a case study, we analyse the output writing process of the ECMWF atmospheric model IFS and integrate into it the data writing tool XIOS. The results suggest that the integration of the two components achieves adequate computational performance.