SKRIPS v1.0: a regional coupled ocean–atmosphere modeling framework (MITgcm–WRF) using ESMF/NUOPC, description and preliminary results for the Red Sea

A new regional coupled ocean–atmosphere model is developed and its implementation is presented in this paper. The coupled model is based on two open-source community model components: the MITgcm ocean model and the Weather Research and Forecasting (WRF) atmosphere model. The coupling between these components is performed using ESMF (Earth System Modeling Framework) and implemented according to National United Operational Prediction Capability (NUOPC) protocols. The coupled model is named the Scripps–KAUST Regional Integrated Prediction System (SKRIPS). SKRIPS is demonstrated with a real-world example by simulating a 30 d period including a series of extreme heat events occurring on the eastern shore of the Red Sea region in June 2012. The results obtained by using the coupled model, along with those in forced stand-alone oceanic or atmospheric simulations, are compared with observational data and reanalysis products. We show that the coupled model is capable of performing coupled ocean–atmosphere simulations, although all configurations of coupled and uncoupled models have good skill in modeling the heat events. In addition, a scalability test is performed to investigate the parallelization of the coupled model. The results indicate that the coupled model code scales well and the ESMF/NUOPC coupler accounts for less than 5 % of the total computational resources in the Red Sea test case. The coupled model and documentation are available at https://library.ucsd.edu/dc/collection/bb1847661c (last access: 26 September 2019), and the source code is maintained at https://github.com/iurnus/scripps_kaust_model (last access: 26 September 2019).


Introduction
Accurate and efficient forecasting of oceanic and atmospheric circulation is essential for a wide variety of high-impact societal needs, including extreme weather and climate events (Kharin and Zwiers, 2000; Chen et al., 2007), environmental protection and coastal management (Warner et al., 2010), management of fisheries (Roessig et al., 2004), marine conservation (Harley et al., 2006), water resources (Fowler and Ekström, 2009), and renewable energy (Barbariol et al., 2013). Effective forecasting relies on high model fidelity and accurate initialization of the models with the observed state of the coupled ocean-atmosphere system.

In the past ten years, many regional coupled models have been developed using modern model toolkits (Zou and Zhou, 2012; Turuncoglu et al., 2013; Turuncoglu, 2019) and include waves (Warner et al., 2010; Chen and Curcic, 2016), sediment transport (Warner et al., 2010), sea ice (Van Pham et al., 2014), and chemistry packages (He et al., 2015). However, it is still desirable and useful to develop a new coupled regional ocean-atmosphere model implemented using an efficient coupling framework and with state estimation capabilities. The goal of this work is to (1) introduce the design of a newly developed regional coupled ocean-atmosphere modeling system, (2) describe the implementation of the modern coupling framework, (3) present preliminary simulation results in the Red Sea region, and (4) demonstrate and discuss the parallelization of the coupled model. In the coupled system, the oceanic model component is the MIT general circulation model (MITgcm) (Marshall et al., 1997) and the atmospheric model component is the Weather Research and Forecasting (WRF) model (Skamarock et al., 2005).
To couple the model components in the present work, the Earth System Modeling Framework (ESMF) (Hill et al., 2004) is used because of its advantages in conservative re-gridding capability, calendar management, logging and error handling, and parallel communications. The National United Operational Prediction Capability (NUOPC) layer in ESMF is also used (Sitz et al., 2017). The additional NUOPC wrapper layer between the coupled model and ESMF simplifies the implementation of component synchronization, execution, and other common tasks in the coupling. The innovations in our work are that (1) we use ESMF/NUOPC, a community-supported, computationally efficient coupling software for Earth system models, and (2) we couple MITgcm with WRF. The resulting coupled model is being developed as a forecasting tool for coupled data assimilation and subseasonal-to-seasonal (S2S) forecasting. By coupling WRF and MITgcm for the first time with ESMF, we provide an alternative regional coupled model resource to a wider community of users. These atmospheric and oceanic model components have an active and well-supported user base.
After testing of the new coupled model, we demonstrate it on a series of heat wave events that occurred on the eastern shore of the Red Sea region in June 2012. The simulated surface variables of the Red Sea (e.g., sea surface temperature, 2-m temperature, and surface heat fluxes) are examined and validated against available observational data and reanalysis products.
To assess the improvements gained from the coupled simulation, the results are compared with those obtained using stand-alone oceanic or atmospheric models. A full investigation of the importance of coupling for these extreme events is outside the scope of this paper, which focuses on the technical aspects. In addition, a scalability test of the coupled model is performed to investigate its parallel capability.
The rest of this paper is organized as follows. The description of the individual modeling components and the design of the coupled modeling system are detailed in Section 2. Section 3 introduces the experimental design, validation data, and analysis methodology. Section 4 discusses the results obtained from the coupled model. Section 5 details the parallelization test of the coupled model. The last section concludes the paper and presents an outlook for future work.


Model Description

The newly developed regional coupled modeling system is introduced in this section. The general design of the coupled model, descriptions of the individual components, and the ESMF/NUOPC coupling framework are presented below.

General design
The schematic description of the coupled model is shown in Fig. 1. From WRF to MITgcm, the coupler collects the surface atmospheric variables (i.e., solar radiation, turbulent heat flux, wind velocity, precipitation, evaporation) from WRF and updates the surface forcing variables (net heat flux, wind stress, freshwater flux) to drive MITgcm. From MITgcm to WRF, the coupler collects SST and ocean surface velocity from MITgcm and uses them as the surface boundary condition in WRF. Re-gridding of the data from either model component is performed by the coupler, and various coupling intervals and schemes can be specified via ESMF (Hill et al., 2004).
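The field exchange described above can be summarized in a small listing (a hypothetical summary for illustration only; these names are not the actual NUOPC standard field names used in SKRIPS):

```python
# Hypothetical listing of the boundary fields exchanged through the
# coupler; the names are illustrative, not NUOPC standard field names.
EXCHANGED_FIELDS = {
    "wrf_to_mitgcm": [            # atmospheric exports -> ocean forcing
        "net_shortwave_radiation",
        "net_longwave_radiation",
        "latent_heat_flux",
        "sensible_heat_flux",
        "wind_velocity_10m",
        "precipitation",
        "evaporation",
    ],
    "mitgcm_to_wrf": [            # oceanic exports -> atmosphere boundary
        "sea_surface_temperature",
        "ocean_surface_velocity",
    ],
}
```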

MITgcm Ocean Model
The MITgcm (Marshall et al., 1997) is a 3-D, finite-volume, general circulation model used by a broad community of researchers for a wide range of applications at various spatial and temporal scales. The model code and documentation, which are under continuous development, are available on the MITgcm webpage http://mitgcm.org/. The 'Checkpoint 66h' (June 2017) version of MITgcm is used in the present work.

The MITgcm is designed to run on high-performance computing (HPC) platforms and can run in non-hydrostatic and hydrostatic modes. It integrates the primitive (Navier-Stokes) equations, under the Boussinesq approximation, using a finite-volume method on a staggered 'Arakawa C-grid'. The MITgcm uses modern physical parameterization schemes for subgrid-scale horizontal and vertical mixing and tracer properties. The code configuration includes build-time C pre-processor (CPP) options and run-time switches, which allow for great computational modularity in MITgcm to study a variety of oceanic phenomena (Evangelinos and Hill, 2007).
To implement the MITgcm-ESMF interface, we separated the MITgcm main program into three subroutines that handle initialization, running, and finalization, as shown in Fig. 2. These subroutines are used by the ESMF/NUOPC coupler that controls the oceanic component in the coupled run. The boundary fields on the ocean surface are exchanged online via the MITgcm-ESMF interface during the simulation. The MITgcm SST and ocean surface velocity are the export boundary fields, and the atmospheric surface forcing variables are the import boundary fields (see Fig. 2). These boundary fields are registered in the coupler following the NUOPC conventions, and timestamps are added to them for the coupling. In addition, MITgcm grid information is also provided for online re-gridding of the exchanged boundary fields. To carry out high-resolution simulations, the MITgcm-ESMF interface runs in parallel via MPI communications. The implementation of the present MITgcm-ESMF interface is based on the baseline MITgcm-ESMF coupler (Hill, 2005), but we updated it to couple the modern ESMF/NUOPC version with MITgcm. We also modified the baseline coupler to receive atmosphere surface fluxes and send ocean surface variables (i.e., SST and ocean surface velocity).
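The split of a model component into initialization, run, and finalization phases can be sketched schematically as follows. This is a toy Python illustration of the control flow only: the real interfaces are Fortran subroutines registered with the NUOPC layer, and the class name, field names, and SST update rule here are all invented.

```python
class OceanComponentSketch:
    """Toy illustration of a model split into NUOPC-style phases."""

    def initialize(self, import_state, export_state):
        # Set up the grid, read initial conditions, and register the
        # import/export boundary fields with the coupler.
        self.sst = 25.0                      # toy initial SST [deg C]
        export_state["sst"] = self.sst

    def run(self, import_state, export_state):
        # Advance one coupling interval using the imported forcing,
        # then refresh the exported fields (toy update, not MITgcm's).
        q_net = import_state.get("net_heat_flux", 0.0)
        self.sst += 1.0e-3 * q_net
        export_state["sst"] = self.sst

    def finalize(self, import_state, export_state):
        # Release resources and write restart output.
        pass
```

In the coupled system, the driver calls the initialization phase once, the run phase at every coupling step, and the finalization phase at the end of the simulation.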

WRF Atmospheric Model
The Weather Research and Forecasting (WRF) Model (Skamarock et al., 2005) is developed by NCAR/MMM (Mesoscale and Microscale Meteorology Division). It is a 3-D, finite-difference atmospheric model with a variety of physical parameterizations of sub-grid-scale processes, suitable for a broad spectrum of applications. WRF is used extensively for operational forecasts (http://www.wrf-model.org/plots/wrfrealtime.php) as well as realistic and idealized dynamical studies.
In the present work, the Advanced Research WRF dynamical core (WRF-ARW, version 3.9.1.1) is used. It solves the compressible non-hydrostatic Euler equations and also includes a run-time hydrostatic option. The WRF-ARW uses a terrain-following hydrostatic pressure coordinate system in the vertical direction and utilizes the 'Arakawa C-grid'. WRF incorporates various physical processes including microphysics, cumulus parameterization, planetary boundary layer, surface layer, land surface, and longwave and shortwave radiation, with several options available for each process.
Similar to the implementation in MITgcm, WRF is also separated into initialization, run, and finalization subroutines to enable the WRF-ESMF interface to control the atmosphere model during the coupled simulation, as shown in Fig. 2. The implementation of the present WRF-ESMF interface is based on the prototype interface (Henderson and Michalakes, 2005).
In the present work, the prototype WRF-ESMF interface is updated to a modern version of WRF-ARW and a modern version of ESMF, based on the NUOPC layer. The prototype interface is also expanded to interact with the ESMF/NUOPC coupler to receive the ocean surface variables and send the atmosphere surface fluxes. The surface boundary condition fields are registered in the coupler following the NUOPC conventions, with timestamps. The WRF grid information is also provided for online re-gridding by ESMF. To carry out high-resolution simulations, the WRF-ESMF interface also runs in parallel via MPI communications.

ESMF/NUOPC Coupler
The coupler is implemented using ESMF version 7.0.0. The ESMF is selected because of its high performance and flexibility for building and coupling weather, climate, and related Earth science applications (Collins et al., 2005; Turuncoglu et al., 2013; Chen and Curcic, 2016; Turuncoglu and Sannino, 2017). It has a superstructure for representing the model and coupler components and an infrastructure of commonly used utilities, including conservative grid remapping, time management, error handling, and data communications.
The general code structure of the coupler is shown in Fig. 2. To build the ESMF/NUOPC driver, a main program is implemented to control an ESMF parent component, which controls the child components. In the present work, three child components are implemented: (1) the oceanic component; (2) the atmospheric component; and (3) the ESMF coupler. The coupler is used here because it performs the two-way interpolation and data transfer (Hill et al., 2004). In ESMF, the model components can run in parallel as a group of Persistent Execution Threads (PETs), which are single processing units (i.e., CPU or GPU) defined by ESMF. In the present work, the PETs are created according to the grid decomposition, and each PET is associated with an MPI process running on a separate processor.
The ESMF also allows the PETs to run in sequential mode, concurrent mode, or a mixed mode. We selected the sequential mode in our implementation, as shown in Fig. 2. In sequential mode, a set of ESMF gridded/coupler components runs in sequence on the same set of PETs: at each coupling time step, the oceanic component is executed when the atmosphere component has completed, or vice versa. In concurrent mode, by contrast, the gridded components are created and run on mutually exclusive sets of PETs. Concurrent mode has some advantages, but the simplicity of sequential mode makes it a natural starting point (Collins et al., 2005), and it is chosen for this work.
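The sequential execution described above can be sketched as a toy driver loop. Everything here is illustrative: in SKRIPS the driver, connectors, and re-gridding are generated by the NUOPC layer, while these stub classes only mimic the call order.

```python
class ToyModel:
    """Stand-in for an ESMF gridded component; run() advances one
    coupling interval and returns the component's export fields."""
    def __init__(self, name):
        self.name = name
        self.steps = 0

    def run(self, imports):
        # A real component would integrate its equations here using
        # the imported fields; this stub only counts coupling steps.
        self.steps += 1
        return {self.name + "_step": self.steps}


class ToyCoupler:
    """Stand-in for the ESMF coupler; re-gridding is an identity map
    here, whereas ESMF performs (e.g.) conservative interpolation."""
    def regrid(self, fields, target_grid):
        return dict(fields)


def run_sequential(atm, ocn, coupler, n_coupling_steps):
    """Sequential mode: both components run one after the other on the
    same set of PETs, exchanging fields at every coupling step."""
    ocn_exports = {}
    for _ in range(n_coupling_steps):
        # Atmosphere runs first, using the ocean's previous exports.
        atm_exports = atm.run(imports=coupler.regrid(ocn_exports, "atm"))
        # Ocean then runs the same interval with the fresh fluxes.
        ocn_exports = ocn.run(imports=coupler.regrid(atm_exports, "ocn"))
    return atm_exports, ocn_exports
```

In concurrent mode the two `run` calls would instead execute simultaneously on disjoint PET sets, at the cost of a more complex synchronization pattern.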
In ESMF, the gridded components are used to represent models and coupler components are used to connect these models.
The interfaces and data structures in ESMF have few constraints, providing the flexibility to be adapted to many modeling systems. However, the flexibility of the gridded components can limit the interoperability across different modeling systems.
To address this issue, the NUOPC layer is developed to provide the coupling conventions and the generic representation of the model components (e.g., drivers, models, connectors, mediators). The NUOPC layer in the present coupled model is implemented according to the documentation (Hill et al., 2004; Theurich et al., 2016), and the oceanic and atmospheric components each follow these conventions.


Experiment Design

To test the coupled model, we applied it to study a series of heat wave events in the Red Sea region. We selected these extreme heat wave events because of their societally relevant impacts. The simulation of the Red Sea extends from 0000 UTC 01 June 2012 to 0000 UTC 01 July 2012. We selected this month because of the record-high surface air temperature observed in the Makkah region, located 70 km inland from the eastern shore of the Red Sea (Abdou, 2014).

The computational domain and bathymetry are shown in Fig. 3. The model domain is centered at 20°N and 40°E, and the bathymetry is from the 2-minute Gridded Global Relief Data (ETOPO2) (National Geophysical Data Center, 2006). WRF is implemented using a horizontal grid of 256 × 256 points and a grid spacing of 0.08°, using a cylindrical equidistant (latitude-longitude) map projection. There are 40 terrain-following vertical levels, more closely spaced in the atmospheric boundary layer.
The time step for the atmosphere simulation is 30 seconds. The Morrison 2-moment scheme (Morrison et al., 2009) is used for microphysics.

In the coupling process, the ocean model sends SST and ocean surface velocity to the coupler, and they are used directly as the boundary conditions in the atmosphere model. The atmosphere model sends the surface fields to the coupler, including (1) net surface shortwave/longwave radiation, (2) latent/sensible heat, (3) 10-m wind speed, (4) net precipitation, and (5) evaporation.
The ocean model uses the atmosphere surface fields to compute the surface forcing, including (1) total net surface heat flux, (2) surface wind stress, and (3) freshwater flux. The total net surface heat flux is computed by adding the latent heat flux, sensible heat flux, and net surface shortwave/longwave radiation fluxes. The surface wind stress is computed from the 10-m wind speed (Large and Yeager, 2004). The freshwater flux is the difference between precipitation and evaporation. The latent and sensible heat fluxes are computed using the COARE 3.0 bulk algorithm in WRF (Fairall et al., 2003). In the coupled code, different bulk formulae in WRF or MITgcm can also be used.
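The three forcing computations above can be sketched in a few lines. This is a simplified illustration: the constant drag coefficient stands in for the wind-speed-dependent Large and Yeager (2004) formulation, and the function name, argument names, and sign conventions are all assumptions for this sketch.

```python
RHO_AIR = 1.225   # air density [kg/m^3], taken constant here
CD = 1.2e-3       # bulk drag coefficient; a fixed value standing in
                  # for the Large and Yeager (2004) wind-dependent form

def surface_forcing(q_lat, q_sen, q_sw, q_lw, u10, v10, precip, evap):
    """Combine atmospheric export fields into the three ocean forcing
    fields described in the text.  Fluxes in W/m^2, winds in m/s,
    precip/evap in kg/m^2/s; sign conventions are illustrative."""
    # (1) Total net surface heat flux: turbulent plus radiative terms.
    q_net = q_lat + q_sen + q_sw + q_lw
    # (2) Surface wind stress from the 10-m wind via a bulk formula.
    speed = (u10 ** 2 + v10 ** 2) ** 0.5
    tau_x = RHO_AIR * CD * speed * u10
    tau_y = RHO_AIR * CD * speed * v10
    # (3) Freshwater flux: precipitation minus evaporation.
    fw = precip - evap
    return q_net, (tau_x, tau_y), fw
```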
To study the air-sea interactions, the following sets of simulations using different surface forcings are performed:

1. Run CPL: a two-way coupled MITgcm-WRF simulation. The coupling interval is 20 minutes to capture the diurnal cycle (Seo et al., 2014). This run tests the performance of the two-way coupled ocean-atmosphere model.

2. Run ATM.STA: a stand-alone WRF simulation with its initial SST kept constant throughout the simulation. This run allows assessment of the WRF model behavior with realistic, but persistent, SST. This case serves as a benchmark to highlight the difference between coupled and uncoupled runs.

3. Run ATM.DYN: a stand-alone WRF simulation forced by a prescribed time-varying SST.

4. Run OCN.DYN: a stand-alone MITgcm simulation forced by a prescribed atmospheric state.

The ocean model uses the assimilated HYCOM/NCODA 1/12° global analysis data as initial and boundary conditions for ocean temperature, salinity, and horizontal velocities (http://hycom.org/data-server/glb-analysis). The boundary conditions for the ocean are updated on a daily basis and linearly interpolated between two simulation time steps. A sponge layer is applied at the lateral boundaries, with a thickness of 3 grid cells and inner/outer boundary relaxation timescales of 10/0.5 days. In the CPL, ATM.STA, and ATM.DYN runs, we used the same initial condition and lateral boundary condition for the atmosphere. The atmosphere is initialized using the ECMWF ERA5 reanalysis dataset, which has a grid resolution of approximately 30 km (Hersbach, 2016). The same data also provide the boundary conditions for air temperature, wind speed, and air humidity every 6 hours. The atmosphere boundary conditions are also linearly interpolated between two simulation time steps. The lateral boundary values are specified in WRF in the 'specified' zone, and the 'relaxation' zone is used to nudge the solution from the domain toward the boundary condition value. Here we used the default width of one point for the specified zone and four points for the relaxation zone. The pressure at the top of the atmosphere is 50 hPa.
In the ATM.STA run, the SST from the HYCOM/NCODA data is used as the initial and persistent SST. The time-varying SST in the ATM.DYN run is also generated using the HYCOM/NCODA data. We selected the HYCOM/NCODA data because the ocean model initial and boundary conditions are generated using it. For the OCN.DYN run, we selected the ERA5 dataset for the prescribed atmospheric state because it also provides the atmospheric boundary conditions in the CPL run. The initial condition, boundary condition, and forcing terms of this run are summarized in Table 1.

The simulated SST is validated against the GHRSST analysis (Martin et al., 2012), and the simulated 2-meter air temperature (T2) is validated against the ECMWF ERA5 dataset. To evaluate the modeling of the heat wave event in three major cities near the eastern shore of the Red Sea, the diurnal temperature variation is compared with observed daily maximum and minimum temperatures from the NOAA National Climate Data Center (NCDC climate data online at http://cdo.ncdc.noaa.gov/CDO/georegion).
Surface heat fluxes (e.g., latent heat, sensible heat, longwave and shortwave radiation), which are important for ocean-atmosphere interactions, are compared with the MERRA-2 (Modern-Era Retrospective analysis for Research and Applications, version 2) dataset (Gelaro et al., 2017). The MERRA-2 dataset is selected because it is an independent reanalysis relative to the initial and boundary conditions used in the simulations. The MERRA-2 dataset provides reanalysis fields of turbulent heat fluxes at a 0.625° × 0.5° (lon × lat) resolution. For the comparisons, we interpolated the validation data from their lower-resolution grids to the higher-resolution grid of the regional model. The validation data are summarized in Table 2.
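The interpolation of a coarse validation field onto the finer model grid can be sketched with a minimal bilinear scheme. This is an illustration under stated assumptions (regular, monotonically increasing 1-D coordinate axes, model grid fully inside the coarse grid); it is not the interpolation code actually used in the paper.

```python
import numpy as np

def bilinear_to_model_grid(coarse_lat, coarse_lon, coarse_field,
                           model_lat, model_lon):
    """Bilinearly interpolate a coarse (validation) field onto a finer
    regional model grid defined by 1-D latitude/longitude axes."""
    # Fractional indices of the model points within the coarse axes.
    fi = np.interp(model_lat, coarse_lat, np.arange(coarse_lat.size))
    fj = np.interp(model_lon, coarse_lon, np.arange(coarse_lon.size))
    i0 = np.clip(fi.astype(int), 0, coarse_lat.size - 2)
    j0 = np.clip(fj.astype(int), 0, coarse_lon.size - 2)
    wi = (fi - i0)[:, None]   # interpolation weights along latitude
    wj = (fj - j0)[None, :]   # interpolation weights along longitude
    I0, J0 = np.ix_(i0, j0)   # open mesh of the four corner indices
    return ((1 - wi) * (1 - wj) * coarse_field[I0, J0]
            + (1 - wi) * wj * coarse_field[I0, J0 + 1]
            + wi * (1 - wj) * coarse_field[I0 + 1, J0]
            + wi * wj * coarse_field[I0 + 1, J0 + 1])
```

Bilinear interpolation reproduces any field that varies linearly in latitude and longitude exactly, which makes the sketch easy to verify.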

Results and Discussions
The Red Sea is an elongated basin covering the area between 12-30°N and 32-43°E. The basin is 2250 km long, extending from the Suez and Aqaba gulfs in the north to the strait of Bab el-Mandeb in the south, which connects the Red Sea and the Gulf of Aden. Figure 4 shows that all simulations can capture the T2 diurnal variation in the Red Sea region, and this will be further discussed later in this section.

The simulation results for the heat wave events on June 10th and 24th are shown in Fig. 5 to demonstrate the performance of the coupled model over longer periods of time. It can be seen in Fig. 5(III) and 5(VIII) that the T2 patterns simulated by the coupled run are consistent with the ERA5 dataset. The differences between the ATM.STA and ATM.DYN simulation results with respect to the ERA5 data are shown in Fig. 5(IV), 5(V), 5(IX), and 5(X), respectively. It can be seen that the T2 over the sea in the CPL simulation has a much smaller difference with the validation ERA5 data (10th: -1.02°C; 24th: -0.84°C) compared with the ATM.STA run (10th: -1.56°C; 24th: -2.13°C). Although the difference is small compared with the mean T2 (31.12°C on the 10th; 32.09°C on the 24th), the improvement of the coupled run is comparable to the standard deviation of T2 (2.14°C on the 10th; 2.02°C on the 24th). The CPL run results are closer to the ERA5 dataset because the oceanic component (MITgcm) provides updated SST, which warms the T2; the ATM.STA run uses the constant, cooler SST from June 1st, and its T2 is determined by this constant cooler SST. On the other hand, when comparing the CPL run with the ATM.DYN run on June 24th, the difference is very small (-0.10°C). This is because the SST fields from the CPL and ATM.DYN runs are similar, which indicates that the SST in the CPL run tends toward the realistic SST.
To investigate the diurnal T2 variation in Fig. 4, the time series of T2 in the three major cities as simulated in the CPL and ATM.STA runs are examined. The RMSEs are 2.79°C and 2.83°C for the CPL and ATM.STA runs, respectively. However, the error after June 18th (simulation lead time > 17 days) is larger for both the CPL (3.42°C) and ATM.STA (3.94°C) runs. It can also be seen that the CPL run better captures the daily high temperatures in Yanbu (RMSE: 2.77°C) than the ERA5 dataset (RMSE: 5.59°C), probably because ERA5 uses a lower-resolution grid and is unable to capture the T2 in this coastal city. This is one of the advantages of employing regional simulations at higher resolution. It should be mentioned that both the present simulations and the ERA5 dataset report a T2 that is 4.5°C lower than the observed T2 in Mecca on June 2nd, though the heat wave events in the other cities are still captured. This may be due to errors in the initial conditions, or because the WRF physics schemes (e.g., the land surface model or the PBL scheme) are unable to parameterize this extreme event. It can also be seen in the results that taking ocean-atmosphere coupling into account improves the simulation of T2 in the CPL run (Fig. 6).
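The bias and RMSE statistics used throughout this section are computed in the standard way; a minimal sketch (the function name is ours, and the fields are assumed to be on the same grid):

```python
import numpy as np

def bias_and_rmse(model, reference):
    """Mean bias and root-mean-square error between a simulated field
    and a validation field defined on the same grid, as used for the
    T2 and SST comparisons in the text."""
    diff = np.asarray(model, dtype=float) - np.asarray(reference, dtype=float)
    bias = diff.mean()                    # signed mean difference
    rmse = np.sqrt((diff ** 2).mean())    # magnitude of typical error
    return bias, rmse
```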

Sea Surface Temperature
The simulated SST patterns are compared to the validation data to demonstrate the performance of the coupled model in capturing the ocean surface state. The daily SST fields from the CPL run on June 2nd and 24th are shown in Fig. 9(I) and Fig. 9(VI).
To validate the CPL run results, the SST fields obtained in the OCN.DYN runs are shown in Fig. 9(II) and 9(VII), and the GHRSST fields are shown in Fig. 9(III) and 9(VIII). It can be seen that both the OCN.DYN and CPL runs are able to reproduce the SST patterns.

To quantitatively compare the errors in the SST results, the time histories of the SST in the simulations (i.e., OCN.DYN and CPL) and validation datasets (i.e., GHRSST and HYCOM data) are shown in Fig. 10. The mean bias and RMSE between the simulation results and the validation datasets are also plotted. Again, only the errors between daily SST fields are presented because both observational datasets only provide daily data. It can be seen in Fig. 10 that the bias and RMSE of SST in the CPL run (bias: -0.26°C; RMSE: 0.74°C) are smaller than those of T2 (bias: -0.47°C; RMSE: 1.42°C) shown in Fig. 8. Generally, this indicates the good performance of the coupled model in simulating the ocean SST. Compared with the HYCOM dataset, the biases of the CPL and OCN.DYN runs are small (CPL: -0.12°C; OCN.DYN: -0.04°C) before June 10th. After June 11th, the CPL run slightly overestimated the SST (0.37°C), while the OCN.DYN run slightly underestimated it (-0.05°C). In addition, the RMSEs of both simulations increase in the first 10 days, but the increase is not significant after that. On the other hand, when comparing with the GHRSST, the initial SST patterns in both runs are cooler by 0.8°C. This is because the HYCOM data are cooler than GHRSST at the start of the simulation. After the first 10 days, the difference between the GHRSST data and HYCOM decreases, and likewise the difference between the simulation results and GHRSST also decreases.

Surface Heat Fluxes
The surface heat budget strongly influences the forecast of the surface temperature fields in the simulations. Here we evaluate the performance of the coupled model in capturing the heat fluxes, as compared to the stand-alone simulations. The results are also compared to the MERRA-2 dataset and their differences are plotted.
The turbulent heat fluxes (THF), including the latent heat and sensible heat, and their differences with the validation dataset are shown in Fig. 11. Snapshots of the turbulent fluxes during the heat wave events on June 2nd and 24th are presented. It can be seen that on June 2nd, all simulations exhibit similar THF patterns, since they have the same initial conditions and air-sea interactions do not significantly impact the THF within two days. On the other hand, for the heat wave event on June 24th, the CPL and ATM.DYN runs exhibit more latent heat flux coming out of the ocean (157 and 131 W/m²) than the ATM.STA run (115 W/m²). The mean biases in the ATM.STA, ATM.DYN, and CPL runs are -9.8, 5.9, and 31.8 W/m², respectively. This is because the SST fields in the stand-alone WRF runs are cooler compared with the CPL run. When forced by cooler SST, the evaporation decreases and thus the latent heat flux is smaller. Compared with the latent heat, the sensible heat in the Red Sea region is much smaller in all simulations (about 10 W/m²). It should be noted that the MERRA-2 dataset has unrealistically large sensible heat in the coastal regions because its resolution is not adequate to resolve the coastline in the Red Sea region (Gelaro et al., 2017).
The net downward shortwave and longwave heat fluxes are shown in Fig. 12.

Surface Wind and Evaporation
To evaluate the simulation of the surface momentum and freshwater fluxes by the coupled model, the surface wind and evaporation patterns obtained from the ATM.STA, ATM.DYN, and CPL runs are presented, and the MERRA-2 data are used to validate them (see Fig. 9(IX)). Since there is no precipitation in the three major cities (Mecca, Jeddah, Yanbu) near the eastern shore of the Red Sea during the month according to the NCDC climate data, the precipitation results are not shown.

Scalability Test
Parallel efficiency is crucial for coupled ocean-atmosphere models simulating large and complex problems. In this section, the parallel efficiency of the coupled simulations is investigated. This aims to demonstrate that the implemented ESMF/NUOPC driver and model interfaces are able to run parallel cases effectively. The parallel speed-up of the model is investigated to evaluate its performance for a constant-size problem simulated using different numbers of processors (i.e., strong scaling).
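The strong-scaling metrics used in this section can be computed from measured wall-clock times as follows (a simple sketch; the function name and the example timings in the usage note are ours, not measurements from the paper):

```python
def strong_scaling(wall_times, base_procs=None):
    """Speed-up and parallel efficiency for a strong-scaling test, in
    which the same fixed-size problem is timed on different processor
    counts.  `wall_times` maps processor count -> wall-clock time."""
    procs = sorted(wall_times)
    p0 = base_procs if base_procs is not None else procs[0]
    t0 = wall_times[p0]
    results = {}
    for p in procs:
        speedup = t0 / wall_times[p]          # relative to the base run
        efficiency = speedup / (p / p0)       # 1.0 = ideal linear scaling
        results[p] = (speedup, efficiency)
    return results
```

For example, hypothetical timings of 1600 s on 16 processors and 160 s on 256 processors give a speed-up of 10 and an efficiency of 0.625 on 256 processors, the kind of roll-off discussed below when communication costs start to dominate.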
Additionally, the CPU time spent on different parts of the coupled model is detailed. The parallel efficiency tests are performed on the COMPAS (Center for Observations, Modeling and Prediction at Scripps) cluster at the Scripps Institution of Oceanography. The most time-consuming part of the coupled simulation is the atmospheric model integration, which accounts for 76% to 93% of the total costs. The ocean model integration is the second most time-consuming process, at 7% to 14% of the total computational costs. The atmospheric model is much more time-consuming because it solves the entire computational domain, while the ocean model only solves the Red Sea (16% of the domain). The atmospheric model also uses a smaller time step (30 s) than that of the ocean model (120 s) and has more complex physics parameterization packages. If a purely marine region were selected in an idealized case, the costs of the ocean and atmosphere models would be more equal. The coupling process takes less than 5% of the total costs when using fewer than 128 processors (40960 grid points per processor). However, when using 256 processors (20480 grid points per processor), the proportion of this cost increases to 10%, though the amount of time spent on the ESMF/NUOPC coupler is similar to that with 128 processors. We hypothesize that the cost of the ESMF/NUOPC coupler is dominated by communication, which becomes relatively more important as the computational work per processor is reduced in these strong scaling tests. In summary, the coupled model and the ESMF/NUOPC coupler scale well in these tests.


Summary and Conclusions

This study describes the development of the Scripps-KAUST Regional Integrated Prediction System (SKRIPS). To build the coupled model, the ESMF coupler is implemented according to the NUOPC conventions. The ocean model MITgcm and the atmosphere model WRF are split into initialize, run, and finalize sections, with each of them called as subroutines from the main program.

The development activities have focused on providing a useful coupled model for realistic applications, such as simulating the heat wave events in the Red Sea region. Results from the coupled and stand-alone simulations are compared to a wide variety of available observational and reanalysis datasets, aiming to demonstrate the overall performance of the coupled model with respect to the stand-alone models. The results obtained from the various configurations of coupled and stand-alone model simulations all realistically capture the basic characteristics of the ocean-atmosphere state in the Red Sea region over the 30-day simulation period. The surface air temperature variations in the three major cities are consistent with the ground observations, and the heat wave events are also well captured in the CPL run. The surface fields (e.g., surface air temperature, surface heat fluxes, surface evaporation, surface wind) in the CPL run are consistent with the reanalysis data over the simulation period. The SST fields in the CPL run are also consistent with the satellite observation data. Improvements of the coupled model over the stand-alone simulation with static SST forcing are observed in capturing the T2, heat fluxes, evaporation, and wind speed.

The parallel efficiency of the coupled model is examined by simulating the Red Sea region using an increasing number of processors. The coupled model scales linearly for up to 128 CPUs, and the parallel efficiency remains about 70% for 256 processors. The CPU time associated with different parts of the coupled simulations is also presented, suggesting good parallel efficiency in both model components and the ESMF coupler. Hence, the coupled model can be applied to high-resolution coupled regional modeling studies on massively parallel supercomputers.
These preliminary results motivate further studies in evaluating and improving this new regional coupled ocean-atmosphere model for investigating dynamical processes and forecasting applications in regions around the globe where ocean-atmosphere coupling is important. This regional coupled model can be further improved by developing coupled data assimilation capabilities for initializing coupled forecasts from an assimilated analysis state. In addition, the model physics and model uncertainty representation in the coupled system can be enhanced using advanced techniques, such as stochastic physics parameterizations. Future work will involve exploring these and other aspects of developing a regional coupled modeling system that is best suited for forecasting and process understanding purposes.
Code and data availability. The coupled code, documentation, and tutorial cases used in this work are available at https://github.com/iurnus/scripps_kaust_model. The ECMWF ERA5 dataset is used as the atmospheric initial and boundary conditions. The ocean model uses the assimilated HYCOM/NCODA 1/12° global analysis data as initial and boundary conditions. To validate the simulated SST data, we use the GHRSST dataset.