the Creative Commons Attribution 4.0 License.
Interactions between atmospheric composition and climate change – Progress in understanding and future opportunities from AerChemMIP, PDRMIP, and RFMIP
Fiona M. O'Connor
Christopher J. Smith
Robert J. Allen
Daniel M. Westervelt
Laura J. Wilcox
William J. Collins
Piers M. Forster
Abstract. The climate science community aims to improve our understanding of climate change due to anthropogenic influences on atmospheric composition and the Earth's surface. Yet not all climate interactions are fully understood and diversity in climate model experiments persists as assessed in the latest Intergovernmental Panel on Climate Change (IPCC) assessment report. This article synthesizes current challenges and emphasizes opportunities for advancing our understanding of climate change and model diversity. The perspective of this article is based on expert views from three multi-model intercomparison projects (MIPs) – the Precipitation Driver Response MIP (PDRMIP), the Aerosol and Chemistry MIP (AerChemMIP), and the Radiative Forcing MIP (RFMIP). While there are many shared interests and specialisms across the MIPs, they have their own scientific foci and specific approaches. The partial overlap between the MIPs proved useful for advancing the understanding of the perturbation-response paradigm through multi-model ensembles of Earth System Models of varying complexity. It specifically facilitated contributions to the research field through sharing knowledge on best practices for the design of model diagnostics and experimental strategies across MIP boundaries, e.g., for estimating effective radiative forcing. We discuss the challenges of gaining insights from highly complex models that have specific biases and provide guidance from our lessons learned. Promising ideas to overcome some long-standing challenges in the near future are kilometer-scale experiments to better simulate circulation-dependent processes where it is possible, and machine learning approaches for faster and better sub-grid scale parameterizations where they are needed. Both would improve our ability to adopt a smart experimental design with an optimal tradeoff between resolution, complexity and simulation length. 
Future experiments can be evaluated and improved with sophisticated methods that leverage multiple observational datasets, and thereby, help to advance the understanding of climate change and its impacts.
Stephanie Fiedler et al.
Status: final response (author comments only)
- RC1: 'Comment on gmd-2023-29', Anonymous Referee #1, 02 May 2023
- RC2: 'Comment on gmd-2023-29', Anonymous Referee #2, 29 May 2023
To the editor,
I think this work serves several purposes. It reports on the different MIPs, on their interaction, on their challenges and future opportunities, and is a vision paper for the CACTI activity, which might be a future link between the different MIPs. Learning from the past and stressing the large/important remaining knowledge gaps is possibly important to guide future research directions.
I think the work of these MIPs in the past 5 to 10 years has been very valuable, and their importance can hardly be overstated. These MIPs contributed greatly to the conceptual understanding of the chain from initial perturbations to climate responses.
In this manuscript, the authors have put the different MIPs in a similar framework, allowing them to characterize where the MIPs differ and what they have in common. It is nice that this manuscript tries to connect the various MIPs - such an approach can never be expected to be fully comprehensive or satisfying, but I very much appreciate the work of the authors.
It is also important to (critically) look backward to earlier existing MIPs, and synthesize ideas on ways to go forward. One can expect that MIPs will be popular for some time, but interest in them might wane over time. One can also expect that a new generation of scientists will come up with new ideas and areas to focus on. Within this environment, it is nice that these three MIPs have worked (closely) together (e.g., TriMIP), try to reflect on and synthesize their achievements (this manuscript), and make efforts to find synergies for the future (CACTI).
The manuscript comes up with suggestions for future research directions, and I am fully aware that it is not always easy to be precise in general prospective studies. Coming up with a list of possible future directions is brave, and one cannot expect such lists to be complete.
I think this work is valuable and is appropriate for the journal. E.g., the original description papers for RFMIP and AerChemMIP were both published in GMD. Having this prospective paper in the suggested journal is therefore appropriate.
I think however that the manuscript needs modifications before being fit for publication. Below you can first find a list of my main concerns. It is followed by a more detailed list, referring to specific lines in the manuscript. Both the main concerns and the detailed comments should be addressed by the authors.
In the backward looking view of the manuscript, one reports mainly on the collaboration, interaction and links between the different MIPs. However, based on the title of the manuscript, one would expect that it would also report on the general "scientific progress" (or lack of it) made within the field due to these MIPs. In that sense, I think that the title and content of Section 2 (Advancement through MIP's cross-linkages) is rather limited, and should maybe be widened to also go more deeply into the scientific progress made. I think Figures 4 and 5 cover part of the scientific progress, but I think more achievements can/should be mentioned to motivate the existence of these MIPs. This should not be very long, but possibly a table with the main scientific lessons learned from the MIP experiments might be an idea.
The study is generally well written and structured, although specific sections should be improved for better understanding. I also think that an effort should be made to make the style more homogeneous over the whole manuscript. Also, one sometimes gets the impression that some paragraphs (or sentences) do not really belong in a section : they should be brought more in harmony with the text around them. This is indicated in the detailed list below.
Some synergies between the MIPs are mentioned, but one should maybe make an effort to elaborate more on this. Some examples have been given (use the same ERF definition, estimate length of simulations, diagnostics, complementary simulations, ...), but maybe the authors could try to analyze and refine it more.
I am wondering whether the topics which are treated under Section 4 (Methodological opportunities) are representative for the future research environment and directions. I also do not know why it should be limited to "methodological" opportunities only - that is certainly not what the title of the manuscript suggests. I have the impression that the mentioned opportunities are all relevant, but some of them are maybe rather specific, and their total does not seem to cover the broad opportunity space. E.g., although kilometer-scale experiments might be important, they seem rather far in the future for most of the ESMs which have resolutions in the order of 50-200 km. As MIPs are designed for (many) models to participate, one should, e.g., take into consideration that part of the models are maybe not at the forefront of the scientific progress. Although the 5 topics mentioned in Section 4 are relevant, I could imagine a longer, more comprehensive and equilibrated list of relevant research questions and directions. My general impression is that these 5 opportunities do not cover well enough the directions in which these MIPs probably will go. Possibly an extended list of these opportunities could be put in a table.
In Section 3.2 open research questions are mentioned. However, that section is rather short, and I have the impression that it is possibly under-valued. To maybe stress the ideas in this section, it might be an idea to elaborate them a bit more, and possibly list them in a table in the manuscript. Some of these topics are : non-DMS marine volatile organic compounds, natural primary biological aerosol particles, fire, dust, natural aerosol, aerosol optical properties, and aerosol optical depth.
I miss in the manuscript some perspective on what we have learned from approaches which did not work (if so) in these three MIPs. Although this should not be the focus of the manuscript, these questions might come up for a reader. In addition, such reflections might contain important lessons for future activities, and improve the effectiveness of future research and MIPs. I list here some thoughts which might be covered.
- Were there MIP experiments with a too small signal-to-noise ratio to be useful? Did the suggestions from Forster et al. appear to be correct? Is it valid for all variables, on all scales : regional and seasonal, or global and annual? Were the simulation lengths and number of simulations appropriate in the MIPs? Were the ensemble sizes (number of identical setups by individual model) large enough?
- RFMIP and AerChemMIP had quite some distinct and complementary experiments. Possibly, there were also some experiments which better would have been combined into one experiment. Is that something that should be better organized in the future?
- Was the degree of participation in these MIPs sufficient? I assume RFMIP and AerChemMIP (and maybe also PDRMIP) were aimed to explain the results/behaviour of the CMIP models (explain better the final model response and its diversity). However, not all CMIP models contributed to (all) RFMIP and AerChemMIP experiments. Should having more CMIP models participating in AerChemMIP/RFMIP/PDRMIP be a priority?
- Which protocols and experiments were not popular or successful : easy-aerosol? Double calls for IRF calculations? Long perturbed historical simulations? Why?
- Why did certain protocols appear easier or harder to follow? E.g., PDRMIP was run partially in emission-driven and partially in concentration-driven mode, ...
- Did the use of Tiers help?
- Maybe some vision on whether MIPs and their experimental demand should maybe remain limited. Should these MIPs merge into one MIP? Are there benefits in keeping (small) separate MIPs?
- Is the multi-model approach put into question? Should models be selected (even more) on their key performance before they can go into an assessment? Does the model spread in the results represent our current uncertainty in understanding, or is it partially caused by lacking model selection?
- In consecutive CMIP rounds, one sees the number of ESM models increase. Is this an efficient way to progress science, or would reducing the number of different models, and trying to build one exceptionally high resolution and very competent model, be a better way? Would such a uniformity block/hamper scientific creativity? (CERN of climate science)
- How is the activity rate over the lifespan of a MIP: when do most model results come in? How long does analysis go on?
- Is the use of more observations to constrain the models the way forward? This is slightly mentioned when discussing "sophisticated methods". However, this point is maybe worth more focus and ideas.
- In the manuscript nothing is said about emerging constraints. Is that a way forward?
Also section 4.2 might benefit from tables containing all the suggested new diagnostics (e.g., IRF, ...) and experiments (e.g., fixed land surface temperature experiments, additional experiments for the impact community, ...).
Would this be a good paper to introduce the 3 MIPs to a reader? Possibly not; there is very little description of the experiments suggested in the 3 different MIPs.
Maybe explain and characterize the MIPs better, possibly in a table/matrix.
Below, you can find specific comments on the text of the manuscript. I have tried to indicate as well as possible the line numbers.
- line 3 : "in climate model experiments" : this gives the impression that it refers to the "setup" of experiments, whereas I assume it should refer to the difference in results between models.
- line 4 : "this article" : I would not use the word article in the text (or abstract)
- line 9 : "of varying complexity" : is it mentioned because it played an important role in advancing science? I think the varying complexity was not an item/issue in itself. However, the MIPs were designed in such a way that even contributions from less complex models could contribute to certain parts of the analysis. E.g., models containing interactive aerosol but no interactive ozone could still contribute to the analysis of aerosol forcing, but not to the analysis of stratospheric ozone forcing. In addition, that aspect is not so much related to "the partial overlap between the MIPs". But it is true, estimating the feedback from natural emissions (which was an AerChemMIP activity), requires models to contain those interactive emissions.
- line 9-11 : "It specifically ... for estimating effective radiative forcing." This seems rather limited if this is the main synergy coming from the 3 MIPs.
- line 12 : "... that have specific biases ..." : I don't have the impression that this gets much attention in the text later. Therefore I don't know whether it should be mentioned in the abstract.
- line 13 : are global kilometer-scale experiments in view in the next 5 to 10 years in the context of the relation between composition and climate change? In the last decade, global climate models only doubled their resolution (e.g., from 2x2 degrees horizontal resolution to 1x1 degrees). Do the authors expect to arrive at a kilometer-scale resolution in 5 to 10 years on a MIP-wide scale?
- line 16 : "can" be evaluated -> "should" be evaluated
- line 16-18 : although I think this is important, "sophisticated methods" is a bit vague. It is also used in a few other places in the manuscript.
- line 25, 26 : why "concentrations" for GHGs but "burdens" for aerosols?
- line 29 : "... direct impact ... instantaneous radiative forcing ..." : to a novice in the field, this should possibly be slightly better explained. Now it seems more like defining one expression by another expression.
- line 33, 34 : "... can take several hundreds years depending on the magnitude" : as long as one stays within a linear regime, perturbations (whether small or large) disappear with the same timescale. If there is interannual variability, however, the smaller perturbation will sooner disappear behind the detection limit. Maybe one can be more precise.
- line 38 : "due to changes in emissions of reactive trace gases" : I assume one refers, e.g., to DMS and biogenic VOC emissions which are precursors of (radiatively active) O3. It seems to exclude "emissions of species which have a direct impact on radiation", e.g., dust - however, I think they fall into the same category.
- line 40-41 : The second part of the sentence contains twice "response" and twice "changes". Cannot this be said in an easier way?
- line 41 : "steps" : is "steps" the appropriate word in the context of this paradigm?
- line 43-44 : "Understanding and quantification ... derived" : "derived" does not go well together with "quantification".
- line 43-44 : "typically derived" : some parts in the paradigm can possibly be derived by other models than ESMs. I think line 51 is on the contrary correct : for climate response and feedbacks one needs the ESMs.
- line 44 : Heavens et al. (2013) : I have the impression that the text of Heavens et al. (2013) is more about running and verifying the ESMs, and not so much about disentangling the perturbation-response paradigm.
- line 46-47 : "... simulate [aerosol and their precursor] emissions, [?] transport, and deposition [of aerosols]" : when one reads this sentence, one gets the feeling that something is missing on the location of the [?]. One could maybe change it into : "... simulate aerosol and their precursor emissions, and transport and deposition of aerosols".
- line 44, 50 : What is meant by "design"? I would think that "process complexity" (line 49) is part of it.
- line 47 : "collaborates regularly" : "collaborates" gives the impression of a continuous process, whereas "regularly" gives the impression of a process with several breaks.
- line 47 : "multi-model ensembles of a common set of experiments" : this is clear and refers to multi-model. However, on line 50 in "ensembles of ESM experiments", "ensembles" probably refers to a group of different experimental setups. Maybe this should be formulated more clearly to avoid confusion.
- line 49 : "This diversity in response may be due to differences in process complexity within the respective ESMs, and/or may be due to the design and coupling of different model components" : aren't there more reasons for model diversity? E.g., different parameterisations (without difference in complexity), different parameter values, different but equally complex dynamical cores, ...
- line 57 : "The principle idea of MIPs" : maybe change into "The principle idea of a MIP"
- line 57 : "The principle idea" : what is the principle idea? Do MIPs exist in other sciences?
- line 64 : "both the MIPs" -> "both MIPs"
- line 67 : "of the three MIPs" -> "of three MIPs"
- line 67 : the summing up after ":" is later followed by a continuation of the sentence with ",". This is a bit strange.
- line 67 : what do the authors mean by "diagnostic tools"? I think one could be more specific.
- line 66-68 : three aims are mentioned in this sentence as if they constitute a complete set. However, a few sentences later some extra aims appear.
- line 73 : "... in this area" : this is a bit vague. Does it refer to "understanding multi-model climate responses"?
2. ADVANCEMENT THROUGH MIP'S CROSS-LINKAGES
- line 75-76 : "considering structural differences between ESMs" : how should "structural" be interpreted? Isn't it more often on the level of parameterizations that differences arise? Having a process in or not, should that be seen as a "structural" difference? For me, "structural" refers to the broad technical design choices of an ESM (order of processes when numerically solving, how do components technically interact, is a coupler like OASIS used (or something else), is the ESM one model or an assembly of models, ...), whereas I do not think that those differences contribute most in the end.
- line 77 : "components" in the paradigm. I would rather say that ESMs have components. Possibly use a different word for what constitutes the paradigm.
- line 77-78 : "RFMIP focuses on an improved understanding of the (role of) radiative forcing diversity (for the climate response)" : leaving out a few words, I thought it better described the aims of RFMIP.
- line 79 : "... on precipitation response to idealized atmospheric composition ..." : I would maybe skip "idealized". I think the focus was on "precipitation response to atmospheric composition change", and the tool was indeed "using idealized atmospheric composition changes".
- line 80 : "earlier" : this gives the impression that there is a time dimension in the paradigm approach.
- line 80-83 : "... making these models more complex ..." : this gives the impression that AerChemMIP uses a specific class of models, which is confusing. E.g., I think that in RFMIP a large portion of the models and in PDRMIP at least half of the models also start from emissions.
- line 83 : "inspired by each other" : PDRMIP was set up before AerChemMIP and RFMIP, so it was maybe unidirectional.
- line 84 : "with a certain class of model in mind" : were they aiming for very different types of models? It is true that the natural feedback quantification of AerChemMIP needs interactive natural emissions, but in general the models were possibly reasonably similar?
- line 84-85 : "ensembles of ESM experiments of different complexity ...." : this is a bit confusing. Did the protocols differ in their demand for complexity, resolution, ..., or did the results just finally appear to be like that?
- line 86 : "A major advancement from the synergy between the three MIPs was the widespread adoption ..." : Is meant adoption outside the three MIPs mentioned here, or only within the framework of these 3 MIPs?
- line 88 : "consistent calculation" -> "consistent diagnosis"
- line 97-98 : "experiments where the atmospheric composition represents the values in 1850." : I think this formulation is not very nice.
- line 99 : "diagnostic calls" -> "additional diagnostic calls".
- line 100 : "in CMIP6 models" : this sounds a bit sloppy, so maybe it is better to write : in the ESMs used in CMIP6.
- line 107 : concerning the practice of estimating ERF : add "already mentioned"
- line 113-114 : "more relevant" : If one uses "more", I would think one needs to mention what it is compared with.
3. CHALLENGES IN THE MIPs RESEARCH
- line 118 : "components along the perturbation-response paradigm" : what is meant by "components"? On line 41 and 43, the word "steps" was used in the perturbation-response paradigm ... ("component" is also used on line 122). What is further meant by "model differences"?
- line 119-121 : it is not clear what difference one wants to stress here : (1) difference in forcing for the same composition change, (2) different climate response for the same forcing, or (3) different feedbacks? Possibly write this more clearly.
- line 123 : What is meant by "joint strength" : common approach? (which then facilitates comparison)
- line 127 : Is here carefully the word "provided" used (leaving it open whether the data were actually used), as 50% of the models still used their own aerosol emission in PDRMIP?
- line 131 : This is a bit confusing as the other MIPs also did atmosphere-only experiments. Possibly "atmosphere-only" is not the correct terminology, as probably the land model is also active in these simulations.
- line 131 and 133 : "atmosphere-only" versus "prescribed sea-surface conditions" : I would use one terminology.
- line 135 : "in a complementary manner" : I am certainly aware that there has been a lot of synergy between the MIPs. However, this gives the impression that together, the MIPs covered almost (all) the topical research questions.
3.1 COMPUTATIONAL CAPACITY ABYSS
- In general I think that Section 3.1 is not so well written, and should be improved.
- The three-axis approach is nice and illustrative, but I have some concerns: (i) I would think that the triangle area is not a correct representation of the computational needs. I would rather think that, given a specific configuration, the product of the distances along the 3 axes is representative for the computational need (geometrically that would correspond to the volume of a block). Possibly, the volume behind the triangle (tetrahedron formed by the triangle and part of the 3 axes) could also be seen as proportional with the computational needs (and could be a better quantification of the computational needs than the area of the triangle). (ii) If one wants the volume to be representative for the computational needs, I would choose the axes linear. E.g., one places a 1x1 resolution simulation 4 times further from the origin than a 2x2 simulation. (iii) Setups with the same computational cost will not lie on flat surfaces but rather on hyperbolic-type surfaces (I would think).
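To make this geometric point explicit (a sketch, under the assumption that total computational cost scales as the product of the three axis values):

```latex
% Let r (resolution), c (complexity) and s (simulation length x ensemble size)
% be linear measures along three mutually perpendicular axes, with
% cost proportional to r*c*s. The right-angled tetrahedron spanned by the
% triangle with vertices (r,0,0), (0,c,0), (0,0,s) and the axes has volume
V = \tfrac{1}{6}\, r\, c\, s \;\propto\; \text{cost},
% so the tetrahedron volume, unlike the triangle area, is proportional
% to the cost. An iso-cost surface is then
r\, c\, s = \text{const},
% which is a hyperbolic surface, not a plane.
```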
- line 156-157 : "For some research questions, the complexity of ESMs can be reduced to a large degree" : please give an example.
- line 137 : "Available computational capacity ..." -> "Limited (available) computational capacity ..."
- line 137 : "modeling center" : maybe specify what this is for people not in the field.
- line 138 : "on short timescales" -> "in a short period of time"
- line 138 : "choice" and "defined" is a strange combination. It is also used on line 139.
- line 138 : where is freedom left in the experimental setup? Isn't it that rather some models do not have interactions? E.g., in some models in an abrupt-4xCO2 experiment, vegetated area can reduce/increase which can have an impact on dust emissions (in addition to the impact of dryer/wetter/windier conditions). In other models, the vegetated area is not allowed to change.
- line 139 : "exact experimental design" : what is meant by exact?
- line 139 : "Taken together, there were inevitable tradeoffs in the exact experimental design." -> "... there are ..."
- line 142 : "in an ensemble of experiments" : unclear whether "experiments" refers to identical setup or not.
- line 143 : "area of triangle" -> "volume of tetrahedron"
- line 145 : does not scale linearly : I would still put the real cost on a linear scale; e.g., 2x2 degree horizontal resolution corresponds with 1, 1x1 degree horizontal resolution corresponds with 4, such that the volume calculation is still correct.
- line 146 : "doubling the simulation length or number" -> doubling the simulation length or number of simulations
- line 147 : "but this is not true for the model resolution" : I would think that it can be made true (see above)
- line 148 : "for instances" -> "for instance", or "e.g."
- line 149 : "has become available" -> "continues to grow"
- line 150 : I agree with "interactive chemistry" but I find "competition for priority of experiments" strange. I would say that "chemistry" competes with "resolution" or with the "number of simulations", but not with the "priority of experiments".
- line 151, line 154 : I would not say "the most complex ESMs" -> "complex ESMs"
- line 157-159 : I assume it is true, but it is so general that it would be nice to have an example.
- line 159-161 : such that "model-internal variability" can be separated from the "mean radiative forcing", "climate response" and "impacts on air-quality" : although I think I understand what is meant, I think it lacks some better description. I think one wants to split, e.g., the TOA imbalance in a "mean radiative forcing" and a contribution from "internal variability", and the same for the "climate response" and "impacts on air-quality".
- line 158 versus line 162 : what is written here seems to contradict itself : "high complexity can be reduced" (line 158) <-> "high process complexity, ... and needed" (line 162)
- line 162 : "but also poses" -> "also poses"
- line 162 : what is meant by "needed"?
- line 162-163 : in the first sentence, one mentions apparently one "challenge". In the next sentence, two challenges appear.
- line 164 : "the number of interacting processes" : I don't know whether one can express this in a number for an ESM.
- line 165 : "specification of the model resolution" : does this refer to the model center's choice of resolution, or the MIP-imposed resolution?
- line 167 : "prescribeid aerosols such as the spatial distribution" : this should be better formulated
- line 168 and 169 : this sentence refers to both types of differences (model capabilities and experimental setup). However, my impression is that the former sentences only refer to model capabilities (not about experimental setup).
- line 170 : what is "model diversity" in the design of a MIP?
- line 170-171 : some models can simulate processes that others cannot : isn't that the same as "diversity in the level of complexity"?
3.2 PROCESS UNDERSTANDING ABYSS
- line 176 : I would not use "most complex", but just "complex".
- line 177 : I don't think that advancing climate science is limited because not "all" processes are represented. One can never know or represent them all, but one can make progress by adding the ones which we think are relevant. What the authors possibly want to say : we cannot reproduce observed climate change and simulate future climate change because we miss or did not represent some processes. (This corresponds probably more with what is said in the second sentence of this section, line 178.)
- line 179 : "are not represented represented differently" -> "are not represented or represented differently"
- line 183 : "primary organic aerosols" : does this refer to (interactive) marine primary organic aerosols?
- line 183 : "can be represented" -> "are represented"
- line 186 : "... with potential health impacts." : this makes one think that no other impacts will be mentioned, but in the next sentence also the impact on clouds is mentioned.
- line 199 : ".. model consensus and smaller in magnitude might suggest that they are irrelevant ..." : maybe formulate in a different way.
- line 200 : "dust trends" : possibly add "over the historical period"
- line 201 : "so much that they are of opposite sign" : a change in sign from a small negative to a small positive value is not automatically a dramatic change.
- line 201 : "so much so that" : maybe formulate differently
- line 201 : I would think that a small dust feedback does not imply automatically that the dust emission changes are small.
- line 203-223 : This section seems to focus on natural processes. This is, however, not certain, as it is mixed with information which is maybe not only related to natural aerosol. E.g., mentioning that trends in aerosol and ozone do not fit the observations can also be related to errors in anthropogenic emission estimates; the fact that the optical properties or size distribution of aerosol are biased can possibly also apply to anthropogenic aerosol. In general, I find this paragraph a bit difficult to follow, and it should be improved.
- line 206 : secondary organic aerosols : mainly natural?
- line 214 : this sentence discusses dust again, whereas some aspects of dust had already been mentioned in line 198-202. Maybe this could be combined.
- line 219 : Although it is true, I don't know whether the tuning is relevant in this context and should be mentioned here.
4. METHODOLOGICAL OPPORTUNITIES
- line 230 : "and and finally" -> "and finally"
- line 231 : "to understand the causes of model diversity" : I don't think one needs observations to understand the differences. However, observations might constrain the models.
- line 233 : "the further development of the method for radiative forcing calculations" : maybe reformulate
- line 232-234 : I suggest to improve the sentence
4.1 AUGMENTED ESMs
4.1.1 EMULATORS WHERE INFORMATIVE
- line 238 : "are informed" : this is a strange expression, and not the same as on line 243; is the first one similar to "trained" (as "training" on line 244)?
- line 237-242 : "reduces computational demand", "fast calculation", "massively reduced computational cost" : the same concept is repeated several times. Maybe avoid repetition.
- line 245-248 : (informed by CMIP, idealized experiments, or PPEs) is broader than line 238-239 (informed by CMIP) : maybe it should be consistent.
- line 253-254 : "and explore climate responses to different forcing agents" : seems rather similar to what is mentioned on line 243
- line 254 : what is meant by "different" forcing agents? Is it the same as on line 243 ("different forcings")?
4.1.2 KILOMETER-SCALE EXPERIMENTS WHERE POSSIBLE
- line 258 : "which can be enabled" : this gives the impression that this process/evolution is not difficult. Isn't that an underestimation?
- line 264 : "Simpkins (2018)": why not referring to the real paper, i.e., McCoy et al. (2018)?
- line 267 : "that involve" -> "that involves"
- line 272 : "periods of a weeks to years" : "a week" or "weeks"
- line 272 : "they are promising to better simulate clouds, precipitation and circulation" : already mentioned on line 259-261.
4.1.3 MACHINE LEARNING WHERE NEEDED
4.2 IMPROVED DIAGNOSTICS AND ANALYSIS
4.2.1 RADIATIVE FORCING CALCULATIONS
- line 299 : "e.g. Sherwood et al., 2015" -> "e.g.,"
- line 299-307 : Some of this (IRF) has been mentioned earlier in the text (Section 1, line 20-42). Maybe some connection should be made to those earlier mentions.
- line 315 : "double calls" : possibly explain this
- line 322-324 : these two methods have been mentioned earlier in the text (Section 2, line 90-95). At least mention that in the text.
- line 326 : "atmosphere-only" : in such simulations the land model is also active, together with possibly parts of the sea-ice model. So maybe one should use another term to describe this setup.
- line 334-336 : is it realistic to expect the fixed land-surface temperature method to be implemented in many ESMs soon (and thus on a MIP-wide scale)?
- line 337-344 : this last paragraph is a bit different from the rest of the text in this section. It is not a radiative forcing calculation, but appears in a section with the title "Radiative forcing calculation". It probably has its place in this section, but it should be better integrated/introduced.
- line 337 : "yet another area" : this is not a nice expression
- line 338, 342 : "sophisticated ways (of analysis)" : this should be more specific.
- line 339-340 : "diversity ... limit" -> limits
4.2.2 SYNERGIES WITH IMPACT ASSESSMENTS
- line 346-363 : although I certainly see the value of this, implementation of these extra diagnostics should not only happen in the RFMIP/PDRMIP/AerChemMIP simulations, but also in other CMIP simulations. There is perhaps a task here for this community: to promote the importance of these diagnostics to the wider climate change modelling community.
- line 353 : "combination of species" -> "combinations of species"
- line 363-364 : "considerations" : What is meant by "considerations"? What is meant by this sentence?
- line 366 : "the usage of MIPs" : What is meant by this : the use of data from several MIPs? Or use the concept to start extra MIPs?
- line 373 : some ideas appear which have not been mentioned earlier, e.g., that the paradigm does not work for precipitation.
- line 371 : the understanding "with" Earth System Models : maybe the understanding of climate change with Earth System models
- line 374-375 : "In part, this is related to the grand challenge of representing clouds and circulation, which can be addressed with newly evolving capabilities." : it is not clear what "this" refers to.
- line 394-396 : this last sentence (about GIANT) appears to be quite different from the rest of the paragraph (which was mainly about experimental design). I would suggest trying to integrate this better.
- line 400-405 : I don't know whether an experiment involving 2 models should be mentioned in the conclusions. The conclusions should have a broad general view.
SHORT SENTENCES :
I found a few short sentences, which break the reading flow. I would try to modify them and integrate them better into the text. They are :
- line 80 : "AerChemMIP also focuses on quantifying radiative forcing and responses."
- line 163 : "AerChemMIP emphasized two such challenges."
- line 199 : "Dust is one such example."
- line 281-282 : "Proofs of concept from single ESMs exist."
I have the impression that AerChemMIP emphasizes its achievements a bit more than the other two MIPs do. I would therefore suggest reformulating a few sentences. They are :
- line 163 : "AerChemMIP emphasized two such challenges ..."
- line 181-183 : "AerChemMIP showed that including previously missing interactive sources of chemical species in an ESM ..."
- line 189 : "Of the three MIPs, AerChemMIP played a unique role ..."
- line 205 : "AerChemMIP further points to model difference ..."
- line 482 : "JOURNAL OF CLIMATE" -> "Journal of Climate"
- line 633 : "GEOSCIENTIFIC MODEL DEVELOPMENT" -> "Geoscientific Model Development"
- Figure 1 : does "Earth System Model" in the red box refer to the AOGCM (component of an ESM)? Maybe this figure can be improved.
- Figure 2 : "The main goals of AerChemMIP are ..." : "The main goal of AerChemMIP is ..."
- Figure 2 : "... where the emissions or concentrations of the species of interest is perturbed" -> "are" perturbed
- Figure 3 : some arrows have points in colours, some in black
- Figure 3 : in the upper right figure, the temperature in the troposphere should not change (I assume). However, the lines do not completely overlap in the troposphere, which is confusing.
Stephanie Fiedler et al.
Review of Interactions between atmospheric composition and climate change - Progress in understanding and future opportunities from AerChemMIP, PDRMIP, and RFMIP, by Fiedler et al.
As much as I applaud the motivation behind this paper, to show the connections between the three MIPs and to think about ways forward, I can't recommend publication of this article as it lacks meaningful conclusions. I don't want to discourage the authors from writing this paper; however, I would like to encourage them to further develop their ideas and to present more mature ideas about ways forward.
Overall, the paper summarizes what the three MIPs are about (which has already been done in other papers), followed by presenting some new aspects to be included in future MIPs. However, all those ideas stay at the surface: they are merely named, with short paragraphs on what those buzzwords are, without really discussing how they can be connected to future MIPs. As such, the paper does not give any new information and unfortunately is not useful in its current form.
Emulators: they are only mentioned in passing. It should be pointed out that some very successful work involving emulators, including perturbed-physics experiments, has already been carried out under CMIP6. What is the future vision here? How should this be integrated more fully into CMIP? What are the challenges, possibilities, etc.?
Km-scale modeling: this is a very big topic. Again, it is merely mentioned here without any perspective being given. How would km-scale modeling be integrated into CMIP? What are the challenges, e.g., not resolving chemistry even though this paper discusses composition-related MIPs? A much more critical assessment is needed. Can these experiments even be done with coupled oceans? How would that be integrated? How does that relate to the TriMIP climate change experiments? What is the real benefit, given that high-resolution weather models have existed for a long time? How would pushing to even higher resolution solve any climate issues? There is a lot to be discussed, like connecting climate change to impacts, connecting weather modeling to CMIP modeling, etc. However, this paper discusses no real issues; it just mentions general topics.
Machine learning: again, a topic is just mentioned without going deeper. There is so much to discuss: what about ML is useful for CMIP, and what parts are not useful? Replacing model physics/chemistry with ML has very problematic sides, creating black boxes that cannot be understood. At the same time, using ML might be unavoidable in the future if higher resolution is necessary for other processes. How does this topic fit into the CMIP framework? What are possible ways forward? Again, no answers or ideas are discussed here; only a buzzword is given.
Figures: Figure 1 contains no information that is not already conveyed in the text. Figure 2: this could be a question of preference, but again I don't know why a figure is needed, as the tradeoff between resolution, complexity, and simulation length (maybe also add ensemble size here) is obvious.
A discussion could be useful on whether the CMIP6 TriMIPs asked for too many experiments. Was it useful to have so many tiers and experiments? Would a simpler set of runs be more useful, or was this the correct amount to ask for? A critical assessment would fit into this paper and would allow it to go beyond purely summarizing the previous experiments.
On page 5, L 125, the challenges of MIP research do point out aerosol diversity, but only one paper using an extremely oversimplified method is cited, ignoring all the work that has been done over decades by the AeroCom community, where a deep understanding of model processes and diversity has been collected.
The abstract reads like an introduction, but maybe that is caused by the fact that no solid conclusions are drawn in the paper.
Page 6, line 162, etc. This refers to the comment that process complexity is not reducing uncertainty: it should not be the goal to have all models agree with each other; model diversity should be the goal. Increased 'uncertainty' comes with the territory of increased complexity. Understanding this 'uncertainty', i.e., model diversity, should be the goal. The community will never get to a place where all models agree, nor should it.
Page 8: the reason behind the differences between CMIP-modelled dust trends and observational evidence lies in the fact that most, if not all, CMIP models are not coupled to dynamic terrestrial/vegetation models and are thus missing many feedbacks that lead to changes in dust emissions. As surface-process modeling on CMIP time scales is extremely difficult, hybrid approaches could be investigated to get more interactions between emissions and ESM processes included in future CMIP experiments. This goes beyond dust and is true for many other emission sources.
The radiative forcing paragraph: what is new here that has not already been summarized in the papers cited in this section?
Conclusion no. 1: using prescribed SSTs is AMIP, which is already part of CMIP.
In summary, I feel this paper can't be fixed by adding a few more aspects to the chapters; it needs much more holistic ideas on how to push forward and eventually a complete rewrite.