Environment Counts | National Academies propose a national strategy for advancing climate observation and modeling
Author: Geoff Zeiss – Published At: 2012-09-11 13:00 – (837 Reads)
Weather and climate modeling is one of the most computationally intensive applications of computer technology. In the U.S. there are several large global climate modeling projects and many smaller groups running regional climate models. The National Academies are proposing a unification of these efforts in which software, data standards, tools, and model components would be shared among modeling groups across the country. This would not mean a single model, but a common modeling framework in which institutions could develop their own methodologies while sharing data, output, software, and components in an interoperable environment. A key recommendation is to continue and expand long-term recording of observational data. National Academies: National Strategy for Advancing Climate Modeling 2012
The National Academies (National Academy of Sciences, National Academy of Engineering, Institute of Medicine, National Research Council) have released a report that makes important recommendations that could provide a basis for a national strategy for advancing climate modeling.
The key recommendations include:
Sustained experimental observation – long-term recording of observational data on temperature, precipitation, clouds, snow and ice, and ecosystem change is critical for understanding processes that drive the climate system. It will be important to maintain existing long-term datasets of essential climate variables, and to initiate innovative new climate measurements that help characterize Earth system processes.
Observation and calibration – Developing models that function across both weather and climate timescales would allow the testing and calibration of climate models on weather timescales where there is more observational data.
Common modeling framework – different climate modeling institutions could pursue their own methodologies but work within a common modeling framework in which software, data standards, tools, and model components are shared by all major modeling groups nationwide.
Parallel computing – more computing power will be required, and this will most likely involve connecting huge numbers of CPUs in parallel. This will also require ensuring that climate modeling software is compatible with the new hardware platforms.
Shared software infrastructure – The U.S. supports several climate models each with components assembled with slightly different software and data output standards. If all U.S. climate models employed a single software system, it could simplify testing and migration to new computing hardware, and allow scientists to compare and interchange climate model components, such as land surface or ocean models.
Usability – It is important to ensure that the output generated by climate models is understandable to all users. Developing a national education and accreditation program to train climate model interpreters would help ensure a high standard in how climate model output is interpreted and conveyed.
Dedicated computing resources – a two-pronged approach that involves the continued use and upgrading of existing climate-dedicated computing resources at modeling centers, together with research on how to exploit new computer hardware systems that will be developed over the next 10 to 20 years.
Recruiting researchers – The number of climate model developers is not growing in the United States, and ways need to be found to attract more high-caliber computer and climate scientists to become climate model developers.
National information technology for “big climate data” – Ever larger amounts of climate model and observational data are being generated. Enabling broad access to these large volumes of data for researchers, data users, and decision makers is a critical challenge. The National Academies are recommending that the U.S. develop a national information technology infrastructure to support data display, visualization, and analysis, for both experts and the lay user community.
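The "observation and calibration" recommendation above rests on a simple idea: on weather timescales there are enough observations to score a model's output directly. A minimal sketch of that kind of check, using a standard root-mean-square-error comparison (the numbers and variable names here are illustrative, not from the report):

```python
import math

def rmse(model, obs):
    """Root-mean-square error between model output and observations."""
    assert len(model) == len(obs)
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(model))

# Hypothetical 5-day temperature forecasts (deg C) vs. station observations
forecast = [12.1, 13.4, 11.8, 14.0, 15.2]
observed = [11.9, 13.0, 12.5, 14.3, 14.8]

print(round(rmse(forecast, observed), 3))  # -> 0.434
```

Running the same scoring against successive model versions is how a weather-timescale record can serve as a calibration target for a climate model.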
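The "shared software infrastructure" recommendation turns on making model components, such as land surface or ocean models, interchangeable behind a common interface. A toy sketch of the pattern (the interface and the slab-ocean physics here are invented for illustration; they are not the report's or any real framework's design):

```python
from abc import ABC, abstractmethod

class OceanComponent(ABC):
    """Hypothetical shared interface an ocean component would implement."""

    @abstractmethod
    def step(self, surface_temp: float, dt_hours: float) -> float:
        """Advance one time step; return sea-surface temperature (deg C)."""

class SimpleSlabOcean(OceanComponent):
    """Toy component: relaxes slowly toward the atmospheric forcing."""

    def __init__(self, sst: float = 15.0):
        self.sst = sst

    def step(self, surface_temp: float, dt_hours: float) -> float:
        self.sst += 0.01 * dt_hours * (surface_temp - self.sst)
        return self.sst

def run_coupler(ocean: OceanComponent, hours: int = 24) -> float:
    """The coupler sees only the interface, so any conforming ocean
    component from any modeling group could be swapped in."""
    sst = 0.0
    for _ in range(hours):
        sst = ocean.step(surface_temp=20.0, dt_hours=1.0)
    return sst

print(round(run_coupler(SimpleSlabOcean()), 2))
```

Because the coupler depends only on the interface, a different institution's ocean model could replace `SimpleSlabOcean` without changes to the rest of the system, which is the interoperability the report is after.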