Presenter: Liam Bindle; Washington University in St. Louis

We recently added a grid refinement capability to the GEOS-Chem High-Performance (GCHP) model that uses grid-stretching to refine the model grid in a user-defined region. Grid-stretching is a flexible technique because it is controlled by simple runtime parameters and, unlike conventional nested grids, does not require lateral boundary conditions. In this talk, I will discuss grid-stretching along with considerations for using stretched grids in simulations of atmospheric chemistry.
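For context, stretched cubed-sphere grids of this kind are typically built on the Schmidt (1977) conformal transform, which repositions grid latitudes before the refined region is rotated onto the target point. Below is a minimal illustrative sketch in Python; the exact sign conventions and parameter names in GCHP may differ.

```python
import numpy as np

def schmidt_latitude(lat, stretch_factor):
    """Schmidt (1977) conformal transform of latitude (radians).

    For stretch_factor > 1, uniformly spaced grid points crowd
    toward one pole, which is then rotated onto the user's target
    point to refine the grid there. Illustrative only; GCHP's
    actual implementation and sign conventions may differ.
    """
    c2 = stretch_factor ** 2
    d = (1.0 - c2) / (1.0 + c2)
    return np.arcsin((d + np.sin(lat)) / (1.0 + d * np.sin(lat)))

# Uniform latitudes become denser near the refined pole:
lats = np.linspace(-np.pi / 2, np.pi / 2, 7)
print(np.degrees(schmidt_latitude(lats, stretch_factor=3.0)))
```

With a stretch factor of 3, the 30-degree spacing near the refined pole contracts to roughly 10 degrees, which is the refinement the runtime parameters control.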

Presentation Slides

Watch webinar on YouTube

Presenter: David Novak, Director, NOAA/NWS Weather Prediction Center

The impacts from extreme precipitation are deadly, damaging, and increasing in a warming climate. Knowing when it will rain and how much will fall is crucial to every person and business in the U.S. Further, precipitation processes are an integration of many atmospheric processes and have direct impacts on the ocean, ecosystems, hydrology, and the cryosphere. Thus, precipitation is a unifying theme across the Weather, Water, and Climate communities. However, partner expectations of accuracy, specificity, and lead time often exceed current capabilities. Meanwhile, NOAA models (global models in particular) have seen only marginal improvement in precipitation skill (~15%) over the past two decades. In response, NOAA has developed a strategy for a decadal effort to improve precipitation forecasts (from mesoscale weather to seasonal timescales), called the NOAA Precipitation Prediction Grand Challenge.

The goal of this effort is to dramatically improve precipitation forecasts in terms of accuracy, extent in time, and reliability. The challenge demands investment across the value chain: from basic understanding of precipitation processes and predictability limits, to enhanced observations, data assimilation, improved models, and post-processing and tools for the human forecaster, culminating in understandable, actionable, and equitable services informed by stakeholder engagement. Thus, it is a grand Research-to-Operations (R2O) challenge. This initiative and early studies will be described, with attention focused on opportunities for community involvement in R2O.

Presentation Slides

Watch webinar recording on YouTube

Presenter: V. Balaji; Princeton University and NOAA/GFDL

The advent of digital computing in the 1950s sparked a revolution in the science of weather and climate. Meteorology, long based on extrapolating patterns in space and time, gave way to computational methods in a decade of advances in numerical weather forecasting. Those same methods also gave rise to computational climate science, which studies the behaviour of the same numerical equations over intervals much longer than weather events and under changes in external boundary conditions. Several subsequent decades of exponential growth in computing power have brought us to the present day, where models continue to grow in resolution and complexity, capturing many small-scale phenomena with global repercussions and ever more intricate feedbacks in the Earth system.

The current juncture in computing, seven decades later, heralds an end to what is called Dennard scaling, the physics behind ever smaller computational units and ever faster arithmetic. This is prompting a fundamental change in our approach to the simulation of weather and climate, potentially as revolutionary as that wrought by John von Neumann in the 1950s. One approach could return us to an earlier era of pattern recognition and extrapolation, this time aided by computational power. Another approach could lead us to insights that continue to be expressed in mathematical equations. In either approach, or any synthesis of those, it is clearly no longer the steady march of the last few decades, continuing to add detail to ever more elaborate models. In this prospectus, we attempt to show the outlines of how this may unfold in the coming decades, a new harnessing of physical knowledge, computation, and data.

Presentation Slides

Watch webinar recording on YouTube

Presenter: Michael Michaud; University of Delaware, NOAA Lapenta Intern 2021

Over the last several decades, community modeling has become more prevalent within Earth system science and is seen as a way to tackle wicked forecasting problems. Most community modeling efforts focus on the technical infrastructure of designing models, overlooking the social infrastructure needed to nurture and connect the people developing them. This presentation examines key stakeholder perceptions of what a sense of community means within a community model. The key stakeholders interviewed included individuals involved with the Unified Forecast System (UFS) and the Earth Prediction Innovation Center (EPIC) across various sectors of the weather, climate, and water enterprise. Drawing on interview data and on theoretical and practical frameworks, this presentation makes recommendations on how EPIC can cultivate the necessary social infrastructure through structures such as a Community of Practice that continuously engage members and develop a sense of community. To harness the full power of a technical system, members need a sense of community and sustained engagement.

Presentation Slides

Watch webinar recording on YouTube

Presenter: Tara Jensen – NCAR/RAL and DTC

The enhanced Model Evaluation Tools (METplus) is a comprehensive verification and diagnostic software package for predictions across a wide range of temporal and spatial scales, available to Unified Forecast System (UFS) developers and users via the Developmental Testbed Center (DTC). Recent emphasis has been on addressing the needs of the UFS Research to Operations (R2O) project, the findings from the 2021 DTC UFS Evaluation Metrics Workshop, the DTC Testing and Evaluation activities, and numerous other projects. This talk will review what has been added, what is in the development pipeline, and some of the challenges the METplus team faces in ensuring that all of the additions can move into operations successfully.

Presentation Slides

Watch webinar recording on YouTube

Presenter: Sam Ephraim – 2021 Lapenta Intern in NOAA/WPO/EPIC

The goal of the Earth Prediction Innovation Center (EPIC) is to enable the most accurate and reliable operational numerical weather prediction (NWP) forecast model in the world. EPIC will achieve this goal through community engagement, where students, researchers, professors, and other community members can collaborate to develop open-source code for the Unified Forecast System (UFS). One way to spark the interest of new community members is through Graduate Student Tests (GSTs): usability tests that entail running, modifying, rerunning, and comparing outputs of the UFS code and its applications, as in the sketch below.
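To give a concrete picture of the "compare" step, here is a toy sketch of diffing one field between two model runs. The file names, the variable name "tmp", and the use of netCDF output are hypothetical placeholders, not the actual GST procedure.

```python
import numpy as np
from netCDF4 import Dataset  # assumes netCDF-format model output

# Toy version of the GST "run, modify, rerun, compare" step:
# difference one field between a control run and a modified run.
# File names and the variable name "tmp" are hypothetical.
with Dataset("control.nc") as ctl, Dataset("experiment.nc") as exp:
    t_ctl = ctl.variables["tmp"][:]
    t_exp = exp.variables["tmp"][:]

print("max abs difference:", np.abs(t_exp - t_ctl).max())
```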

To increase the accessibility of the UFS GSTs, cloud versions were developed as part of a William M. Lapenta internship project at NOAA. The cloud-based GSTs include documentation with instructions for running containerized versions of the UFS usability tests on the Amazon Web Services (AWS) platform, new Python plotting scripts to visualize results, and an FAQ document. Running the GSTs on the cloud increases accessibility because community members without access to high-performance computing (HPC) systems can now run the GSTs quickly and cheaply from any device with an internet connection. Cloud-based GSTs are expected to increase the number of people running the UFS. This will in turn grow the pool of people contributing to the UFS, helping NOAA develop the most accurate and reliable operational numerical forecast model in the world.

This talk will discuss the GSTs in depth and explain some of the challenges of deploying them in the AWS cloud. Performance metrics from running the GSTs in the cloud, along with opportunities for future engagement with the UFS, will also be discussed.

Presentation Slides

Watch webinar recording on YouTube

Presenter: Clara Draper, NOAA/OAR, PSL, Boulder, Colorado

The land data assimilation (DA) used in NOAA's global numerical weather prediction (NWP) system is much less advanced than that used at other major international NWP centers, and as part of the GFSv17 upgrade we are developing a new state-of-the-art land data assimilation system. This seminar will review the planned design of the new land data assimilation system and progress towards its development and implementation.

The first priority for the new land data assimilation system is to replace the current snow depth analysis. The current analysis is quite outdated and consists of a simple rule-based merging of an externally generated snow depth product. This is being replaced with an Optimal Interpolation (OI) snow depth analysis, based directly on the methods used at other NWP centers. Tests of the snow depth OI with GFSv16 (with the current land surface model, Noah) showed that it significantly improves the model-simulated snow depth, while generating small but consistent improvements to the simulated atmospheric temperatures over snow-affected land. Based on these tests, we are preparing the snow depth OI for use in GFSv17. This has included adapting the OI to the Noah-MP land surface model (which will replace Noah in GFSv17), and implementing the OI within the JCSDA's JEDI data assimilation platform.

The second priority of the new land data assimilation system is to introduce a soil moisture and soil temperature analysis. Currently, NOAA does not apply a soil analysis in our global NWP systems, while other centers have done so for decades. For the soil analyses, we are developing a Local Ensemble Transform Kalman Filter (LETKF) assimilation, initially based on assimilation of screen-level temperature (T2m) and specific humidity (q2m). Early tests with GFSv16 using the LETKF to update the model soil temperature from T2m observations show very small improvements in the subsequent simulations of T2m, with negligible effect above the surface. Additionally, the impact of the assimilation is limited by the difficulty of obtaining sufficient ensemble spread without introducing biases into the ensemble mean. Work is ongoing to address this issue.
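For readers unfamiliar with OI, the analysis update at the heart of such a scheme has a standard closed form. Below is a minimal, generic sketch, not NOAA's implementation; the matrices and values are illustrative.

```python
import numpy as np

def oi_update(xb, y, H, B, R):
    """Generic optimal interpolation (OI) analysis update:
        xa = xb + K (y - H xb),  K = B H^T (H B H^T + R)^{-1}
    xb : background state (n,)     y : observations (p,)
    H  : linear obs operator (p, n)
    B  : background error covariance (n, n)
    R  : observation error covariance (p, p)
    """
    innovation = y - H @ xb                       # obs-minus-background
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return xb + K @ innovation

# Toy example: two model snow depths, one observation of the first.
xb = np.array([0.30, 0.50])            # background snow depth (m)
y = np.array([0.40])                   # observed snow depth (m)
H = np.array([[1.0, 0.0]])             # observe the first grid cell
B = np.array([[0.010, 0.005],
              [0.005, 0.010]])         # correlated background errors
R = np.array([[0.004]])                # observation error variance
print(oi_update(xb, y, H, B, R))       # analysis pulled toward the obs
```

Because B carries spatial correlations, the single observation also nudges the second, unobserved grid cell, which is what makes OI a spatial analysis rather than a point-by-point correction.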

Presentation Slides

Watch webinar recording on YouTube

Presenter: Lisa Bengtsson – University of Colorado, NOAA/ESRL, Boulder, Colorado

Abstract: The weather in the tropics is important for the Earth's atmospheric circulation pattern; therefore, correctly modeling the seasonal and year-to-year variations in this region is crucial for improving predictions of weather and climate across the world. Weather and climate variability in the tropics is primarily driven by equatorial waves interacting with smaller-scale atmospheric convection. These 'convectively coupled' equatorial waves are important for global weather prediction because a better description of the weather in the tropics leads to a better description of the weather in other places, such as the United States. Convectively coupled equatorial waves have been a major modeling challenge from weather to climate scales because the onset and propagation of these waves depend on processes that are only partially accounted for in global weather prediction systems. In this talk I will present recent research that highlights some key aspects needed in the NOAA GFS description of atmospheric convection to improve the interaction between small-scale physics and large-scale waves. These aspects include improvements in moisture-convection coupling, stochasticity, and sub-grid (and cross-grid) convective organization feedbacks. I will also address aspects related to representing cumulus convection in the so-called "convective grey zone" regime, and discuss the scale-adaptive representation of cumulus convection needed to prepare the GFS for higher global resolution.
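One published route to representing stochastic, sub-grid convective organization couples a cellular automaton (CA) to the convection scheme. The toy sketch below captures only the flavor of that idea; the neighborhood rule, seeding probability, and feedback are invented for illustration and are not the GFS scheme.

```python
import numpy as np

def ca_step(cells, rng, seed_prob=0.02):
    """One update of a toy stochastic cellular automaton.

    Active cells stand in for organized sub-grid convective
    elements: a cell turns on or stays on if enough neighbors
    are active, and new cells are seeded at random (the
    stochastic part). Rules here are invented for illustration.
    """
    # Count the 8 neighbors of every cell (periodic boundaries).
    n = sum(np.roll(np.roll(cells, i, axis=0), j, axis=1)
            for i in (-1, 0, 1) for j in (-1, 0, 1)) - cells
    born = (cells == 0) & (n >= 3)
    survive = (cells == 1) & (n >= 2)
    seeded = rng.random(cells.shape) < seed_prob
    return (born | survive | seeded).astype(int)

rng = np.random.default_rng(1)
cells = (rng.random((64, 64)) < 0.05).astype(int)
for _ in range(20):
    cells = ca_step(cells, rng)
# The active fraction could feed back on a convection scheme's
# triggering or organization term.
print("organized fraction:", cells.mean())
```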

Presentation Slides

Watch webinar on YouTube

Presenter: Sue Ellen Haupt, National Center for Atmospheric Research (NCAR)

Abstract: This talk will review how AI/ML has been advancing our capabilities to model the weather and climate. Modern forecast systems benefit from blending AI with physics-based approaches to optimize forecast accuracy, speed, and applicability. First, we'll discuss ML postprocessing, which allows us to merge model output with historical observations to improve real-time forecasts. This widely applied method has been a boon to forecast improvement for a variety of applications. We'll review NCAR's Dynamic Integrated Forecast System (DICast) as an example of a successful postprocessing system, and describe how we blend it and other AI methods with numerical models for renewable energy applications, among others. Second, ML replacements and emulations of model physical parameterizations have the potential not only to greatly speed computations but also, when built with observational data, to provide more accurate solutions. Work on multiple parameterizations has shown major potential to advance modeling capabilities, including a surface layer parameterization that outperforms standard methods even at sites where it was not trained. Finally, using AI/ML to produce model output directly is beginning to show real potential. One example uses deep learning to provide high-resolution features conditioned on a coarser simulation. After being trained on high-resolution model output, this approach can produce plausible high-resolution fields even in regions where it was not originally trained. These applications suggest a path toward a future fully learned AI/ML modeling capability. Advances of this kind are in the midst of revolutionizing how we model the atmosphere.
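As a concrete picture of ML postprocessing in its simplest form, the sketch below fits a linear, MOS-style correction to historical forecast/observation pairs and applies it to a new raw forecast. The data are synthetic, and DICast itself blends many models and methods far beyond this.

```python
import numpy as np

# MOS-style linear postprocessing: fit a correction to historical
# forecast/observation pairs, then apply it to a new raw forecast.
# All data here are synthetic; DICast blends many models/methods.
rng = np.random.default_rng(0)
raw = rng.normal(15.0, 5.0, 500)                   # past raw forecasts (e.g., 2-m T)
obs = 0.8 * raw + 2.0 + rng.normal(0.0, 1.0, 500)  # matching observations

A = np.column_stack([raw, np.ones_like(raw)])  # [forecast, bias] design matrix
coef, *_ = np.linalg.lstsq(A, obs, rcond=None)

new_raw = 20.0
corrected = coef[0] * new_raw + coef[1]        # postprocessed forecast
print(f"raw {new_raw:.1f} -> corrected {corrected:.1f}")
```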

Presentation Slides

Watch webinar on YouTube

Presenter: Hendrik Tolman, NOAA / NWS / STI

Abstract: This is a repeat of a presentation I gave at the Isaac Newton Institute at the University of Cambridge on October 26. It will touch on multi-disciplinary community modeling, the use of AI and machine learning (ML), and collaboration with the private sector for the UFS, using the history of the WAVEWATCH III wind wave model as an example.

Presentation Slides

Watch webinar on YouTube