An introduction to InScAPE

Three-dimensional volume rendering of simulated cumulus clouds at the JOYCE site on 5 June 2013

The research group for Integrated Scale-Adaptive Parameterization and Evaluation (InScAPE) of Prof. Roel Neggers aims to develop scale-adaptive parameterizations of small-scale turbulent/convective processes and clouds for larger-scale models, and to constrain them with relevant measurements obtained at permanent meteorological “supersites”. In addition, we investigate the role of boundary layer clouds in a larger context, including their coupling to the Earth's surface and the role they play in global climate change.

Parameterization

General Circulation Models (GCMs) used for numerical weather prediction and climate simulation generally consist of a set of primitive equations describing atmospheric flow that are discretized in time and space. The current efficiency of supercomputers only allows spatial resolutions on the order of kilometers (weather prediction) or tens of kilometers (climate simulation). As a result of this limitation, many atmospheric processes remain unresolved, including turbulence, convection, clouds and precipitation. The impact of these subgrid-scale processes on the larger-scale circulation and climate therefore has to be represented through parameterization (see the image on the right for a classic example from the literature). The evaluation and improvement of such parameterization schemes have been an active research field ever since the start of operational numerical weather forecasting in the middle of the last century. Recent results in climate science have rekindled scientific interest in the parameterization of low-level clouds in climate models, by highlighting their important contribution to the current uncertainties about future climate change.
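
To illustrate what representing a subgrid process means in practice, the sketch below applies a simple eddy-diffusivity closure to a one-dimensional temperature column. The scheme and all numbers are purely illustrative, not taken from any operational model:

```python
def eddy_diffusion_tendency(theta, dz, K):
    """Parameterized tendency d(theta)/dt = d/dz( K d(theta)/dz ).

    theta : potential temperature on a 1-D column grid [K]
    dz    : grid spacing [m]
    K     : eddy diffusivity [m^2/s], a tunable closure constant
    Boundary fluxes are set to zero for simplicity.
    """
    n = len(theta)
    # turbulent flux at cell interfaces: F = -K * d(theta)/dz
    flux = [0.0] * (n + 1)
    for i in range(1, n):
        flux[i] = -K * (theta[i] - theta[i - 1]) / dz
    # the flux divergence gives the subgrid tendency in each cell
    return [-(flux[i + 1] - flux[i]) / dz for i in range(n)]

# toy column: a sharp step in theta that subgrid mixing should smooth out
theta = [300.0, 300.0, 302.0, 302.0]
tend = eddy_diffusion_tendency(theta, dz=25.0, K=10.0)
```

The scheme warms the cold side of the step and cools the warm side while conserving the column integral, which is exactly the behavior a GCM expects from a mixing parameterization.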

Simulation

Paraview visualization of DALES clouds in Arctic conditions during the M-PACE case (Period B). The cloud condensate is visualized using volume rendering, with the coloring reflecting the condensate amount (g/kg). The precipitation in the domain is visualized as a solid green contour (at 10⁻⁶ g/kg). The white box represents the simulation domain, of size 12.6 × 12.6 × 5 km.

To achieve its science goals the InScAPE group makes use of a hierarchy of atmospheric models, which can be run on platforms ranging from simple workstations to supercomputers. One of the main workhorse models is the Large-Eddy Simulation (LES) model, which simulates the atmospheric flow in a limited domain (~10 km) at high resolution (~25 m). These resolutions are fine enough to resolve atmospheric phenomena such as turbulence and convection, including the associated clouds like cumulus and stratocumulus. Processes at smaller scales, including small-scale turbulence and cloud microphysics, are still parameterized. LES has emerged as an important research tool over recent decades, as it provides four-dimensional (space and time) fields of many relevant atmospheric variables. In practice such fields are still hard to obtain from instrumentation, so LES can act as a “virtual laboratory” for investigating a phenomenon of interest. Such information is essential for the effective evaluation and improvement of parameterizations for large-scale models.
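
A quick back-of-envelope calculation shows why such simulations require serious computing resources. The numbers below are illustrative, loosely following the domain dimensions quoted above:

```python
# Back-of-envelope grid size for an LES configuration:
# a 12.6 km wide, 5 km deep domain at 25 m grid spacing
# (vertical spacing assumed uniform here for simplicity).
Lx = 12_600.0   # horizontal domain size [m]
Lz = 5_000.0    # domain depth [m]
dx = 25.0       # grid spacing [m]

nx = int(Lx / dx)      # points per horizontal direction
nz = int(Lz / dx)      # vertical levels
n_cells = nx * nx * nz # total number of grid cells
```

With 504 × 504 × 200 points this already exceeds fifty million cells, each carrying several prognostic variables per time step.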

Several LES codes are operational within the InScAPE working group. These include the Dutch Atmospheric Large-Eddy Simulation (DALES) model and the LES version of the Icosahedral Non-hydrostatic model (ICON) as developed by DWD and MPI-M. Both codes have been thoroughly tested for a range of prototype situations, and have participated in various model intercomparison studies.

More details of our LES codes can be found in the overview of models.

Parameterization Testbeds

An overview of the key ingredients in a parameterization testbed. From left to right: Clouds and precipitation in an LES, the 200m high meteorological tower at Cabauw in The Netherlands, a radiosonde, the MIRA cloud radar at JOYCE, and a schematic illustration of a Single Column Model (SCM).

An important tool in our research strategy is the so-called “parameterization testbed”: a platform where model and observational datastreams come together and can easily be intercompared. The aim is to facilitate the parameterization development process by confronting models with data in a structured and informed manner. Such evaluation can build confidence in the realism of a model simulation, while the models in turn help to gain insight into processes acting in nature that are not yet fully understood. In this synergy, LES models act as a virtual laboratory, Single-Column Model (SCM) simulations are used to better understand and improve parameterizations for GCMs, and GCM output is used to drive the process models but also serves as a background reference. A relatively new use of LES is to use its three-dimensional domain to virtually test measurement strategies prior to their deployment in the field during observational campaigns.
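
In its simplest form, confronting a model datastream with an observational one amounts to computing summary statistics over matched samples. A minimal sketch, with purely hypothetical variable names and numbers:

```python
import math

def bias(model, obs):
    """Mean model-minus-observation difference over matched samples."""
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    """Root-mean-square error over matched samples."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

# hypothetical co-located time series, e.g. SCM vs. tower 2-m temperature [K]
model_t2m = [284.1, 285.0, 286.2, 287.5]
obs_t2m   = [283.9, 284.6, 286.0, 287.1]

t2m_bias = bias(model_t2m, obs_t2m)
t2m_rmse = rmse(model_t2m, obs_t2m)
```

Bias exposes a systematic model drift while RMSE also captures scatter; a testbed typically tracks both, per variable and per height, across long periods of continuous supersite data.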

The InScAPE parameterization testbed is described in more detail on the testbed info page.

Current Research

EDMF We are continuously developing and improving the EDMF (Eddy Diffusivity Mass Flux) parameterization towards a bin macrophysics scheme, which is currently implemented and tested in an LES framework.
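
The EDMF approach decomposes a turbulent flux into a local eddy-diffusivity part and a non-local mass-flux part carried by organized updrafts. A minimal sketch of that decomposition, with all numbers illustrative:

```python
def edmf_flux(grad_phi, K, M, phi_updraft, phi_mean):
    """Eddy-Diffusivity Mass-Flux (EDMF) decomposition of a turbulent flux:

        w'phi' = -K * d(phi)/dz  +  M * (phi_updraft - phi_mean)

    The first term models small-scale diffusive mixing, the second the
    transport by organized updrafts (M is a volume flux in m/s here).
    """
    return -K * grad_phi + M * (phi_updraft - phi_mean)

# illustrative numbers for a moisture flux in a cumulus-topped boundary layer
flux = edmf_flux(grad_phi=-1.0e-6,    # d(qt)/dz [kg/kg per m], drying upward
                 K=30.0,              # eddy diffusivity [m^2/s]
                 M=0.03,              # updraft volume flux [m/s]
                 phi_updraft=9.0e-3,  # total water qt in the updraft [kg/kg]
                 phi_mean=8.0e-3)     # grid-mean qt [kg/kg]
```

Both terms contribute an upward moisture transport here: the gradient term because the mean profile dries with height, the mass-flux term because the updraft is moister than its environment.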

Cloud scheme Within the HDCP2 project we currently work on the development and implementation of a PDF cloud parameterization for a global climate model.
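
A statistical (PDF) cloud scheme diagnoses cloud fraction as the part of an assumed subgrid distribution of total water that exceeds saturation. A minimal sketch assuming a Gaussian PDF, shown for illustration only and not representing the HDCP2 implementation:

```python
import math

def gaussian_cloud_fraction(qt_mean, qt_sigma, q_sat):
    """Cloud fraction from an assumed Gaussian PDF of total water qt:
    the fraction of the distribution that lies above saturation q_sat."""
    s = (qt_mean - q_sat) / (math.sqrt(2.0) * qt_sigma)
    return 0.5 * math.erfc(-s)

# if the grid-mean qt sits exactly at saturation, half the PDF is cloudy
cf = gaussian_cloud_fraction(qt_mean=8.0e-3, qt_sigma=0.5e-3, q_sat=8.0e-3)
```

The scheme produces partial cloudiness even when the grid mean is subsaturated, which is precisely what a coarse-resolution climate model cannot resolve explicitly.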

Greyzone A major challenge for parameterization development is the increasing resolution of NWP models, which starts to resolve processes important for cloud formation and organization, the so-called “Greyzone”. Parameterizations have to take the model resolution into account and find a consistent way to reduce their own activity accordingly.

Cloud size distribution Understanding patterns in shallow cumulus cloud populations is essential for the development of scale-aware cloud schemes. The organization of these clouds is studied using cloud size distributions and nearest-neighbour spacings.
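
As a toy illustration of how both statistics can be derived from a binary cloud mask (e.g. an LES cloud field projected onto the horizontal plane; the tiny mask below is synthetic):

```python
import math

def label_clouds(mask):
    """Label 4-connected cloudy regions in a binary 2-D mask;
    return a list of clouds, each a list of (row, col) pixels."""
    nrow, ncol = len(mask), len(mask[0])
    seen = [[False] * ncol for _ in range(nrow)]
    clouds = []
    for r0 in range(nrow):
        for c0 in range(ncol):
            if mask[r0][c0] and not seen[r0][c0]:
                seen[r0][c0] = True
                stack, pixels = [(r0, c0)], []
                while stack:  # flood fill over 4-connected neighbours
                    i, j = stack.pop()
                    pixels.append((i, j))
                    for ni, nj in ((i+1, j), (i-1, j), (i, j+1), (i, j-1)):
                        if 0 <= ni < nrow and 0 <= nj < ncol \
                                and mask[ni][nj] and not seen[ni][nj]:
                            seen[ni][nj] = True
                            stack.append((ni, nj))
                clouds.append(pixels)
    return clouds

def nearest_neighbour_spacings(clouds):
    """Distance from each cloud's centroid to its nearest neighbour [pixels]."""
    cen = [(sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
           for c in clouds]
    return [min(math.dist(cen[i], cen[j])
                for j in range(len(cen)) if j != i)
            for i in range(len(cen))]

# synthetic cloud mask with two clouds (sizes 3 and 1)
mask = [[1, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 0, 0, 1]]
clouds = label_clouds(mask)
sizes = sorted(len(c) for c in clouds)        # input to a size distribution
spacings = nearest_neighbour_spacings(clouds)
```

Applied to many LES or satellite scenes, histograms of `sizes` yield the cloud size distribution, while the statistics of `spacings` relative to a random placement indicate clustering or regularity in the cloud field.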

High Performance Computing (HPC)

To perform the various model simulations we make use of HPC facilities. These include the CHEOPS cluster at the Regional Computing Center of the University of Cologne (RRZK), the JURECA cluster at the Jülich Supercomputing Centre (JSC), and the MISTRAL cluster at the Deutsches Klimarechenzentrum (DKRZ).

start.txt · Last modified: 2018/10/09 14:19 by neggers