National Review on Physically-based Modelling and Simulation of the Atmosphere and Hydrosphere

to be held

25-26 February 2008

at the

Finnish Meteorological Institute

in Helsinki

bringing together researchers, research administrators, public or commercial service providers and funding agencies with an interest in physically-based modelling and simulation as a means for studying, monitoring, predicting or managing physical, chemical, and biological aspects of the atmosphere and hydrosphere.




The meeting will be organized in a sequence of thematic sessions, each containing contributed papers and a general discussion. The themes are chosen so as to highlight issues of common interest to the modelling community and to eliminate traditional boundaries between disciplines and application areas.


1. Modelling methods and model development

Devoted to numerical methods, modelling and parameterization of individual processes, design of entire models, and coupling of models.


An overview of the global climate modelling activities in Finland. 

Heikki Järvinen, Finnish Meteorological Institute

In 2004, Finnish Earth System Modelling (ESM) partners (Finnish Meteorological Institute, University of Helsinki, University of Kuopio, Finnish Institute of Marine Research) joined the COSMOS ESM network led by the Max-Planck-Institute for Meteorology. The aim is to feed Finnish ESM expertise in selected areas into the COSMOS ESM. Several implementations have been completed, such as the stochastic treatment of subgrid-scale cloud-radiative effects and SALSA, a size-segregated microphysical aerosol module for large-scale applications. Current work areas and plans are presented.



Tests of Monte Carlo Independent Column Approximation (McICA) in ECHAM5.

Petri Räisänen, Finnish Meteorological Institute

The typical horizontal resolution of atmospheric general circulation models (GCMs) used in climate modeling is 200-300 km, but cloud properties vary substantially at scales smaller than this. These unresolved cloud structures can affect radiative fluxes systematically, even when averaged over the grid column of the GCM.

The Monte Carlo Independent Column Approximation (McICA) was recently  introduced as a new approach for parametrizing broadband radiative  fluxes in GCMs.  While conventional GCM radiation schemes embed the  assumptions about unresolved cloud structure within the radiative  transfer solver, McICA separates the description of cloud field  structure from the radiative transfer solver by dividing the cloud field into a set of subcolumns. This provides much more flexibility in  the description of cloud field properties, and thereby a possibility  to reduce systematic errors. However, the radiative fluxes and heating  rates produced by McICA contain random errors, or noise.

In this talk I introduce the McICA method and give a brief description of tests of McICA in the ECHAM5 atmospheric GCM.
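
As a schematic illustration of the sampling idea (a toy sketch, not the ECHAM5 implementation; all names and array shapes here are invented), McICA pairs each spectral g-point of the radiation code with one randomly chosen subcolumn instead of averaging over all subcolumns, which is what makes the estimate cheap and unbiased but noisy:

    import numpy as np

    def mcica_broadband_flux(subcolumns, gpoint_weights, monochromatic_flux,
                             rng=np.random.default_rng()):
        """Schematic McICA sampling for one GCM grid column.

        subcolumns         : cloud profiles generated for this column (hypothetical structure)
        gpoint_weights     : quadrature weights of the spectral g-points (sum to 1)
        monochromatic_flux : function (g, subcolumn) -> flux for that g-point and cloud profile
        """
        n_sub = len(subcolumns)
        flux = 0.0
        for g, w in enumerate(gpoint_weights):
            # One random subcolumn per g-point: the source of McICA's random,
            # but on average unbiased, noise in fluxes and heating rates.
            k = rng.integers(n_sub)
            flux += w * monochromatic_flux(g, subcolumns[k])
        return flux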



Development of aerosol and cloud modules for large scale applications

Sami Romakkaniemi, University of Kuopio

Aerosol and cloud schemes are an important part of regional and climate models. The representation of aerosol particles and cloud droplets in such large-scale applications is always a compromise between the detail of the description and the computational efficiency. Because of this, many important processes, such as the formation of cloud droplets, are strongly parameterized in these models. However, to get all the relevant aerosol microphysics into large-scale applications, the level of parameterization needs to be decreased. With increasing computing resources this has already become possible, and in the future, improvement in the representation of aerosol-related processes is one of the most important single factors for improving the quality of climate forecasting.

We have developed a new Sectional Aerosol module for Large Scale Applications (SALSA) with multiple methods to reduce the computational burden of different aerosol processes, in order to optimize the model performance without losing physical features relevant to the climate. In test simulations the new module has been found to give good results when compared to more explicit models, with a reasonable increase in calculation time compared to the existing simple aerosol module used in the ECHAM5 general circulation model. The same philosophy as used in SALSA will also be extended to cloud droplet formation. A new explicit model will be developed that can be used to simulate cloud droplet formation in the case of complex aerosol mixtures, a process that is currently poorly represented in large-scale applications.

Do we need to account for lakes in climate and NWP modelling?

P. Samuelsson (SMHI), E. Kourzeneva (RSHU) and D. Mironov (DWD)

One of the most important issues in regional climate and weather prediction models is the
interaction of the atmosphere with the underlying surface. For decades, the interaction with
the land and sea surface has received much attention, but lakes have often been disregarded
or treated in a very simplistic way. The reason for this is of course that land and sea surfaces
dominate the surface of the earth, while lakes are only regionally important. In regions where
lakes represent a non-negligible fraction of the surface, their large thermal inertia, when
compared to the land surface, may cause them to have a substantial impact on the regional
climate. This is particularly the case in Fennoscandia, Russia and Canada.

In RCM and NWP modelling the lower boundary condition for the atmosphere with respect to
lakes must be described. The boundary condition is represented by the energy fluxes of
radiation, heat and momentum. Thus, the lake interior is really not of importance per se. As
long as the surface temperature (including ice) is well simulated the lake model can be made
simple. For climate simulations, a computationally cheap model is also of high priority. A
lake model that fulfils these criteria is FLake (see http://lakemodel.net and references therein).
FLake is a two-layer model based on a self-similar representation (assumed shape) of the
temperature profile in the mixed layer and in the thermocline. The model incorporates (i) a
flexible parameterisation of the evolving temperature profile, (ii) an advanced formulation to
compute the mixed-layer depth, including the equation of convective entrainment and a
relaxation-type equation for the depth of a wind-mixed layer, (iii) an improved module to
describe the vertical temperature structure of the thermally active layer of bottom sediments
and the interaction of the water column with bottom sediments, and (iv) a snow-ice module.
The ability of FLake to predict the temperature structure in lakes of various depths on diurnal
and seasonal time scales has been successfully tested against data through single-column
numerical experiments. Today FLake is implemented into the NWP model COSMO-LM
(DWD) and into the regional climate models RCA (SMHI) and CLM (GKSS). It is also on its
way into the NWP model HIRLAM.
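
For readers unfamiliar with the self-similarity (assumed-shape) concept, the idea can be written schematically as follows (a generic form; the exact shape function and prognostic equations used in FLake are documented at http://lakemodel.net):

    \[
    \theta(z,t) =
    \begin{cases}
      \theta_s(t), & 0 \le z \le h(t), \\
      \theta_s(t) - \bigl[\theta_s(t)-\theta_b(t)\bigr]\,
      \Phi\!\left(\dfrac{z-h(t)}{D-h(t)}\right), & h(t) < z \le D,
    \end{cases}
    \qquad \Phi(0)=0,\ \Phi(1)=1,
    \]

where \(\theta_s\) and \(\theta_b\) are the mixed-layer and bottom temperatures, \(h\) the mixed-layer depth, \(D\) the lake depth and \(\Phi\) a prescribed dimensionless shape function. The model then carries prognostic equations for \(\theta_s\), \(\theta_b\) and \(h\) rather than for a fully discretised temperature profile, which is what keeps it computationally cheap.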

We will present results based on simulations with the Rossby Centre regional climate model
RCA coupled to FLake. Two 30-year simulations with RCA-FLake set up over Europe have
been analysed for the period 1961-1990, using ERA40 as lateral and SST boundary conditions.
In the first simulation lakes were present (applying FLake), while in the second simulation all
lakes were replaced by open land. A comparison of the two simulations shows that the
presence of lakes has a warming effect on the climate for all seasons except spring. In cold
winter climates the warming effect during winter is explained by the fact that the ice-covered
period usually extends from midwinter until mid-spring. Thus, during the first half of the
winter the lakes are warmer than a corresponding open land area would be. During summer
the warming effect of lakes is due to a relatively warm lake surface temperature during night
time. The results also show that many small lakes (as in Southern Finland) act differently on
the summer climate than a few big lakes. Many small, and relatively warm, lakes enhance the
summer precipitation due to more evaporation while big, and relatively cool, lakes suppress
evaporation and consequently also the precipitation.

The answer to the title question is: yes, we should account for lakes in climate and NWP
modelling, at least in Northern Europe, where they make the surrounding mean temperature
climate warmer for most seasons.


Modelling of the Arctic atmosphere: state of the art and challenges

Timo Vihma, Finnish Meteorological Institute


HIRLAM EXPERIMENTS ON SURFACE ENERGY BALANCE ACROSS VATNAJÖKULL


Laura Rontu1, Friedrich Obleitner2, Stefan Gollvik3, Christoph Zingerle4, Sander Tijm5

1 Finnish Meteorological Institute, Helsinki, Finland
2 Institute of Meteorology and Geophysics, Innsbruck University, Austria
3 Swedish Meteorological and Hydrological Institute, Norrköping, Sweden
4 Central Institute for Meteorology and Geodynamics, Innsbruck, Austria
5 KNMI, De Bilt, the Netherlands

The skill of the High Resolution Limited Area Model (HIRLAM) in reproducing the near-surface atmospheric conditions across the Vatnajökull Ice Sheet in Iceland was investigated. The model-observation comparison study is based on a mesoscale glaciometeorological observation campaign performed during summer 1996. Fine-scale hydrostatic HIRLAM experiments are used for downscaling ERA40 analyses and for upper-air and surface data assimilation. The simulation results are compared to a subset of observations across Breidamerkurjökull, a southern outlet glacier of Vatnajökull. After the introduction of improvements suggested by a comparison of a reference run with observations, HIRLAM successfully simulates the surface energy balance and near-surface meteorological parameters. A proper description of the physical properties of the underlying surface turned out to be crucial. The results are valuable for forthcoming glaciological applications and for further improvement of operational mesoscale NWP in mountainous and high-latitude environments.



Parameterizations of snow and ice albedo

Roberta Pirazzini, University of Helsinki

The clear-sky diurnal evolution and the seasonal evolution of snow and ice albedo have been parameterized on the basis of broadband albedo measurements collected over the Antarctic continent and on the Bay of Bothnia, Sweden. The clear-sky parameterizations represent the hourly evolution of albedo due to the variation of the solar zenith angle and to snow metamorphism and sublimation during the day, and refreezing and/or crystal formation/precipitation during the night. The seasonal evolution of snow and ice albedo was found to be correlated with the surface temperature over the Antarctic continent, but not in the Bay of Bothnia, where albedo variations were mostly driven by changes in snow thickness. In the presence of strong contrasts in the surface albedo, the downwelling irradiance at one site is affected by the surface albedo of the neighbouring regions. A simple method to derive the broadband effective albedo under overcast conditions in regions with high spatial variability of the surface albedo is illustrated. The method accounts for the effect of multiple reflections between the surface and the cloud base, and can be applied in the marginal sea ice zone, or in patchy terrain with forests and snow-covered fields.
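
As an illustration of the multiple-reflection effect (a schematic textbook form, not necessarily the exact formulation of the study): with an area-averaged effective surface albedo \(\alpha_\mathrm{eff}\) and a cloud-base albedo \(\alpha_c\), the downwelling irradiance at the surface is enhanced by the geometric series of reflections between the surface and the cloud base,

    \[
    F^{\downarrow} \;=\; F^{\downarrow}_0 \sum_{n=0}^{\infty} (\alpha_\mathrm{eff}\,\alpha_c)^n
                   \;=\; \frac{F^{\downarrow}_0}{1-\alpha_\mathrm{eff}\,\alpha_c},
    \]

where \(F^{\downarrow}_0\) is the irradiance that would reach the surface in the absence of these reflections and \(\alpha_\mathrm{eff}\) is a weighted average of the albedos of the surrounding surface types.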



Modeling snow and sea ice thermodynamics in the Baltic Sea and the Arctic Ocean

Bin Cheng, Finnish Institute of Marine Research

A high-resolution snow and sea ice thermodynamic model (HIGHTSI) has been developed at the Finnish Institute of Marine Research. The model is targeted at process studies, i.e. simulating the evolution of the snow/ice surface temperature, in-snow/ice temperature and snow/ice thickness. Special attention is paid to the parameterization of turbulent heat fluxes, with the atmospheric boundary layer (ABL) stratification taken into account. The penetration of global radiation through the surface is parameterized, making the model capable of calculating sub-surface melting quantitatively. The model has been validated against several in situ data sets and, in general, yields good results for the surface heat balance, snow/ice mass balance and snow/ice temperature regimes. It has been found that successful simulation of temperature regimes within snow/ice during a melt season with strong solar radiation requires a high vertical resolution. Recent developments of HIGHTSI have focused on processes during the melt season, i.e. superimposed ice formation (ice refrozen from melt water), snow/ice sub-surface melting and re-freezing, the effect of snow variability and the impact of surface albedo on the snow and ice mass balance. We validated the model against the SHEBA (Surface Heat Budget of the Arctic Ocean) and CHINARE (Chinese National Arctic Research Expedition) 2003 field measurements. We also carried out model sensitivity studies with respect to variations in model forcing, model spatial resolution, thermal properties of snow and ice, as well as the impact of the surface albedo on the snow and ice mass balance. The model has been used to produce operational forecasts of thermodynamic sea ice growth in the Baltic Sea. In this presentation, we will review various HIGHTSI modeling activities and introduce several ongoing activities in HIGHTSI development and application.


Modelling of the ice thickness distribution of the Arctic Ocean 

Jari Haapala , Finnish Institute of Marine Research
 
 
Global climate models predict a maximum warming in the high latitudes of the Northern Hemisphere. Polar amplification is mainly due to the positive feedback mechanisms related to the insulation and albedo effects of snow and sea ice. The true magnitude of the feedback effect of the ice/snow surface is still unknown because of the incomplete physical description of snow and ice physics and several other unresolved processes of the Arctic climate. A clear indicator of this is that the projected changes exhibit a large range of warming scenarios in the Arctic. In this talk, the effect of the physical description of sea ice on the modelled mean state of the sea-ice conditions is examined with a multi-category sea ice model under prescribed atmospheric and oceanic conditions. The model has global coverage; however, the focus is on the Arctic Ocean, which is made possible by the use of orthogonal curvilinear coordinates. The coordinate system is equivalent to that of MPI-OM1, where the poles are located over Canada and Western Siberia.
 
The evolution of the ice pack was simulated with a thermodynamics-only model (TDM), a free-drift model (FDM), a viscous-plastic model (VPM), a viscous-plastic model with an island at the North Pole (VP-NPM) and a multicategory model (MCM). In the multicategory model five undeformed and two deformed ice categories were used. All simulations began from the same initial conditions, and stationary conditions were obtained after ten years of integration. It has been found that the modelled annual maximum ice extent is rather insensitive to the ice dynamics or ice thickness distribution used. This indicates that the maximum annual ice extent is determined by the thermodynamical growth of new ice, and that correct modelling of the sea surface temperature is the most important factor. In addition to inaccuracies in the surface heat balance, an overestimation of the modelled ice extent in climate models can be caused by underestimation of the Atlantic heat transport or of vertical mixing. On the contrary, the minimum ice extent is sensitive to the modelled ice thickness, which in turn is highly dependent on the ice dynamics or thickness distribution used. Overestimation/underestimation of the dynamical growth of sea ice leads to overestimation/underestimation of the ice mass and to a situation where ice does not melt/survive during the summer season. This clearly shows that mass-momentum coupling is essential for sea ice and that only the plastic models are physically realistic in climate simulations. All other models generate highly unrealistic ice thicknesses, which are commonly hidden by introducing numerical diffusion or an artificial upper limit for the ice thickness. Sensitivity experiments show that the response of the sea-ice model to changes in the thermal forcing depends on the modelled mean state of the control climate and on the thickness distribution used. In particular, beyond a certain degree of warming, a two-level model may predict the total disappearance of sea ice in the Arctic, whereas a multicategory ice model may predict the existence of thick ridged ice even during the summer season.


AEROPOL – A SIMPLE GAUSSIAN URBAN SCALE AIR POLLUTION MODEL
Marko Kaasik, University of Tartu, Estonia


This paper presents AEROPOL, a simple Gaussian air pollution dispersion model that has been used in Estonia for more than a decade to assess the environmental impact of urban, road, industrial and agricultural emissions.

At the urban scale, simple steady-state Gaussian plume models still have advantages over more complicated and physically justified numerical models. First, in order to achieve considerably better performance, a detailed numerical model needs meteorological and emission data with much higher spatial resolution than is actually available. Second, a Gaussian model requires a few orders of magnitude less computing time for the same task than a complicated numerical one, assuming a large number of sources and output grid points. Thus, a Gaussian model is cost-efficient for urban applications.

The AEROPOL model (developed jointly by the University of Tartu and Hendrikson & Ko Environmental Consults) applies a steady-state Gaussian plume with Briggs' dispersion parameters (based on the Pasquill stability classification) for dispersion calculations; initial plume rise for hot emissions and dry and wet removal of admixtures are included. AEROPOL supports massive numbers of sources and output grid points, representing e.g. an urban area. The sources are classified into three types: (1) point sources, e.g. industrial stacks; (2) line sources, e.g. streets and roads; (3) area sources, e.g. locally heated dwelling areas. Meteorological data include wind direction and speed and parameters to estimate the atmospheric stability: either (1) the net radiation flux or (2) the cloud amount and solar elevation derived from date and hour.
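
For reference, the core formula behind such models is the classical reflected Gaussian plume; the sketch below shows it for a single point source (a minimal illustration, not the AEROPOL code, which additionally handles plume rise, deposition and the Briggs sigma curves):

    import numpy as np

    def gaussian_plume(q, u, y, z, h_eff, sigma_y, sigma_z):
        """Steady-state Gaussian plume concentration with ground reflection.

        q                : emission rate (g/s)
        u                : wind speed at release height (m/s)
        y, z             : crosswind and vertical receptor coordinates (m)
        h_eff            : effective release height = stack height + plume rise (m)
        sigma_y, sigma_z : dispersion parameters (m) at the receptor's downwind
                           distance, e.g. from the Briggs formulas for the Pasquill class
        """
        lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
        vertical = (np.exp(-(z - h_eff)**2 / (2.0 * sigma_z**2)) +
                    np.exp(-(z + h_eff)**2 / (2.0 * sigma_z**2)))   # image source = ground reflection
        return q * lateral * vertical / (2.0 * np.pi * u * sigma_y * sigma_z)

    # Example: ground-level concentration on the plume axis 1 km downwind of a 1 g/s source
    c = gaussian_plume(q=1.0, u=5.0, y=0.0, z=0.0, h_eff=50.0, sigma_y=80.0, sigma_z=40.0)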

Since 1996 AEROPOL has been used for numerous environmental impact assessments and expert studies for industrial enterprises (incl. a strategic assessment for the Narva Power Plants), animal farms, urban areas and road construction/reconstruction projects. With several minor changes in the code, it is suitable for offline coupling with an advanced mesoscale air pollution model, e.g. SILAM. The benefits of such a coupling are (1) sub-grid preprocessing of the urban emission distribution for SILAM and (2) the inclusion of a realistic regional background from SILAM in urban-scale runs by AEROPOL.


CTM based global aerosol modeling in the marine environment

Hannele Korhonen, University of Kuopio


The largest uncertainties in forward calculations of the currently observed warming of the Earth's climate are due to our limited knowledge of the radiative forcing associated with atmospheric aerosol particles. Fundamental gaps still remain in our understanding of the processes that control the present natural state of aerosols, and this severely limits our ability to predict how aerosols respond to future climate change and hence feed back to climate. The long time scales (several days to weeks) and large spatial scales (up to thousands of kilometres) involved mean that the driving processes that shape the aerosol physical and chemical properties at a certain location are very difficult to identify solely on the basis of measurement data from fixed sites. Fortunately, in the past few years advances in computing resources have made it possible to develop a new generation of global aerosol models, which for the first time offer a high enough size and chemistry resolution to study all climatically relevant aerosol properties in detail. The use of analysed wind and cloud fields (i.e. the best possible description of the actual weather conditions and transport routes at the time of the measurement) makes these models ideal for testing our understanding against specific observations.

We present one such new-generation global aerosol model, GLOMAP, which is being installed in Finland during this spring, and discuss its possibilities for future research. As an example, we show results from a recent study concerning the role of DMS emissions and other oceanic sources in the formation of cloud condensation nuclei (CCN) in the remote southern hemisphere marine boundary layer.



Revision of the soil carbon model of JSBACH using Yasso07

T. Thum1, S. Sevanto2, P. Räisänen1, M. Tuomi3,4, T. Vesala2, J. Liski3

1Climate Change Research, Finnish Meteorological Institute, Helsinki, Finland
2Department of Physics, University of Helsinki, Helsinki, Finland
3Finnish Environmental Institute, Helsinki, Finland
4Department of Mathematics and Statistics, University of Helsinki, Helsinki, Finland

The largest reservoir of terrestrial carbon is located in the soil. The increase in temperature caused by global climate change may release part of this carbon storage into the atmosphere, with severe consequences. Therefore, modeling the temperature sensitivity of soil carbon cycling is very important.

The aim of this study is to improve the description of the temperature sensitivity of soil carbon cycling in the JSBACH model, which is the biosphere sub-model of the global climate modeling system COSMOS developed by MPI-Hamburg.

To do this, we have implemented a new litter and soil carbon decomposition model, Yasso07 (www.environment.fi/syke/yasso), into the JSBACH carbon pool module. Yasso has been developed by Jari Liski and Mikko Tuomi at the Finnish Environmental Institute and has been parameterized against a large dataset with a wide geographical spread. In Yasso the soil carbon pools are also described in a more detailed manner than in the original soil carbon model of JSBACH.
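
Schematically, litter and soil carbon models of this family can be written as a set of first-order pools (a generic sketch; the actual pool structure and climate dependencies of Yasso07 are described at www.environment.fi/syke/yasso):

    \[
    \frac{d\mathbf{C}}{dt} \;=\; \mathbf{A}(T,P)\,\mathbf{C} \;+\; \mathbf{b}(t),
    \]

where \(\mathbf{C}\) is the vector of carbon pools, \(\mathbf{b}\) the litter input and \(\mathbf{A}\) a matrix of decomposition and transfer rates modulated by temperature \(T\) and precipitation \(P\); the temperature sensitivity discussed above enters through this climate dependence of \(\mathbf{A}\).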

In this presentation we show the first results of the comparison of soil carbon release predictions between Yasso and the original JSBACH soil carbon cycling model.


2. Observation usage

Devoted to observations and observing systems, data assimilation, verification and diagnosis of simulations, validation of models, and other kinds of data usage.




Validation of atmospheric simulations: some unconventional examples

Hannu Savijärvi, University of Helsinki

Some case studies are demonstrated where either unconventional observations
or unconventional simulation set-ups (or both) have contributed towards
increasing our understanding of small-scale atmospheric phenomena and their
physics. Examples run from Antarctica via the tropics to Finland, and
beyond to planet Mars.

Some views of verification and validation in NWP-models.

Kalle Eerola, Finnish Meteorological Institute

The verification of NWP products can be looked at from different directions. On the one hand, a duty forecaster is interested in how much he can trust the NWP products in his daily work. He may also want to know what the typical errors are. On the other hand, a modeller is more interested in the reasons for errors: if he can understand which processes cause the errors, he can possibly improve the model and obtain better forecasts.

The focus of the presentation is mainly on the statistical verification of operational Hirlam forecasts at FMI. Long-term time series of verification scores reveal a general improvement in the Hirlam system over the years. Sometimes even a change of the Hirlam version can be seen in the monthly verification scores.

The presentation also outlines the plan for developing a common verification system in Hirlam. The general purpose is to be able to compare different operational Hirlam implementations in a scientifically sound way in near real time. At the same time, the system should be suitable for use in development work.




Use of GPS phase delay observations in numerical weather prediction

Reima Eresmaa, Finnish Meteorological Institute

The potential of microwave phase delay observations, as processed from ground-based measurements of the Global Positioning System (GPS), is investigated in the framework of the High Resolution Limited Area Model (HIRLAM). Microwave phase delay is an integral measure of refractivity, which relates to atmospheric pressure, temperature and humidity. In the context of numerical weather prediction (NWP), the phase delay observations are interpreted as an additional source of humidity information. Since dense networks of continuously operating ground-based GPS receivers can be built at a relatively low cost, these phase delay observations are considered to constitute a promising observing system for convective-scale NWP systems in the future.
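
For reference, the zenith total delay derived from the GPS processing is, to a good approximation, the vertical integral of the microwave refractivity, for which a widely used (Smith-Weintraub type) two-term expression is

    \[
    \mathrm{ZTD} \;=\; 10^{-6}\int_{z_0}^{\infty} N\,dz,
    \qquad
    N \;\approx\; k_1\,\frac{p}{T} \;+\; k_3\,\frac{e}{T^{2}},
    \]

where \(p\) is pressure, \(T\) temperature, \(e\) water vapour pressure, and \(k_1 \approx 77.6\ \mathrm{K\,hPa^{-1}}\), \(k_3 \approx 3.73\times10^{5}\ \mathrm{K^2\,hPa^{-1}}\) are empirical constants; the humidity information exploited in the assimilation resides mainly in the second ("wet") term.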


Exploiting Doppler radar radial wind observations in NWP

Kirsti Salonen, Finnish Meteorological Institute

Mesoscale numerical weather prediction (NWP) models call for high
resolution observations. In many countries, such as Finland, the
weather radar network has good geographical coverage and radial wind
velocity measuring capability. Doppler radar wind measurements are
potentially an excellent source of wind observations for NWP models.

This presentation discusses the Doppler radar radial wind measurement
from the data assimilation point of view, and gives an overview of the
tools developed for radar radial wind observation processing,
modelling and monitoring in the High Resolution Limited Area Model
(HIRLAM) framework.
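
At the heart of the observation processing is a simple observation operator that projects the model wind onto the radar beam (hydrometeor fall speed and beam-broadening effects, which the full processing chain needs to consider, are omitted from this schematic form):

    \[
    v_r \;\approx\; u\,\sin\phi\,\cos\theta \;+\; v\,\cos\phi\,\cos\theta \;+\; w\,\sin\theta,
    \]

where \(\phi\) is the azimuth angle (clockwise from north), \(\theta\) the elevation angle of the beam and \((u,v,w)\) the model wind interpolated to the measurement location.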

3. Applications of models and model output

Devoted to modelling and simulation applied to studying, monitoring and forecasting the atmosphere and hydrosphere, climate studies, urban planning and other applications.



Probabilistic climate forecasts based on multi-model ensemble simulations

Jouni Räisänen, Department of Physics, University of Helsinki, jouni.raisanen@helsinki.fi

The global climate is changing. The climatic conditions that can be expected in the future cannot therefore be inferred from observations alone. In fact, even estimates of the prevailing present-day climate should be adjusted for ongoing climate changes, since statistics derived from past observations (e.g., during the current “normal period” 1971-2000) lag behind their time.

Estimates (“forecasts”) of future and present climate can be derived by combining past observations with model-based estimates of climate change. In doing this, it is important to take into account at least the most important uncertainties. For estimates of present-day and near-future climate, these include the internal (“random”) natural variability of climate and the differences between climate models in their response to increasing greenhouse gas concentrations.

In our research, we are using available multi-model ensembles of global climate change simulations to form estimates of the actual present-day climate and probabilistic forecasts of climate change in the near future. One interesting result from our studies is that the extremely mild winter months of December 2006 and March 2007 were, in fact, much less exceptional events for the present climate than would be inferred directly from past observations.







Meso-scale numerical weather prediction at FMI

Sami Niemelä, Finnish Meteorological Institute

Sea level forecasting

Antti Kangas, Laura Tuomi, Pekka Alenius, Hanna Boman, Finnish Institute of Marine Research

The storms in January 2005 and 2007, which caused exceptionally high sea levels on the Finnish coast, have put emphasis on further developing the sea level forecasting system at the Finnish Institute of Marine Research. For operational forecasting, three different 2D models are used. These models are run using forcing data from two different atmospheric models (by FMI and ECMWF) to reduce the uncertainties in the forecast. The water exchange between the North Sea and the Baltic Sea is taken into account by forcing the sea level at the Danish Straits to values forecast by larger-scale models available through BOOS co-operation. The nowcast sea level has an RMSE between 7 and 14 cm compared to the Finnish tide gauges; however, the error increases with longer forecast periods. When the level difference between the model and the measurements is corrected, the nowcast error is reduced to 4-11 cm. Due to the model errors and uncertainties in the forecast, the sea level forecasts have to be analysed and corrected by an expert before they can be publicly distributed.
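
The level-difference correction mentioned above amounts, in essence, to removing the mean offset between modelled and observed sea level over a recent common period before the forecast is issued; a minimal sketch (variable names hypothetical, not from the FIMR system) is:

    import numpy as np

    def bias_corrected_forecast(model_past, observed_past, model_forecast):
        """Shift a sea level forecast by the mean model-minus-observation offset
        over a recent common period (a simple constant-bias assumption)."""
        offset = np.nanmean(np.asarray(model_past) - np.asarray(observed_past))
        return np.asarray(model_forecast) - offset

    def rmse(forecast, observed):
        """Root-mean-square error against tide gauge observations."""
        d = np.asarray(forecast) - np.asarray(observed)
        return float(np.sqrt(np.nanmean(d**2)))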




High resolution wave modelling in the Baltic Sea

Laura Tuomi and Heidi Pettersson, Finnish Institute of Marine Research


The grid spacing of wave models, and of the atmospheric models that supply their forcing data, has decreased significantly in the past 10 years. This has greatly improved the accuracy of wave forecasts in the Baltic Sea. Typical resolutions for basin-scale modelling have until now been around 10 km, but as computing power is constantly increasing, it has now become possible to run wave models for limited areas with a resolution of hundreds of metres. These high-resolution implementations open possibilities to use third-generation wave models, such as the WAM model, for forecasting waves in archipelago areas and also in the design of new waterways and harbours. However, to obtain good results with these high-resolution applications, high-resolution ocean bottom topographies and high-resolution wind fields are needed. As high-resolution applications of third-generation wave models are rather new, a thorough validation of the capabilities of the model to predict the properties of the wave field is needed. When verifying high-resolution applications it is important to have datasets with high spatial and temporal coverage. This is usually not the case in the present operational ocean observing systems in the Baltic Sea. The wave buoys are located in open sea areas where there is no need for high-resolution applications, and altimeters tend to have too large footprints to give good data near the irregular coastlines of the northern Baltic Sea.

A measurement campaign carried out by the Finnish Institute of Marine Research, the University of Miami and the University of Uppsala near the island of Gotland in 2003 provides an excellent opportunity to validate high-resolution WAM model results. During this campaign, wave measurements were made over a two-month period with two directional wave buoys and an air-sea interaction spar buoy (ASIS). A high-resolution grid with ~300 m resolution was made for the WAM model for the Gotland area, and two-month simulations were run using FMI-HIRLAM fields and measured wind speeds from the Östergarnsholm weather mast as forcing data. The first results indicate that the high-resolution implementation of WAM reproduces significant wave heights, peak periods and peak wave directions reasonably well in most cases, and that the spatial features of the wave field around shallow areas are well described by the model. However, even though going to high resolution improves the wave model results considerably, it does not solve all problems.




Description of Watershed Simulation and Forecasting System of SYKE with focus on Estimation of Areal Values for Temperature and Precipitation
Antti Taskinen, Finland's environmental administration (SYKE)


The Watershed Simulation and Forecasting System (WSFS) of SYKE simulates the main components of the hydrological cycle using standard meteorological data. It covers the whole land area of Finland and the border watersheds shared with the neighbouring countries, consisting altogether of 74 main watersheds (600-60 000 km2) and over 5000 sub-basins (50-500 km2). The number of forecasted discharge and water level observation points is about 800. The operational use of the system includes weather and watershed data collection, the basic simulation run, updating of the model according to observations, runs with different regulation rules for regulated lakes, forecast runs with weather forecasts and statistics, and the delivery of forecasts.


Hydrological simulations in WSFS are performed by a conceptual model based on the HBV model, originally developed in Sweden in the 1970s. The basic model components are presented together with some examples of output variables and products. More focus is placed on temperature and precipitation, which are the main input variables of the system. To date, their areal values have been estimated using the inverse distance weighting method, but alternative methods have now been developed and tested. These methods, viz. a general linear regression model for temperature and Tweedie distributions with a generalized linear regression model for precipitation, are briefly described and their results are compared to those of the present method. In terms of the accuracy of the estimation, there are no major differences between the present and the new methods. However, the more solid physical basis of the new methods can be useful, particularly for research purposes.
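
For completeness, the inverse distance weighting used as the baseline interpolates the station network to a target point (e.g. a sub-basin centroid) roughly as follows (a generic formulation; the power and station selection actually used in WSFS are not quoted here):

    import numpy as np

    def idw(station_xy, station_values, target_xy, power=2.0):
        """Inverse distance weighted estimate of temperature or precipitation at one point.

        station_xy     : (n, 2) array of station coordinates
        station_values : (n,)   observed values at the stations
        target_xy      : (2,)   coordinates of the target point
        """
        d = np.linalg.norm(np.asarray(station_xy, dtype=float) - np.asarray(target_xy, dtype=float), axis=1)
        if np.any(d == 0.0):                       # target coincides with a station
            return float(np.asarray(station_values)[d == 0.0][0])
        w = 1.0 / d**power                         # closer stations get larger weights
        return float(np.sum(w * np.asarray(station_values)) / np.sum(w))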



Model studies on nitrogen and sulphur deposition and concentrations 2000-2007 in Scandinavia.

Marke Hongisto, Finnish Meteorological Institute

Deposition and concentration patterns calculated with the nested Hilatar air pollution model system of the FMI over the period 2000-2007 are presented. Hilatar is an Eulerian grid-point model in which the time change of sulphur and nitrogen pollutant concentrations in each grid cell is calculated by numerically solving the transport equation, which contains terms for emissions, advection, vertical turbulence, chemical transformation to other compounds, and sink terms for dry deposition as well as scavenging by rain. The models cover Europe and the Baltic Sea and its surroundings with varying grid resolution (0.5° – 0.08°). Long-range transport of compounds into the high-resolution Baltic Sea sub-model domain is evaluated with the European Hilatar and added to the air flowing in at the model boundaries. The model is accompanied by a meteorological database which includes preprocessed time series of three-dimensional six-hour HIRLAM forecasts over the period 1993-2007. The high-resolution model uses the Finnish stack and area source emission inventory, AIS-based ship emissions for sulphur and nitrogen over the Baltic Sea, and EMEP database emissions in other areas. The model has been validated by model-measurement intercomparisons using monthly, weekly or daily measurement data (concentrations in air and precipitation) from over 90 European EMEP stations (www.emep.int), using measurements from field campaigns (including ship measurements over the Baltic Sea), and by model-model intercomparison.
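
Written out, the transport equation solved for each pollutant concentration \(c\) contains exactly the terms listed above (schematic form):

    \[
    \frac{\partial c}{\partial t}
    \;=\; -\nabla\!\cdot\!(\mathbf{u}\,c)
    \;+\; \frac{\partial}{\partial z}\!\left(K_z\,\frac{\partial c}{\partial z}\right)
    \;+\; E \;+\; R_\mathrm{chem} \;-\; \lambda_\mathrm{dry}\,c \;-\; \lambda_\mathrm{wet}\,c,
    \]

where \(\mathbf{u}\) is the wind, \(K_z\) the vertical turbulent exchange coefficient, \(E\) the emission source, \(R_\mathrm{chem}\) the chemical production and loss, and \(\lambda_\mathrm{dry}\), \(\lambda_\mathrm{wet}\) the dry deposition and rain scavenging rates.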


The model structure and the development of its parameterization schemes are briefly reviewed. Model applications in estimating background-area pollutant concentrations and depositions since 2000, comparison with their estimates in the 1990s, correlation with forest damage in Finland, trends in the airborne load to the Baltic Sea, as well as the impact of individual weather situations, the annual/seasonal variation of meteorological fields and emission development on the airborne pollution loads are discussed.





An overview of the air quality and emergency modelling system SILAM

Sofiev, M., Siljamo, P., Valkama, I., Soares, J., Prank, M., Vira, J., Lanne, M.

The current presentation briefly outlines the main features and applications of the SILAM modelling system and sets the context for the follow-up talks on specific features of the model.

The SILAM modelling system was developed as a flexible framework supporting a wide variety of atmospheric composition and pollution dispersion problems. Following the standards of such types of models, SILAM incorporates an extensive meteorological pre-processor, a dynamic core currently including both Eulerian and Lagrangian advection-diffusion formalisms, and eight physico-chemical modules that cover basic SOx-NOx-NHx-O3 chemistry, linearised SOx transformations, radioactive decay of 500 nuclides, several types of size-segregated aerosols (such as sea salt, natural pollen, anthropogenic primary aerosols and toxic persistent pollutants) and a passive tracer used for probabilistic computations. The emission composer covers point, area, and nuclear-bomb types of sources. A stand-alone Fire Assimilation System handles the information on wild-land fires observed by satellites. Finally, a set of interfaces and IO-coders links these parts of the system together and allows communication with the outside world.

With the above features, SILAM serves as both an operational and a research tool. Operationally, it is used as one of the cornerstones of Finnish emergency preparedness activities, as well as for air quality forecasting over Europe and Finland. Research studies concentrate more on the development of the model's chemical and physical modules, data assimilation, model intercomparison, and re-analysis of past episodes and trends.


EMERGENCY PREPAREDNESS AND DECISION SUPPORT IN RADIOACTIVE AND CHEMICAL ACCIDENTS

Ilkka Valkama, Finnish Meteorological Institute

In the present world every country must prepare itself against environmental emergencies, catastrophes and accidents, as well as terrorist actions. Most countries have adopted modeling systems to forecast the consequences of atmospheric dispersion of airborne radioactive and chemical materials at various scales. The long-range transport and dispersion (LRTD) models are used to forecast the dispersion of large emissions of harmful pollutants from point sources, e.g. nuclear power plants (NPP), nuclear detonations (“bombs”) or Radiological Dispersal Devices (RDD, also known as “dirty bombs”).

The focus of this paper is on the Finnish Meteorological Institute’s (FMI) role in preparedness for, prevention of and response to threats to the Finnish population from accidental releases of Nuclear, Biological and Chemical agents (NBC agents) as well as from terrorist actions. The State rescue authorities in Finland are the Ministry of the Interior and the State Provincial Offices. The Ministry of the Interior’s Department for Rescue Services directs and oversees regional rescue departments, is in charge of emergency preparedness on the national level and coordinates the activities of the national authorities and bodies under various ministries. The roles and duties of the various authorities are listed in the Rescue Act (468/2003) and their respective roles in the Rescue Statute (787/2003).

The FMI duty forecasters provide the rescue services with current meteorological data and weather forecasts during any NBC-emergency. In case of accidents, e.g. warehouse fires, where chemical agents are released, the local rescue services (fire brigades etc.) have at their disposal a Gaussian dispersion model ESCAPE, developed by FMI. In addition FMI’s experts can provide estimates of dispersion of toxic chemicals using more sophisticated dispersion models (e.g. FMI-BUOYANT). During radiological incidents and accidents FMI operationally predicts the dispersal of radioactive materials with an advanced dispersion and dose model SILAM. These products are made available to STUK and other authorities via a dedicated web-site.

The radiation situation in Finland is monitored in real time by the Radiation and Nuclear Safety Authority (STUK). The aim is to identify radiation hazards quickly and to take effective measures to protect the population and to mitigate the health effects of radiation. In addition to STUK, the Finnish Meteorological Institute, the Finnish Defense Forces and the Ministry of the Interior participate in radiation monitoring.

Inter-organizational exercises are held annually at both the national and international level, to maintain the efficiency of the rescue forces and other authorities. On the international scene, both STUK and FMI are closely connected to the respective organizations in other countries via the International Atomic Energy Agency (IAEA) and the World Meteorological Organization (WMO). Under both organizations there exist several systems to facilitate the real-time, fast and reliable exchange of information both during “normal” times and in emergencies.

On the radiological side, IAEA runs a Radioactive Environmental Monitoring (REM) network to provide qualified information to the European Commission, the European Parliament and the Member States on the levels of radioactive contamination of the various compartments of the environment (air, water, foodstuff). Routine measurements are managed in the REM database, which contains qualified environmental radioactivity data from all EU Member States. In the case of a nuclear or radiological emergency, REM provides support for the exchange of essential data and information:

- Messages notifying that an accident has happened, as well as all subsequent information about the current status of the accident and its consequences, which are sent through the official EC emergency network ECURIE (European Community Urgent Radiological Information Exchange).

- Real-time monitoring information collected from national automatic surveillance systems by the EURDEP (European Radiological Data Exchange Platform). EURDEP makes radiological monitoring data (daily gamma dose-rates) from 29 European countries available in nearly real-time. Countries sending their national data have access to the data of all the other participating countries. The data delivery will continue during an emergency but with a higher data transmission frequency.

- Atmospheric dispersion forecast model results, which are exchanged and intercompared within EU-ENSEMBLE.

On the meteorological side WMO maintains a system of eight Regional Specialized Meteorological Centres (RSMCs) which are prepared at all times to provide computer-based predictions of the long-range movement of air-borne radioactivity (Beijing, Exeter, Melbourne, Montreal, Obninsk, Tokyo, Toulouse and Washington). The WMO system is linked to the notification and real-time information system of the Incident and Emergency Centre of the International Atomic Energy Agency (IAEA). When requested, the centres will provide the specialized products within three hours to National Meteorological Centres and the IAEA.

EU-ENSEMBLE (A System to Reconcile Disparate National Forecasts of Medium and Long-range Atmospheric Dispersion) is an approach to the treatment and analysis of long-range transport and dispersion model forecasts. The method is called multi-model ensemble dispersion and is based on the simultaneous analysis of several model simulations by means of ad hoc statistical treatments and parameters. The ensemble dispersion approach and indicators provide a way to reduce several model results to a few concise representations that include an estimate of the models’ agreement in predicting a specific scenario. EU-ENSEMBLE is an internet-based server-side system that collects in real time the long-range dispersion forecasts produced by some 40 models run by 23 institutes from 20 countries (16 European plus the USA, Canada, Japan and Korea). The forecasts are produced by operational long-range transport and dispersion models based on different concepts, using meteorological fields produced by different NWP models (ECMWF, various versions of HIRLAM, ARPEGE, ALADIN, RAMS, GME, UM and LM). Over 20 international exercises have been arranged under EU-ENSEMBLE during 2001-2007. FMI is an active member of the community.

The Nordic countries (plus Canada) support a separate MetNet (Nordic Network of Meteorological Services Engaged in Nuclear Emergency Preparedness) for the fast exchange of dispersion information during accidents. The system was established under the Nordic Nuclear Safety Research Program (NKS) in 2003. Under MetNet there have been two emergency modelling exercises per year during the 2003-2006 project period. About half of these exercises have been co-ordinated with major national or international exercises with participation by national radiation protection authorities within the Nordic countries (SSI in Sweden and STUK in Finland). MetNet will move under the umbrella of the Nordic Co-operation in Meteorology (NORDMET) from 2008. FMI is an active member of the community.

There are also two dedicated decision support systems (DSS) used by the Nordic countries: RODOS and ARGOS.

RODOS (Real-time On-line Decision Support system for off-site emergency management) provides consistent and comprehensive information on the present and future radiological situation, the extent and the benefits and drawbacks of emergency actions and countermeasures, and methodological support for taking decisions on emergency response strategies. RODOS includes detailed dose assessment modules that utilise the results of the atmospheric dispersion module. The system is in operational use in 14 EU countries. The only Nordic user is Finland (STUK).

ARGOS (Accident Reporting and Guidance Operational System) is a commercial DSS capable of requesting long-range simulations that are automatically carried out at remotely located collaborating national meteorological services via the internet. Upon completion of the model runs, ARGOS receives the simulation results, displaying them by means of sophisticated graphics and performing certain numerical analyses, e.g. dose calculations, based on the model results. The members of the ARGOS Consortium are mainly national organizations responsible for emergency management. There is no actual licence fee for the system, but rather an annual fee, set relative to the GDP of each country, to cover the development and maintenance of the system. ARGOS is currently used in 11 countries: Australia, Brazil, Canada, Denmark, Estonia, Ireland, Latvia, Lithuania, Norway, Poland and Sweden. ARGOS is not used in Finland.





ANALYSIS AND FORECASTS OF THE BIRCH POLLEN SEASON IN EUROPE USING ATMOSPHERIC AND BIOLOGICAL MODELS

Pilvi Siljamo¹*, Mikhail Sofiev¹, Hanna Ranta², Tapio Linkosalo³

1 Finnish Meteorological Institute
2 University of Turku
3 University of Helsinki


The paper presents a forecasting system for the long-range transport of birch pollen. The system development was funded by the Academy of Finland and performed at the Finnish Meteorological Institute (FMI) together with the Aerobiology Unit of the University of Turku and the Department of Forest Ecology of the University of Helsinki, in close collaboration with the European Aeroallergen Network and five other European institutes.

Pollen is a known source of allergy-related diseases. The overall prevalence of seasonal allergic rhinitis in Europe is approximately 15%. Observational evidence and theoretical grounds are mounting that the pollen grains of wind-pollinated plants, despite their large size, can be transported over hundreds and even thousands of kilometres and significantly affect pollen concentrations in many regions, making them less dependent on local conditions.

Conventional predictions of pollen concentrations are made using phenological and pollen observations, pollen calendars and weather forecasts. This method works well once local flowering has started, but it is not able to forecast long-range transported pollen before or after local pollination. However, allergic persons should start their medication in advance of exposure to allergens, and this exposure can occur even weeks before the start of local flowering. Because pollen does not respect territorial borders, Europe-wide numerical pollen concentration forecasts are needed.

The pollen forecasting system consists of several sub-models. The system is based on a numerical weather prediction model (HIRLAM or ECMWF), which provides input to an atmospheric dispersion model (SILAM), to a phenological model (of thermal time type) for the starting date of flowering, and to a pollen release model.
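
As an illustration of the thermal-time idea (a generic degree-day sketch; the base temperature, threshold and starting date of the actual phenological model are not quoted here), flowering is declared to start once the accumulated heat sum exceeds a species-specific threshold:

    def flowering_start_day(daily_mean_temp, t_base=5.0, threshold=100.0, start_day=0):
        """Return the index of the day on which flowering starts, or None if not reached.

        daily_mean_temp : sequence of daily mean temperatures (deg C)
        t_base          : base temperature below which no heat accumulates (assumed value)
        threshold       : degree-day sum triggering flowering (assumed value)
        start_day       : index of the day from which accumulation begins
        """
        heat_sum = 0.0
        for day, t in enumerate(daily_mean_temp):
            if day < start_day:
                continue
            heat_sum += max(t - t_base, 0.0)     # accumulate only the excess over the base temperature
            if heat_sum >= threshold:
                return day
        return None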

Numerical forecasts of springtime birch pollen concentrations have been made at FMI since 2005, and the model has been developed throughout these years. User experience at the Aerobiology Unit of the University of Turku is positive, and the model has improved pollen forecasts, especially in cases of long-range transport.

The current status of the system and results from the past few years will be presented, and their main features and quality will be discussed.


Analysis of the observational campaigns by means of inverse modeling

M.Prank, M.Kaasik, O.Tarasova, M.Sofiev


The current paper presents the results of inferring the origin of peculiar observations of atmospheric aerosols or gases measured during different measurement campaigns. It illustrates the inverse (adjoint) mode simulations made with SILAM.

Two examples will be presented.

The first field experiment was carried out at the SMEAR I site (Station for Measuring Forest Ecosystem – Atmosphere Relation, 67°46'N, 29°35'E), located in the Värriö nature park in eastern Lapland, less than 10 km from the Russian border (Ruuskanen et al., 2007). The campaign included measurements of aerosol particle size distributions with an EAS (electric aerosol spectrometer, Tammet et al., 2002) from April 28 to May 11, 2003.

Inverse model computations were performed in order to clarify the probable source area for the high concentration peaks and to study the footprints of nucleation events. As an unexpected result, an error was found in the EMEP database, where the location of a large emission source, the town of Nikel, where a large metallurgy plant is situated, was misplaced by about 100-150 km eastwards, into the Murmansk area.

For a set of nucleation events observed during the campaign, inverse simulations revealed the origins of the aerosol pre-cursors, gave a clue on their chemical composition, and provided an estimate of temporal and spatial scales of the nucleation events.

In the other measurement campaign, the concentration and the radiocarbon (14C) and stable isotope (13C and 18O) content of CO were determined in air samples collected across Russia (about 8,500 km) and along the Ob river during the summer of 1999 to study CO sources and sinks. SILAM was used to delineate the source areas of all samples used in the isotope analysis. Particular attention was paid to two samples with elevated 14CO concentrations. The modeling results show that the enrichment in 14CO in those two samples may be related to the nuclear industry. The area with a high probability of including the source of the air mass enriched with 14C is quite limited in space and covers practically only one major industrial area with radioactivity-related processes: the Tomsk-7 factory, located a few tens of kilometres north-west of the city of Tomsk, where there was a small-scale release of volatile radioactive nuclides a few weeks before the campaign. Most probably, the trailing part of that release is the very reason for the observed rises in the 14C concentrations.


On impact of wild-land fires on atmospheric composition

Mikhail Sofiev1, Roman Vankevich2, Milla Lanne1, Valeriy Petuckhov2

1 Finnish Meteorological Institute, Helsinki, Finland, FI-00101

2 Russian State Hydrometeorological University, St.Petersburg, Russia


This paper considers the impact of wild-land fires on atmospheric composition in Europe as seen with two versions of the Fire Assimilation System (FAS). FAS is jointly developed by the Finnish Meteorological Institute and the Russian State Hydrometeorological University. The system versions are based on (partly) independent satellite products from the MODIS instrument: the Temperature Anomalies (TA) of the Rapid Response system (hot-spot counts) and the Fire Radiative Power (FRP). The observed quantities – the pixel absolute temperature and radiative emissivity – are converted to emission fluxes via empirical emission factors. Both versions of FAS are integrated with the Air Quality and Emergency Modelling System SILAM, which uses the estimated emissions for the atmospheric composition simulations, merging them with the anthropogenic and natural emission fluxes. Using the SILAM simulations of selected episodes and MODIS aerosol optical density observations for comparison, the emission factors available in the literature have been recalibrated.

Several episodes have been analyzed, and the impact of the fires has been compared with the contribution from anthropogenic sources. Furthermore, the model predictions have been compared with available information from ground-based monitoring sites and satellite retrievals.

The results of the operational air quality forecasts with the integrated system are available from http://silam.fmi.fi.


Chemical weather forecasting: acidification, ozone and aerosols

Soares, J., Sofiev, M.

Tropospheric aerosols and ozone are the key pollutants of greatest concern for public health. Ozone is involved in a photochemical cycle governed by NOx and VOC, as well as by strong atmospheric oxidants, such as OH and HO2. Aerosols consist of a large variety of chemically active and inert species, being partly produced by the main photochemical transformations, formed by aerosol-dynamics processes from organic precursors, and emitted from anthropogenic and natural sources. Persistent toxic species are characterized by long environmental lifetimes and, in many cases, the capability to cycle between the compartments (air, water, soil, biota).

This paper presents the chemical modules of SILAM, their relation to existing chemical transport models and their verification to date against observed concentrations in Europe and Finland.

The current SILAM basic-chemistry scheme covers 21 transported and 5 short-lived substances, which are inter-related via ~60 chemical reactions. In comparison with the other existing schemes, this mechanism covers the most important transformation chains for photochemical smog formation (except for some of the VOC oxidation pathways, which are parameterized in a simplified manner) and takes into account the main reservoir substances. Apart from the main list of anthropogenic pollutants, SILAM also considers the production and loss of natural aerosols, such as sea salt.

To reach a sufficient efficiency of the scheme, all reactions are segregated in accordance with their actual rates (in relation to the model time step), split into day- and night-time processes and treated via forced-equilibrium, first-order explicit or third-order implicit numerical algorithms. Apart from chemistry, modeling ozone brings other challenges to the model: emission factors for different species, and boundary conditions for long-lived substances such as ozone (in a clean environment), carbon monoxide and some toxic pollutants.

The scheme development and evaluation are based on the reference year 2000, for which preliminary results of the model applications and a comparison with European observations will be shown.


Data assimilation for air quality analysis and forecasting

M.Sofiev, J.Vira, P.Siljamo, I.Valkama, K.Riikonen

In numerical weather prediction systems, observations are routinely utilized to initialize the model in a numerical process known as data assimilation. This presentation discusses the possibilities of applying the same computational methods to various air quality and dispersion problems.

While the methods used in weather forecasting are, in general, readily applicable, some features specific to air quality problems require changes to the classical formulations. In AQ forecast problems, one of the key issues is extending the data assimilation to the estimation of emission sources. In particular, we consider the four-dimensional variational assimilation method (4D-VAR), which is currently being implemented in the SILAM model.
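
For reference, the standard 4D-VAR cost function underlying this work penalises the misfit to both the background state and the observations distributed over the assimilation window; in the emission-estimation setting the control vector is extended to include the source term:

    \[
    J(\mathbf{x}_0) \;=\; \tfrac{1}{2}\,(\mathbf{x}_0-\mathbf{x}_b)^{\mathrm T}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
    \;+\; \tfrac{1}{2}\sum_{i=0}^{n}\bigl(H_i(\mathbf{x}_i)-\mathbf{y}_i\bigr)^{\mathrm T}\mathbf{R}_i^{-1}\bigl(H_i(\mathbf{x}_i)-\mathbf{y}_i\bigr),
    \qquad \mathbf{x}_i = M_{0\to i}(\mathbf{x}_0),
    \]

where \(M\) is the dispersion model, \(H_i\) the observation operators, \(\mathbf{y}_i\) the observations, and \(\mathbf{B}\), \(\mathbf{R}_i\) the background and observation error covariances; the gradient of \(J\) is obtained with the adjoint model.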

In order to demonstrate the usefulness, and possibly the limits, of data assimilation in dispersion problems, some previous case studies on the transport of air pollutants and pollen are presented.

For the first trial application, the dataset from the first European Tracer Experiment (ETEX) has been taken. It consists of 3-hour mean observations from ~150 stations spread over Europe, which followed the 3-day dispersion of a tracer released in France during 23-24 October 1994.

The second trial application of 4D-VAR for source apportionment constituted an inverse-problem study of the Chernobyl accident. Contrary to the ETEX-1 dataset, in 1986 the observations had daily time resolution or worse, which reduced the quality of the information. The low density of the stations created extra challenges for the source-apportionment studies. As a result, the convergence of the data assimilation iterations for the Chernobyl source was not guaranteed. However, a solution was still possible provided that the full set of observations was utilised.

More examples of simplified data assimilation for the source apportionment of airborne pollutants, such as pollen and benzene, will be presented.


Construction and evaluation of a new Eulerian dynamic core for the emergency and air quality modelling system SILAM

Sofiev, M. (1), Galperin, M. (2), Genikhovich, E. (3)

(1) Finnish Meteorological Institute, Air Quality Research, Finland, mikhail.sofiev@fmi.fi

(2) Independent researcher, Russia, halperin@rdm.ru

(3) Main Geophysical Observatory, Russia, ego@mailbox.alcor.ru

The paper presents a new dynamic core of the SILAM modelling system and compares its performance with the Lagrangian Monte-Carlo random-walk particle model. The objective of the study was to develop a dynamic core capable of handling the strongly non-linear processes of atmospheric chemistry and aerosol dynamics without compromising the accuracy and efficiency of emergency simulations, which deal with strong point sources and are therefore often regarded as a no-go area for Eulerian systems. The new SILAM core is based on an original Eulerian advection algorithm combined with an extended resistance scheme for vertical diffusion. Apart from the standard advantages of an Eulerian environment, the new system has several unique features that fully meet the above requirements: (i) the advection has exactly zero numerical viscosity and can utilise sub-grid information on the location of the centre of mass inside a grid cell; (ii) it is robust to sharp concentration gradients and does not dilute them during transport; (iii) it can operate at high Courant numbers (tests up to C=10 will be presented); (iv) horizontal diffusion can be prescribed or evaluated dynamically during the run; (v) the vertical diffusion scheme operates with thick layers, providing exact flux values at the layer boundaries and utilising the sub-grid information supplied by the advection.

A comparison with the currently operational Lagrangian advection for real cases showed generally similar patterns of concentrations and depositions, but also revealed substantial differences in the detailed features of the vertical and, to a lesser extent, horizontal mass distribution. The former is attributed to the more accurate computation of the vertical profile of the diffusion coefficient and to the ability of the dynamic core to utilise this information.
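
A highly simplified one-dimensional sketch of the centre-of-mass bookkeeping behind feature (i), written in Python with assumed names and a constant wind (it is not the actual SILAM algorithm): every grid cell carries both its tracer mass and the position of that mass inside the cell, and the mass is moved with the wind as a whole, so a sharp puff is never smeared and Courant numbers above one pose no problem.

import numpy as np

def advect_1d(mass, centre, u, dt, dx):
    # mass[i]: tracer mass in cell i; centre[i]: centre-of-mass position in [0, 1) cell units
    # u: wind speed [m/s], dt: time step [s], dx: grid spacing [m]; periodic domain
    n = mass.size
    new_mass = np.zeros(n)
    new_moment = np.zeros(n)                      # mass-weighted sub-grid positions
    for i in range(n):
        if mass[i] <= 0.0:
            continue
        pos = i + centre[i] + u * dt / dx         # exact new position, in cell units
        j = int(np.floor(pos)) % n                # destination cell
        new_mass[j] += mass[i]
        new_moment[j] += mass[i] * (pos - np.floor(pos))
    # parcels landing in the same cell are merged by mass-weighting their positions
    new_centre = np.where(new_mass > 0.0,
                          new_moment / np.maximum(new_mass, 1.0e-30), 0.5)
    return new_mass, new_centre

nx, dx, dt, u = 50, 1000.0, 500.0, 5.0            # Courant number C = u*dt/dx = 2.5
mass = np.zeros(nx); mass[10] = 1.0               # one sharp puff
centre = np.full(nx, 0.5)
for _ in range(8):
    mass, centre = advect_1d(mass, centre, u, dt, dx)
print(mass.sum(), int(np.argmax(mass)))           # mass conserved; puff still confined to one cell

After eight steps at C = 2.5 the puff has moved 20 cells downwind and still occupies a single grid cell, which is the behaviour the abstract refers to as zero numerical viscosity; in the full scheme the same sub-grid positions are also passed on to the vertical diffusion.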

4. Computing

Devoted to high-performance computing, data management, and technological issues of modelling in general

Operational numerical weather prediction at FMI
Markku Kangas, Finnish Meteorological Institute

Numerical weather forecast production is a complicated system: it consists not only of the actual forecast model, but also needs subsystems to manage and utilize observations, as well as methods to control and monitor the whole process. In this presentation, the operational setup and procedures for running the limited-area weather model HIRLAM at FMI are briefly described.

Climate modelling services at CSC
Tommi Bergman, The Finnish IT center for science, CSC

In 2007 the Finnish modelling community gained more computational power when the procurement of the new Cray supercomputer and HP supercluster was completed. The presentation will provide an overview of CSC's services for the modelling community. A few ongoing and completed climate modelling projects are also presented.

Large node supercomputers solving the most demanding compute and data-intensive problems

Seppo Jarimo, Atea Finland Oy
   
Modern high-performance computing has evolved towards larger and larger clusters consisting of thousands of CPU cores. The initial hardware cost of a cluster may be tempting compared to large-node supercomputers, but the labor cost of getting a traditional cluster installed and the applications tuned may well exceed the hardware cost.

SGI Altix large-node supercomputers support all the main application programming models:
- traditional symmetric shared-memory multiprocessing with compilers supporting parallel processing
- OpenMP
- MPI
All these programming models benefit from the low-latency global shared memory, which scales to several terabytes per node. The intelligent blade architecture supports a large number of I/O blades per node, thus providing excellent I/O scalability.

This presentation will explain the SGI Altix supercomputer architecture at a fairly high level and also give insight into the major current installations in Finland.




5. Visions and opportunities

Devoted to funding mechanisms, emerging research programmes and visions for future co-operation involving the modelling community


Elements of the GMES Marine Core Service in the Baltic

Tapani Stipa, Finnish Institute of Marine Research

The GMES Marine Core Service (MCS) is approaching its implementation phase. The presentation will provide an overview of the MCS and its implementation as presently planned. The challenges that lie ahead are addressed as well.


Visions of the Future of Modelling in Finland

David M. Schultz, Division of Atmospheric Sciences and Geophysics, Department of Physics, University of Helsinki, and Finnish Meteorological Institute

The purpose of this talk is to discuss some possible visions for the future of environmental modelling in Finland. The talk focuses on three issues: the formation of a consortium for modelling, the formation of a national environmental data center, and the development of diagnostic tools. As has happened in other locations (e.g., Seattle), FMI should lead the formation of a consortium for environmental modelling. The consortium would share resources and fund basic research on the effects of domain size, model resolution and boundary conditions on predictability, in order to produce better forecasts of environmental conditions. The involvement of commercial companies and the development of customer-specific applications would be part of such a consortium. Second, a national data center for environmental data should be created. One component of such a center would be to grow the Helsinki Testbed from a local resource into a national or Nordic resource. Such data could be used to initialize and verify environmental models in northern Europe, as well as to create opportunities for commercialization. Third, to support interdisciplinary research, collaboration between research and operations, and the transition of research products into operations, diagnostic tools need to be developed. These tools should be shared among many groups, yet remain flexible. I present several such tools that already exist elsewhere and could easily be adapted for use in Finland.


Role of CSC in environmental modelling
Pirjo-Leena Forsström, The Finnish IT center for science, CSC

The availability of computing resources is increasingly becoming a critical factor for success in science. As a response, national and transnational e-Infrastructures in Europe are evolving. A European model of a sustainable high-performance ecosystem consists of a small number of supercomputer centres offering computing services at the highest performance level; national and regional centres with supercomputers offering the performance to run most of the advanced computing; and local computing centres in universities, research labs and other organizations strengthening software development and researchers' competence in computational science.

The transnational infrastructure is being developed via EU-level roadmaps (the e-Infrastructures Roadmap by e-IRG) and projects such as PRACE. The national infrastructure ecosystem for hardware, software, services and support needs to be defined so as to serve all researchers involved in environmental modelling. This includes issues such as computers, data storage and long-term archiving, networks, grids and middleware, services, operation and support. Environmental modelling needs several of the above, and strong national cross-disciplinary collaboration is needed to build the infrastructure.
