Grid Technology for Geosciences - Scenarios


EarthByte 4D Data Portal general scenario

EarthByte will allow users to seamlessly connect geological and geophysical observations, coded by tectonic plate and geological time, to simulation software and visualisation/multimedia tools.

Rough Scenario:
  1. Access site/data on APAC Grid
  2. Download from site
  3. Manage and visualise
  4. Use as input/boundary constraints for modelling run - modelling run performed on the Computational Services described above
  5. Get results and use subset of data to validate model.
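The five steps above can be sketched as a small pipeline. All function names and the in-memory "dataset" are hypothetical placeholders; a real implementation would use Grid middleware (e.g. Globus transfers) and actual EarthByte holdings.

```python
def access_dataset(name):
    # Steps 1-2: locate and download a dataset from the APAC Grid
    # (stubbed here as a small in-memory grid of values).
    return {"name": name, "values": [0.1, 0.4, 0.9, 0.3]}

def summarise(dataset):
    # Step 3: "visualise" stands in for plotting; here we just summarise.
    vals = dataset["values"]
    return {"min": min(vals), "max": max(vals)}

def run_model(dataset):
    # Step 4: use the data as input/boundary constraints for a toy
    # modelling run (doubling stands in for the forward model).
    return [v * 2.0 for v in dataset["values"]]

def validate(results, dataset, tol=1e-9):
    # Step 5: validate the model against a held-back subset of the data.
    subset = dataset["values"][:2]
    return all(abs(r - 2.0 * v) < tol for r, v in zip(results, subset))

data = access_dataset("agegrid_demo")
stats = summarise(data)
ok = validate(run_model(data), data)
```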

Visualizing and interpreting mantle convection model outputs with plate kinematic data

  1. Access time-sequence of global mantle convection model outputs on EarthByte server (GMT netcdf files). Outputs could include mantle density anomaly at a given depth or predicted dynamic topography. (I have a set of preliminary images along these lines from Bernhard Steinberger's analytical flow models). Together they are several gigabytes in size.
  2. Extract selected global GPML data from the EarthByte server, coded by tectonic plate and geological age (e.g. outlines of continents, sedimentary basins, cratons, foldbelts, and igneous provinces)
  3. Visualise the mantle density anomaly in the upper mantle and/or dynamic topography in a paleogeographic context. Investigate adjacency associations between:
      • igneous provinces and mantle upwellings/plumes through time
      • negative dynamic topography and sedimentary basins through time
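An adjacency test of this kind reduces to asking which reconstructed surface features lie within some angular distance of an upwelling. The coordinates and threshold below are illustrative assumptions; real inputs would come from the GMT netCDF grids and GPML reconstructions described above.

```python
import math

def gc_distance_deg(p, q):
    # Great-circle distance in degrees between two (lon, lat) points.
    lon1, lat1, lon2, lat2 = map(math.radians, (*p, *q))
    c = (math.sin(lat1) * math.sin(lat2)
         + math.cos(lat1) * math.cos(lat2) * math.cos(lon2 - lon1))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

def adjacent(features, upwellings, threshold_deg=5.0):
    # Return names of features lying within the threshold of any upwelling.
    return [name for name, pos in features
            if any(gc_distance_deg(pos, u) <= threshold_deg
                   for u in upwellings)]

# Hypothetical (lon, lat) positions for two igneous provinces and one plume.
provinces = [("Kerguelen", (69.0, -49.0)), ("Deccan", (73.0, 19.0))]
plumes = [(70.0, -50.0)]
hits = adjacent(provinces, plumes)
```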
-- DietmarMuller - 19 Oct 2004

Interactive inversion using Nimrod/oi

Nimrod/oi is a web tool which allows a user to interact with computational forward models to steer an inversion using a mixture of visual feedback and quantitative fit. Boschetti and Moresi demonstrated that this approach can be used successfully by geological experts to run small-dimensional inversions without having to learn to operate complicated codes. The user simply has to pick preferred models from the 10-20 models presented to him/her at each iteration cycle.

The computational challenge is to provide sufficient resources that 10-20 meaningfully resolved calculations with rich physics can be run within the attention span of a typical geoscientist (for argument's sake: sitting in front of a screen for a few minutes, coming back after a 20-minute coffee break, or ranking runs each morning and evening at 12-hour intervals). Nimrod/Oi is based upon the Nimrod portal, which knows how to distribute jobs across the computational grid to maximize the resources available to the inversion.
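One steering cycle can be sketched as follows: a batch of candidate models is ranked by a quantitative misfit, and the user's picks (simulated here) take priority in the shortlist that seeds the next batch. The one-parameter forward model and all names are illustrative, not the actual Nimrod/Oi API.

```python
def misfit(params, observed=3.0):
    # Quantitative fit: squared error of a toy one-parameter forward model.
    return (params - observed) ** 2

def steering_cycle(candidates, user_picks, n_keep=3):
    # Honour user-selected models first (the visual-feedback channel),
    # then fill the shortlist from the quantitative ranking.
    ranked = sorted(candidates, key=misfit)
    shortlist = [c for c in user_picks if c in candidates]
    shortlist += [c for c in ranked if c not in shortlist]
    return shortlist[:n_keep]

batch = [0.5, 2.9, 7.0, 3.2, 10.0]      # candidate parameter values
best = steering_cycle(batch, user_picks=[7.0])
```

The point of the mixed ranking is that a geologically interesting model (here 7.0) survives even when its formal misfit is poor.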

Typical workflow

Say we want to model extension of a basin that formed close to a mantle plume (in 2d for starters). The data we have to constrain a coupled plate-mantle evolution model may include:

  1. tectonic subsidence curves
  2. thermal history data
  3. observed crustal thickness and
  4. geochemical data from volcanics
  5. data on the brittle reactivation of faults.

The semi-interactive inversion process constrains the model through a mix of formal minimization of the misfit between modelled and observed time histories of tectonic subsidence/heat flow and visual inspection of the model results after each batch of runs, in which the total parameter space is explored by random perturbations via genetic algorithms.
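A toy version of that loop: each generation keeps the best-fitting parameter sets and refills the population with random perturbations of them, genetic-algorithm style. The two-parameter "subsidence" target and all tuning values are illustrative stand-ins for the coupled basin models described above.

```python
import random

def misfit(params, target=(1.0, 2.0)):
    # Formal misfit: squared error against a synthetic observation.
    return sum((p - t) ** 2 for p, t in zip(params, target))

def evolve(pop, rng, n_keep=4, sigma=0.2):
    # Keep the best-fitting models, then refill with Gaussian perturbations.
    survivors = sorted(pop, key=misfit)[:n_keep]
    children = [tuple(p + rng.gauss(0.0, sigma) for p in s)
                for s in survivors]
    return survivors + children

rng = random.Random(42)
population = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(8)]
start_best = min(misfit(p) for p in population)
for _ in range(20):
    population = evolve(population, rng)
final_best = min(misfit(p) for p in population)
```

Because the survivors are always retained, the best misfit never worsens between batches; the user's rankings would steer which survivors are kept in the real workflow.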

Nimrod/oi provides access to codes which are run remotely on the (APAC) computing grid, and to a cache of previous models that match the current request to a required tolerance, so that the user does not need to worry about where the models are run or how to manage the remote resources. Some additional functionality is needed for Nimrod/oi to record the model parameterization, model templates, and the modelled scenario in the EarthByte model library, alongside the rankings for all the computed models.

The richness of this approach becomes clearer once a number of users have begun to populate the model cache. Along with a growing library of models comes a collection of goodness-of-fit rankings for geological scenarios for each of these users. This would provide us with the first steps in a way to "file" and retrieve numerical models according to their potential for geological application.
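The cache lookup amounts to a nearest-match query over stored parameter sets: a sketch, assuming a Euclidean distance over the parameters and a scalar tolerance (the real matching rule and metadata schema are not specified here).

```python
import math

def request_model(cache, params, tol=0.05):
    # Return a cached result when a stored parameter set is close enough;
    # otherwise signal that a fresh remote run is needed.
    for stored, result in cache:
        dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(stored, params)))
        if dist <= tol:
            return result, True      # cache hit: no Grid run required
    return None, False               # cache miss: would trigger a Grid run

# Hypothetical cache of (parameters, archived model) pairs.
cache = [((1.00, 0.30), "model_A"), ((2.50, 0.80), "model_B")]
hit = request_model(cache, (1.02, 0.31))
miss = request_model(cache, (5.00, 5.00))
```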


  • Snark: we should use Snark for this, which requires considerable attention to computational efficiency, suitable built-in physics, and model libraries.

-- DietmarMuller - 19 Oct 2004 / LouisMoresi - 12 Nov 2004

Modelling time-dependent global mantle flow in 3D

  1. Use global plate kinematic model from EarthByte GPlates software to create surface plate velocity grids as top layer boundary input into CitcomS 3D FEM software (into a convecting, viscous mantle)
  2. Palaeo-oceanic age grids will also be used as boundary conditions to incorporate the thermal properties of the slab into the models.
  3. The process of creating global mantle flow fields constrained by surface kinematics and lithospheric boundary conditions will be iterative. The quality of any given model will be evaluated based on comparing the present-day observed mantle density distribution from seismic tomography with modelled mantle density distributions. A series of models will be run until a best-fit solution is reached. The 3D global mantle flow models will enable us to examine the long-term insertion of slabs into the mantle. These models will be used to predict the structure of mantle heterogeneity/buoyancy, examine the relationship between subduction zone parameters and slab dynamics and to examine the link between mantle convection and changes in global plate motions.
  4. Model inputs and outputs will be stored in EarthByte data base.
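Step 1 above turns plate rotations into a surface velocity boundary condition. A minimal sketch of that conversion, assuming a single rigid plate with an illustrative Euler pole: the velocity at a surface point is v = ω × r on the unit sphere (units and the cm/yr scaling are omitted).

```python
import math

def surface_velocity(point, pole, omega):
    # v = omega x r for unit-sphere positions given as (lon, lat) in
    # degrees; returns a Cartesian velocity vector (arbitrary units).
    def to_xyz(lon, lat):
        lon, lat = math.radians(lon), math.radians(lat)
        return (math.cos(lat) * math.cos(lon),
                math.cos(lat) * math.sin(lon),
                math.sin(lat))
    r = to_xyz(*point)
    w = tuple(omega * c for c in to_xyz(*pole))
    return (w[1] * r[2] - w[2] * r[1],
            w[2] * r[0] - w[0] * r[2],
            w[0] * r[1] - w[1] * r[0])

# A point on the equator rotating about the north pole moves due east (+y).
v = surface_velocity((0.0, 0.0), pole=(0.0, 90.0), omega=1.0)
```

Evaluating this on a lon/lat grid, plate by plate, yields the top-layer velocity grids that would be handed to the 3D flow code.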
-- DietmarMuller - 19 Oct 2004

Implicit 3D geological modelling technology linking data and computing grid

A proposed scenario in the exploration area, applicable to both minerals and energy, is to develop an implicit 3D geological modelling technology that is tightly linked to forward modelling, inversion and uncertainty. Details required from JoanEsterle.

Modelling large deformation and fluid flow in crust using supercomputer

  1. Simulation of large plastic deformation using the Mohr-Coulomb rheology and prediction of associated dilatancy at crustal levels
  2. Coupling mechanical deformation/dilation development with fluid flow: 2a) porous flow - the first target; 2b) preferential flow along discrete structures (faults or fractures) - a desirable target.
  3. Incorporation of viscous rheology (Newtonian viscous and/or power-law viscous material flow) for deeper levels (lower crust and upper mantle)
  4. Use of existing data (rock strength parameters) in these models
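The Mohr-Coulomb rheology in step 1 reduces, pointwise, to a yield test: a stress state fails when the shear stress exceeds cohesion plus normal stress times the friction coefficient. The cohesion and friction-angle values below are illustrative rock-strength parameters of the kind mentioned in step 4.

```python
import math

def mohr_coulomb_yield(tau, sigma_n, cohesion, friction_deg):
    # True when the stress state lies on or above the Mohr-Coulomb
    # envelope: tau >= c + sigma_n * tan(phi).
    return tau >= cohesion + sigma_n * math.tan(math.radians(friction_deg))

# Illustrative values in MPa: cohesion 10 MPa, friction angle 30 degrees.
yields = mohr_coulomb_yield(tau=60.0, sigma_n=80.0,
                            cohesion=10.0, friction_deg=30.0)
holds = mohr_coulomb_yield(tau=20.0, sigma_n=80.0,
                           cohesion=10.0, friction_deg=30.0)
```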
-- YanhuaZhang - 22 Oct 2004

Scenarios required for Finley and Snark (Louis Moresi)

Initially a mantle convection problem, but this is only a precursor to enabling broader geoscience simulation services.

Geoscience Data Repositories

EarthByte is one example, and the various Australian geological surveys also have publicly available datasets (see InfoServices). Individual researchers will have their own private research results for use by computational services (either as part of input or output). There is likely a need for some of these datasets to be housed on the APAC Grid data infrastructure. Comments are sought from geoscience researchers on this issue to develop additional scenarios in this area.

Management of modelling inputs and resulting data sets is also an important area to address from the start. Meta-data on the content of these archives is required for efficient data management and can be collected and stored as part of the modelling workflow. There is a fairly broad need for this type of scientific data management across potential APAC users, so it is a likely candidate for inclusion in the e-Science layer rather than specifically in the Geosciences. From a geoscience perspective, the collection of meta-data and data archiving, combined with a common geoscience information model, presents opportunities for data-mining activities across the combined modelling outputs. This data mining could be utilised to build up a picture of which modelling parameters produce a particular geological outcome or phenomenon.

Creation of a standard service and portal facility for scientific data management presents the opportunity for automated archiving of data to be built into future research workflows with minimal effort.
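Recording metadata as part of the workflow and mining it later can be sketched as below. The field names are illustrative assumptions; a real system would follow the agreed geoscience information model mentioned above.

```python
def record_run(archive, params, outcome):
    # Capture run metadata alongside the archived output, as part of
    # the modelling workflow rather than as an afterthought.
    archive.append({"params": params, "outcome": outcome})

def mine(archive, outcome):
    # Data mining: which parameter sets produced a given geological
    # outcome across the combined modelling outputs?
    return [r["params"] for r in archive if r["outcome"] == outcome]

archive = []
record_run(archive, {"extension_rate": 5.0}, "rift_basin")
record_run(archive, {"extension_rate": 0.5}, "sag_basin")
matches = mine(archive, "rift_basin")
```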

-- RobertWoodcock - 23 Oct 2004

Modelling global thermochemical evolution in 3-D (FINLEY, TERRA, SNARK)

Keywords: coupled models of different scales, mantle convection, chemical tracing & feedback

Understanding of major global mineralisation events

  • requires an understanding of Precambrian scenarios and efficient parameterisations of mechanisms such as melt extraction and its effect on the dynamics at the largest scale.

  • Comparisons with planets such as Mars and Venus will help to understand the dynamics and peculiarity of the Earth. Mars has a very old stagnant lid and is a good candidate to study the outcome of the crystallisation of an initial magma ocean.

The success of models for such processes hinges decisively on the strategy to couple processes on different scales in an efficient and robust manner.

  • This requires that models defined on successive scales be constrained by the results of the model at the next larger scale (1st step).

  • The next step is then to enable feedback from the small to the larger scales as well. Strategy: obtain parameterisations of small-scale processes and use them in large-scale models. The mutual refinement of models is an iterative process. A more active feedback would be the direct coupling of a small-scale model to the critical zones of a large-scale model, e.g. a fault model to a convection model.
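The two coupling steps can be sketched as a fixed-point iteration: the small-scale model is driven by large-scale output (step 1), and a parameterisation of the small-scale result feeds back into the large-scale model (step 2), repeated until the mutual refinement converges. The scalar "models" here are deliberately trivial placeholders.

```python
def large_scale(feedback_param):
    # Large-scale model: a background state shifted by the small-scale
    # feedback parameterisation.
    return 100.0 + feedback_param

def small_scale(boundary_value):
    # Small-scale model constrained by the large-scale boundary value.
    return 0.1 * boundary_value

def couple(n_iter=50, tol=1e-9):
    # Iterate the mutual refinement until it reaches a fixed point.
    fb = 0.0
    for _ in range(n_iter):
        big = large_scale(fb)
        new_fb = small_scale(big)
        if abs(new_fb - fb) < tol:
            break
        fb = new_fb
    return big, fb

big, fb = couple()
```

The direct coupling variant (e.g. a fault model embedded in a convection model) replaces the scalar feedback with a full exchange of fields at each iteration, which is where the efficiency and robustness questions above become decisive.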

-- KlausGottschaldt -- HansMuhlhaus - 01 Nov 2004

escript/finley (another view)

The modelframe environment for escript/finley is designed to create XML interfaces to models implemented in Python. The XML gives a structural representation of the models and their coupling, and allows a simulation script to be recovered from the XML. Through a suitable web interface it will be possible to set model input parameters and to link existing models. The XML file can then be passed to any "modelframe" service.
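The round trip from model parameters to XML and back can be illustrated with the standard library. The element names below are an assumption for illustration, not the actual escript/finley modelframe schema.

```python
import xml.etree.ElementTree as ET

def model_to_xml(name, params):
    # Structural XML representation of a model and its parameters.
    root = ET.Element("Model", name=name)
    for key, value in params.items():
        p = ET.SubElement(root, "Parameter", name=key)
        p.text = repr(value)
    return ET.tostring(root, encoding="unicode")

def xml_to_model(xml_text):
    # Recover a runnable model configuration from the XML representation.
    root = ET.fromstring(xml_text)
    params = {p.get("name"): eval(p.text)
              for p in root.findall("Parameter")}
    return root.get("name"), params

xml_text = model_to_xml("Diffusion", {"kappa": 1e-6, "steps": 10})
name, params = xml_to_model(xml_text)
```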

-- LutzGross
Topic revision: r16 - 15 Oct 2010, UnknownUser

Current license: All material on this collaboration platform is licensed under a Creative Commons Attribution 3.0 Australia Licence (CC BY 3.0).