"Seegrid will be due for a migration to confluence on the 1st of August. Any update on or after the 1st of August will NOT be migrated"

Test Page for the APAC Geoscience Grid

Geoscience Grid Test Tools

Grid Tests

Introduction

Why run these tests? So that we can verify these services are available to our users and act quickly when they fail. The tests essentially check that each site is operational and works for our workflow.

The Tests: The tests are run as a test user on an internal and an external machine. The tests are checked out of svn, and a cron.daily script has been created to automate their daily execution. "csiro-gridtest.py" is used to execute the tests; test reports are output to the Apache HTML directory.
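A minimal sketch of what such a cron.daily wrapper looks like; the checkout path, web root, and report filename below are assumptions, not the production values:

#!/bin/sh
# Sketch of the daily gateway-test wrapper (paths are assumed, not the real ones).
TESTDIR=/home/gridtest/GatewayTests   # svn working copy (assumed location)
WEBDIR=/var/www/html                  # Apache HTML dir (assumed location)

svn update "$TESTDIR" || exit 1       # refresh the test suite from svn
cd "$TESTDIR" || exit 1
python csiro-gridtest.py              # run the gateway tests
cp output.html "$WEBDIR/output.html"  # publish the daily report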

Code repo: https://cgsrv1.arrc.csiro.au/subversion/apacgrid/trunk/GatewayTests/

Daily Test Results
(CSIRO external network) - http://ngportal.ivec.org/output.html
(CSIRO internal network) - http://geotest.arrc.csiro.au/

GT4 Service Tests

Introduction

Why run these tests? So that we can verify the GT4 services we developed are available to our users and, if they fail, detect the problem early. The tests essentially check that each site is operational and works for our workflow.

The Tests: The tests are run as a test user on an internal machine. The tests are checked out of svn, and a cron.daily script has been created to automate their daily execution. "ant" is used to execute the service tests, which produce reports using JUnit.
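A sketch of the equivalent cron.daily step for the service tests; the build target and report directory are assumptions (the page only says "ant" is used and that JUnit reports are served under /reports/):

#!/bin/sh
# Sketch of the daily GT4 service-test wrapper (names below are assumed).
TESTDIR=/home/gridtest/GT4ServiceTests  # svn working copy (assumed location)
WEBDIR=/var/www/html/reports            # served at geotest.arrc.csiro.au/reports/

svn update "$TESTDIR" || exit 1
cd "$TESTDIR" || exit 1
ant test                                # "test" target is an assumption
cp -r reports/* "$WEBDIR/"              # publish the JUnit reports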

Code repo: https://cgsrv1.arrc.csiro.au/subversion/apacgrid/trunk/GT4ServiceTests/

Daily Test Results

(CSIRO internal network) - http://geotest.arrc.csiro.au/reports/

Geoscience Grid Test Matrix (outdated)

Output from recent tests:
(CSIRO external network) - http://ngportal.ivec.org/output.html
(CSIRO internal network) - (GT4 installs) http://geotest.arrc.csiro.au/ & (service tests) http://geotest.arrc.csiro.au/reports/

Test scripts and input files are available at: https://cgsrv1.arrc.csiro.au/subversion/apacgrid/trunk/GatewayTests/

NOTE: The tests are currently configured to run as the test user fra283 and expect the input data to be on ngdata@iVEC.
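For example, input data can be pre-staged to ngdata@iVEC with globus-url-copy; the remote path is the one used by test script 1, while the local path is a placeholder:

grid-proxy-init
globus-url-copy file:///home/fra283/inputs/arr-brecca.xml \
    gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/inputs/vpac/arr-brecca.xml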

These tests require automation and automatic daily execution in the morning.

| *Test Number* | *Test Name* | *Test File* | *Description* | *Status* |
| GT001 | Globusrun-ws iVEC | | Tests the availability of all GridFTP servers (ng2, Cognac & ngdata) and WS-GRAM at iVEC. Runs a simple job (hostname); stages data from ngdata. | |
| GT002 | Globusrun-ws APAC-NF | | Tests the availability of all GridFTP servers (ng2, "ac" & ngdata@iVEC) and WS-GRAM at APAC-NF. Runs a simple job (hostname); stages data from ngdata@iVEC. | |
| GT003 | Globusrun-ws VPAC | | Tests the availability of all GridFTP servers (ng2, "ac" & ngdata@iVEC) and WS-GRAM at VPAC. Runs a simple job (hostname); stages data from ngdata@iVEC. | |
| GT004 | Globusrun-ws HPSC | | Tests the availability of all GridFTP servers (ng2, "ac" & ngdata@iVEC) and WS-GRAM at HPSC. Runs a simple job (hostname); stages data from ngdata@iVEC. | |
| GT005 | Globusrun-ws iVEC FastFlo | ivecPythonTest.xml, ivecFastFlowTest.xml, ivecFastFlowTest2.xml | Tests the availability and functionality of FastFlo, all GridFTP servers (ng2, "Cognac" & ngdata@iVEC) and WS-GRAM at iVEC. Runs a simple FastFlo job: data is staged in from ngdata@iVEC, the job runs on Cognac, and data is staged back out to ngdata@iVEC. The test validates that the output files match the expected output (compares the output files of a known run with those of the test run). | |
| GT006 | Globusrun-ws iVEC Finley | ivecFinley.xml | Tests the availability and functionality of Finley/EScript, all GridFTP servers (ng2, "Cognac" & ngdata@iVEC) and WS-GRAM at iVEC. Runs a simple Finley job: data is staged in from ngdata@iVEC, the job runs on Cognac, and data is staged back out to ngdata@iVEC. The test validates that the output files match the expected output (compares the output files of a known run with those of the test run). | |
| GT007 | Globusrun-ws iVEC Snark | ivecUnderworldTest.xml | Tests the availability and functionality of Snark/Underworld, all GridFTP servers (ng2, "Cognac" & ngdata@iVEC) and WS-GRAM at iVEC. Runs a simple Snark/Underworld job: data is staged in from ngdata@iVEC, the job runs on Cognac, and data is staged back out to ngdata@iVEC. The test validates that the output files match the expected output (compares the output files of a known run with those of the test run). | |
| GT008 | Globusrun-ws HPSC FastFlo | hpscFastFlowTest.xml | Tests the availability and functionality of FastFlo, all GridFTP servers (ng2, "Burnet" & ngdata@iVEC) and WS-GRAM at HPSC. Runs a simple FastFlo job: data is staged in from ngdata@iVEC, the job runs on Burnet, and data is staged back out to ngdata@iVEC. The test validates that the output files match the expected output (compares the output files of a known run with those of the test run). | |
| GT009 | Globusrun-ws APAC-NF Finley | apacFinley.xml | Tests the availability and functionality of Finley, all GridFTP servers (ng2, "ac" & ngdata@iVEC) and WS-GRAM at APAC-NF. Runs a simple Finley job: data is staged in from ngdata@iVEC, the job runs on "ac", and data is staged back out to ngdata@iVEC. The test validates that the output files match the expected output (compares the output files of a known run with those of the test run). | |
| GT010 | Globusrun-ws APAC-NF Snark | apacUnderworldTest.xml | Tests the availability and functionality of Snark, all GridFTP servers (ng2, "ac" & ngdata@iVEC) and WS-GRAM at APAC-NF. Runs a simple Snark job: data is staged in from ngdata@iVEC, the job runs on "ac", and data is staged back out to ngdata@iVEC. The test validates that the output files match the expected output (compares the output files of a known run with those of the test run). | |
| GT011 | Globusrun-ws VPAC Snark | vpacUnderworldTest.xml | Tests the availability and functionality of Snark, all GridFTP servers (ng2, "ac" & ngdata@iVEC) and WS-GRAM at VPAC. Runs a simple Snark job: data is staged in from ngdata@iVEC, the job runs on "brecca", and data is staged back out to ngdata@iVEC. The test validates that the output files match the expected output (compares the output files of a known run with those of the test run). | |
| GT012 | SRB@CSIRO-ARRC | ? | Simple tests: checks that SRB is alive and runs a dummy transfer; Sputs a file, then Sgets the file back and compares that the returned file is the same as the original (see the SRB sketch below this table). | |
| GT013 | SRB@CSIRO-HPSC | ? | Simple tests: checks that SRB is alive and runs a dummy transfer; Sputs a file, then Sgets the file back and compares that the returned file is the same as the original (see the SRB sketch below this table). | |
| GT014 | Snark-WS@iVEC | RF TODO | Unit tests: 1. runs a Snark job via the service and compares output files; 2. runs a job, waits until it is active, then kills the job and asserts that it dies; 3. runs a job, waits until it is active, then gets the results if there are any. | |
| GT015 | Finley-WS@iVEC | RF TODO | Unit tests: 1. runs a Finley job via the service and compares output files; 2. runs a job, waits until it is active, then kills the job and asserts that it dies; 3. runs a job, waits until it is active, then gets the results if there are any. | |
| GT016 | FastFlo-WS@iVEC | RF TODO | Unit tests: 1. runs a FastFlo job via the service and compares output files; 2. runs a job, waits until it is active, then kills the job and asserts that it dies; 3. runs a job, waits until it is active, then gets the results if there are any. | |
| GT017 | Snark-WS@APAC-NF | RF TODO | Unit tests: 1. runs a Snark job via the service and compares output files; 2. runs a job, waits until it is active, then kills the job and asserts that it dies; 3. runs a job, waits until it is active, then gets the results if there are any. | |
| GT018 | Finley-WS@APAC-NF | RF TODO | Unit tests: 1. runs a Finley job via the service and compares output files; 2. runs a job, waits until it is active, then kills the job and asserts that it dies; 3. runs a job, waits until it is active, then gets the results if there are any. | |
| GT019 | Snark-WS@VPAC | RF TODO | Unit tests: 1. runs a Snark job via the service and compares output files; 2. runs a job, waits until it is active, then kills the job and asserts that it dies; 3. runs a job, waits until it is active, then gets the results if there are any. | |
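A minimal sketch of the SRB check behind GT012/GT013, using the standard SRB Scommands; the object and file names are placeholders:

#!/bin/sh
# Sketch of the SRB liveness + dummy-transfer test (GT012/GT013).
Sinit                                # open an SRB session
Sput testfile.dat srbtest.dat        # store a local file into SRB
Sget srbtest.dat returned.dat        # fetch it back under a new name
cmp testfile.dat returned.dat \
    && echo "SRB transfer OK" \
    || echo "SRB transfer FAILED"    # compare returned file with the original
Srm srbtest.dat                      # remove the test object
Sexit                                # close the session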

Test date: 23/5/06

Tests were validated on 4/6/06 - Completed

Test 1: Underworld PC -> iVEC
Description: Submit an Underworld job from a PC to iVEC. First copy the input files to ngdata@iVEC using globus-url-copy, then submit the job to ng2@iVEC with staging from ngdata@iVEC; the user should be able to retrieve the results from ngdata@iVEC using globus-url-copy. Use test script 1 and change ONLY the host names and queue details (the full workflow is sketched after these results).
Input files: arr-brecca.xml
Result: The workflow works; MOST parts work, except:
  1.1 The local directory check fails with GLOBUS_SCRATCH_DIR (if this does not exist on ng2, the job fails); can we bypass this check?
  1.2 Verify that the grid environment variables required in the script work correctly.
  1.3 Mismatch in Underworld versions between iVEC and VPAC; please resolve this so that the same Underworld script works at BOTH sites.
  1.4 Single module file: there are currently many module files across sites. iVEC's "snark" module file appears to load everything necessary for StG_FEM, whereas ANU's "stgermain" file does not. Can we have a single module that loads ALL that is necessary for ALL things St Germain?
Resolutions:
  1.1 Directory check removed. TR-2006-05-24
  1.2 Grid vars handed to Darran; resolution proposed by Steve McMahan here.
  1.3 Versioning handed to Dave Bannon @ VPAC.
  1.4 Module standardisation handed to David Bannon and Ben Evans.
Status: Progressing

Test 2: Underworld PC -> VPAC
Description: Submit a Snark job from a PC to VPAC. Copy the input files to ngdata@iVEC using globus-url-copy, then submit the job to ng2@VPAC with staging from ngdata@iVEC; the user should be able to retrieve the results from ngdata@iVEC using globus-url-copy. Use test script 1 and change ONLY the host names and queue details.
Input files: arr-brecca.xml
Result: This works! No actions needed.
Status: Resolved

Test 3: Snark PC -> iVEC
Description: Submit a Snark (StG_FEM) job from a PC to iVEC. First copy the input files to ngdata@iVEC using globus-url-copy, then submit the job to ng2@iVEC with staging from ngdata@iVEC; the user should be able to retrieve the results from ngdata@iVEC using globus-url-copy. Use test script 3 and change ONLY the host names and queue details.
Input files: in SRB, /CSIRO-COMPGEO/home/fra283.CSIRO-COMPGEO/inputs/IvecSnarkTest
Result: This works, except for issues 1.1 and 1.2 above; the resolution is as above.
Status: Progressing

Test 4: Snark PC -> VPAC
Description: Submit a Snark (StG_FEM) job from a PC to VPAC. First copy the input files to ngdata@iVEC using globus-url-copy, then submit the job to ng2@VPAC with staging from ngdata@iVEC; the user should be able to retrieve the results from ngdata@iVEC using globus-url-copy. Use test script 3 and change ONLY the host names and queue details.
Input files: in SRB, /CSIRO-COMPGEO/home/fra283.CSIRO-COMPGEO/inputs/IvecSnarkTest
Result: This does not work. VPAC does NOT have a "snark" module file, so the "stgermain" module is loaded in place of the "snark" module; however, this module does NOT set up the correct environment for StG_FEM and produces errors such as: "Cannot find default constructor function for type 'AdvDiffResidual'".
Suggestion: create one single module file for St Germain, incorporating parts of both the "snark" and the current "stgermain" modules, and distribute it across the grid.
Resolution: David Bannon, along with Patrick and Alan, has promised to sort out a release, which will be rolled out to developers.
Status: Progressing

Test 5: Fastflo PC -> iVEC
Description: Submit a Fastflo job from a PC to iVEC. First copy the input files to ngdata@iVEC using globus-url-copy, then submit the job to ng2@iVEC with staging from ngdata@iVEC; the user should be able to retrieve the results from ngdata@iVEC using globus-url-copy. Use test script 2 and change ONLY the host names and queue details.
Input files: poiss_ci.py.txt, poiss_ci.msh
Result: This works! No dramas here.
Status: Resolved

Test 6: Fastflo PC -> HPSC
Description: Submit a Fastflo job from a PC to HPSC. First copy the input files to ngdata@iVEC using globus-url-copy, then submit the job to ng2@HPSC with staging from ngdata@iVEC; the user should be able to retrieve the results from ngdata@iVEC using globus-url-copy. Use test script 2 and change ONLY the host names and queue details.
Input files: poiss_ci.py.txt, poiss_ci.msh
Result: NOT WORKING; cannot connect to ng2@HPSC. Spoke to Jeroen; this will be fixed soon.
Status: Progressing
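For reference, the end-to-end workflow these tests exercise, assembled from the commands named above; the endpoints and remote paths are the ones in test script 1, while the local paths are placeholders:

# 1. Create a proxy and copy the input file to ngdata@iVEC.
grid-proxy-init
globus-url-copy file:///home/fra283/arr-brecca.xml \
    gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/inputs/vpac/arr-brecca.xml

# 2. Delegate a credential to the gateway and submit the job
#    (only host names and queue details change per site).
globus-credential-delegate -h ng2.vpac.org /tmp/jobepr
globusrun-ws -submit -Jf /tmp/jobepr -Sf /tmp/jobepr -Tf /tmp/jobepr \
    -f testscript1.xml

# 3. Retrieve the staged-out results (e.g. stdout) from ngdata@iVEC.
globus-url-copy \
    gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/outputs/stdout \
    file:///home/fra283/results/stdout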

These tests are a precursor to running the unit tests for the Snark and Fastflo services. Once the GridFTP and HTTPS job-submission communications between CSIRO local computers and external networks such as iVEC and VPAC are verified, unit tests on the services will commence.

Test script 1: This script stages data from a directory in "store" on ngdata@iVEC to $GLOBUS_SCRATCH_DIR on the host, runs a Snark (Underworld) job, and finally stages the files back out to "store" on ngdata@iVEC. Input files:

<!--
  Firstly:
    grid-proxy-init;
    globus-credential-delegate -h 'hostname' /tmp/jobepr

  Usage:
    globusrun-ws -submit -Jf /tmp/jobepr -Sf /tmp/jobepr -Tf /tmp/jobepr -f testscript1.xml
-->

<job>
  <factoryEndpoint xmlns:gram="http://www.globus.org/namespaces/2004/10/gram/job"
                   xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/03/addressing">
    <wsa:Address>
      https://ng2.vpac.org:8443/wsrf/services/ManagedJobFactoryService
    </wsa:Address>
    <wsa:ReferenceProperties>
      <gram:ResourceID>PBS</gram:ResourceID>
    </wsa:ReferenceProperties>
  </factoryEndpoint>

  <executable>Underworld</executable>
  <directory>${GLOBUS_SCRATCH_DIR}</directory>
  <argument>${GLOBUS_SCRATCH_DIR}/arr-brecca.xml</argument>
  <argument>--interactive=False</argument>

  <environment>
    <name>MODULE_LOAD</name>
    <value>stgermain</value>
  </environment>

  <stdout>${GLOBUS_SCRATCH_DIR}/output/stdout</stdout>
  <stderr>${GLOBUS_SCRATCH_DIR}/output/stderr</stderr>

  <count>2</count>
  <queue>sque@brecca-m</queue>
  <maxWallTime>15</maxWallTime>
  <jobType>mpi</jobType>

  <fileStageIn>
    <transfer>
      <sourceUrl>gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/inputs/vpac/arr-brecca.xml</sourceUrl>
      <destinationUrl>file:///${GLOBUS_SCRATCH_DIR}/arr-brecca.xml</destinationUrl>
    </transfer>
  </fileStageIn>

  <fileStageOut>
    <transfer>
      <sourceUrl>file:///${GLOBUS_SCRATCH_DIR}/output/</sourceUrl>
      <destinationUrl>gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/outputs/</destinationUrl>
    </transfer>
  </fileStageOut>
</job>




Test script 2: This script stages data from a directory in "store" on ngdata@iVEC to $GLOBUS_SCRATCH_DIR on the host, runs a Python/Fastflo job, and finally stages the files back out to "store" on ngdata@iVEC. Input file:


<!--
  Firstly:
    grid-proxy-init;
    globus-credential-delegate -h 'hostname' /tmp/jobepr

  Usage:
    globusrun-ws -submit -Jf /tmp/jobepr -Sf /tmp/jobepr -Tf /tmp/jobepr -f testscript2.xml
-->

<job>
  <factoryEndpoint xmlns:gram="http://www.globus.org/namespaces/2004/10/gram/job"
                   xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/03/addressing">
    <wsa:Address>
      https://ng2.ivec.org:8443/wsrf/services/ManagedJobFactoryService
    </wsa:Address>
    <wsa:ReferenceProperties>
      <gram:ResourceID>PBS</gram:ResourceID>
    </wsa:ReferenceProperties>
  </factoryEndpoint>

  <executable>python</executable>
  <directory>${GLOBUS_SCRATCH_DIR}</directory>
  <argument>${GLOBUS_SCRATCH_DIR}/poiss_ci.py</argument>

  <environment>
    <name>MODULE_LOAD</name>
    <value>RT/0.3</value>
  </environment>

  <stdout>/short/cg01/fra283/python/stdout</stdout>
  <stderr>/short/cg01/fra283/python/stderr</stderr>

  <project>cg01</project>
  <queue>express</queue>
  <maxWallTime>15</maxWallTime>

  <fileStageIn>
    <transfer>
      <sourceUrl>gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/inputs/fastflo1/</sourceUrl>
      <destinationUrl>gsiftp://ng2.ivec.org:2811/${GLOBUS_SCRATCH_DIR}/</destinationUrl>
    </transfer>
  </fileStageIn>

  <fileStageOut>
    <transfer>
      <sourceUrl>gsiftp://ng2.ivec.org:2811/${GLOBUS_SCRATCH_DIR}/</sourceUrl>
      <destinationUrl>gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/outputs/</destinationUrl>
    </transfer>
  </fileStageOut>
</job>



Test script 3: This script stages data from a directory in "store" on ngdata@iVEC to $GLOBUS_SCRATCH_DIR on the host, runs a StG_FEM job, and finally stages the files back out to "store" on ngdata@iVEC.

Input files: /CSIRO-COMPGEO/home/fra283.CSIRO-COMPGEO/inputs/IvecSnarkTest

<job>
  <factoryEndpoint xmlns:gram="http://www.globus.org/namespaces/2004/10/gram/job"
                   xmlns:wsa="http://schemas.xmlsoap.org/ws/2004/03/addressing">
    <wsa:Address>
      https://ng2.ivec.org:8443/wsrf/services/ManagedJobFactoryService
    </wsa:Address>
    <wsa:ReferenceProperties>
      <gram:ResourceID>PBS</gram:ResourceID>
    </wsa:ReferenceProperties>
  </factoryEndpoint>

  <executable>StG_FEM</executable>
  <directory>${GLOBUS_SCRATCH_DIR}</directory>
  <argument>${GLOBUS_SCRATCH_DIR}/demo.xml</argument>

  <environment>
    <name>MODULE_LOAD</name>
    <value>snark</value>
  </environment>

  <stdout>${GLOBUS_SCRATCH_DIR}/stdout</stdout>
  <stderr>${GLOBUS_SCRATCH_DIR}/stderr</stderr>

  <count>2</count>
  <hostCount>2</hostCount>
  <project>cg01</project>
  <queue>express</queue>
  <maxWallTime>1000</maxWallTime>
  <maxMemory>2000</maxMemory>
  <jobType>mpi</jobType>

  <fileStageIn>
    <transfer>
      <sourceUrl>gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/inputs/snark/</sourceUrl>
      <destinationUrl>file:///${GLOBUS_SCRATCH_DIR}</destinationUrl>
    </transfer>
  </fileStageIn>

  <fileStageOut>
    <transfer>
      <sourceUrl>file:///${GLOBUS_SCRATCH_DIR}/</sourceUrl>
      <destinationUrl>gsiftp://ngdata.ivec.org:2811/store/cg01/fra283/outputs/</destinationUrl>
    </transfer>
  </fileStageOut>

  <extensions>
    <globusrunAnnotation>
      <automaticJobDelegation>true</automaticJobDelegation>
      <automaticStagingDelegation>true</automaticStagingDelegation>
      <automaticStageInDelegation>true</automaticStageInDelegation>
      <automaticStageOutDelegation>true</automaticStageOutDelegation>
      <automaticCleanUpDelegation>true</automaticCleanUpDelegation>
    </globusrunAnnotation>
  </extensions>
</job>
-- RyanFraser - 23 May 2006

Action list

This is a summary of actions to resolve some of the issues:
  • action 1
  • action 2 ....
Topic attachments
| *Attachment* | *Size* | *Date* | *Who* |
| arr-brecca.xml | 22.3 K | 23 May 2006 - 11:01 | RyanFraser |
| poiss_ci.msh | 119.3 K | 23 May 2006 - 11:01 | RyanFraser |
| poiss_ci.py.txt | 0.8 K | 23 May 2006 - 11:01 | RyanFraser |