Compliance Checking and CEDA-CC


For certain projects we run the CEDA-CC tool to check that the files received are compliant with the format specification for that project. This can check file names, time ranges across groups of files and metadata inside files.

This page provides examples of problems identified with files and how they were tackled.

Note about installation of CEDA-CC on ingest1 server

On the ingest1 server, CEDA-CC is deployed in the standard virtual environment (venv27). You can update the CEDA-CC version on this server by running a single script:


Typical workflow

The typical compliance checking process is:

  1. The Data Provider places a batch of files in the arrivals space.
  2. CEDA runs the CEDA-CC tool on that batch.
  3. CEDA runs the CEDA-CC tool in summary mode to examine the logs of the main run.
  4. If no errors:
    • CEDA ingests the data into the archive
    • CEDA publishes the datasets to ESGF
  5. If errors:
    • Inform the Data Provider and then go back to step 1.
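The "if errors / if no errors" decision above can be sketched as a small shell fragment. This is a hypothetical wrapper, not part of CEDA-CC itself: it fakes two per-file log files (real runs produce one log per data file, as in the examples below), counts those containing a FAILED line, and branches accordingly.

```shell
# Hypothetical sketch of step 4/5: decide ingest vs. report-back from the logs.
# The two log files written here are fakes standing in for real CEDA-CC output.
LOGDIR=$(mktemp -d)
printf 'C4.002.005: [variable_ncattribute_mipvalues]: FAILED:: bad units\n' > "$LOGDIR/file1.txt"
printf 'no errors\n' > "$LOGDIR/file2.txt"

# Count logs that contain at least one FAILED line
nfail=$(grep -l FAILED "$LOGDIR"/*.txt | wc -l | tr -d ' ')

if [ "$nfail" -eq 0 ]; then
  echo "no errors: ingest and publish"
else
  echo "report $nfail failing file(s) to the Data Provider"
fi
rm -r "$LOGDIR"
```

The same grep-and-count idea is used for real in Example 1 below.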

Example problems identified with files

This section lists a number of real-life example problems that have been found with input files. It has been written to provide hints on how to diagnose issues identified by CEDA-CC and how to communicate them back to the Data Provider - who will be responsible for fixing files.

NOTE: CEDA does not undertake to fix problems identified by CEDA-CC - that is the responsibility of the Data Provider (who should also run a local copy of CEDA-CC before sending us data).

1. Units errors in NetCDF files

In this example, the errors showed up as follows:

$ ceda-cc --sum specs_CNRM-CM5_batch2/
############################ /datacentre/processing/specs/CCCC/trunk/ceda_cc
Summarising error reports from 73018 log file

C4.002.005:  1511  [variable_ncattribute_mipvalues] :Variable [hus] has incorrect attributes: units="1" [correct:"kg kg-1"]
Number of files with no errors: 71507

I did some grepping and counting to double-check that the errors were all the same:

$ cd specs_CNRM-CM5_batch2/
$ for i in *_CNRM*.txt ; do grep FAILED $i >> ../FAILED.txt ; done
$ wc -l ../FAILED.txt
1511 ../FAILED.txt
$ sort -u ../FAILED.txt
C4.002.005: [variable_ncattribute_mipvalues]: FAILED:: Variable [hus] has incorrect attributes: units="1" [correct: "kg kg-1"]

This showed that there was a common error across all failures.

Verified the error by looking inside one of the data files:

$ ncdump -h /group_workspaces/jasmin/specs/CNRM-CM5/batch2/CNRM/CNRM-CM/seaIceInit/S19790501/mon/atmos/hus/r1i1p1/ | grep hus | grep units
                hus:units = "1" ;

Checked the CF standard name table, which said "hus" should have units of "1".

Checked the ceda-cc MIP table:

$ head -1329 /usr/local/ingest_software/venv27/config/specs_vocabs/mip/SPECS_Amon | tail -8
variable_entry:    hus
modeling_realm:    atmos
! Variable attributes:
standard_name:     specific_humidity
units:             kg kg-1

So, even though the standard name table says the units should be "1", in SPECS the scientists have decided to use "kg kg-1".


We asked the Data Provider to change the files.

Note that the code they used to fix this was quite simple:

# Find affected files, grab the units of "hus" and count them:
$ find /group_workspaces/jasmin/specs/CNRM-CM5/batch2/CNRM/CNRM-CM/seaIceInit -type f -name "hus_*.nc" -exec ncdump -h {} \; | grep hus:units | wc -l
# Fix all the files using ncatted   
$ find /group_workspaces/jasmin/specs/CNRM-CM5/batch2/CNRM/CNRM-CM/seaIceInit -type f -name "hus_*.nc" -exec ncatted -a units,hus,m,c,"kg kg-1" {} \;

Pierre-Antoine fixed the files himself.

2. Variable not in the MIP Table

For some CORDEX files the summary output reported:

C4.002.002: 22
  --- [variable_in_group] :Variable hurs not in table day: 10

This means that the variable was not in the MIP Table. A MIP Table is one of the configuration tables used to define the requested outputs of large model intercomparison experiments; CEDA-CC uses it to check the files it receives against those definitions.
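A MIP table entry looks like the SPECS_Amon excerpt shown in Example 1 (a `variable_entry:` line followed by attribute lines). As an illustration of how such a table drives the check, the sketch below looks up a variable's expected units from a table fragment; the lookup logic is my own minimal reconstruction, not CEDA-CC's actual parser.

```shell
# Minimal sketch: look up the expected units for a variable in a MIP table.
# The table fragment is copied from the SPECS_Amon excerpt in Example 1.
TABLE=$(mktemp)
cat > "$TABLE" <<'EOF'
variable_entry:    hus
modeling_realm:    atmos
! Variable attributes:
standard_name:     specific_humidity
units:             kg kg-1
EOF

# Track the current variable_entry; print the units line for the one we want
units=$(awk -v var=hus '
  $1 == "variable_entry:" { cur = $2 }
  $1 == "units:" && cur == var { sub(/^units:[ \t]*/, ""); print }
' "$TABLE")
echo "expected units for hus: $units"
rm "$TABLE"
```

A variable absent from every `variable_entry:` block would yield no match at all, which is essentially what the `variable_in_group` error above is reporting.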


This issue was raised with the Data Providers. They identified that the project had recently updated the MIP Tables. Contact was made with Martin Juckes and the new version of the CORDEX MIP Tables was added to CEDA-CC trunk and rolled out on the ingest1 server.

Once the new MIP Tables were in place this error ceased to be reported by the checker.

3. Incorrect global attribute value

In one CORDEX case many files had an unrecognised value for an expected global attribute. The summary said:

C4.002.006: 1830
  --- [global_ncattribute_cv] :Global attributes do not match constraints:[('driving_model_id', 'ERAINT', "['ECMWF-ERAINT', 'BCC-bcc-csm1-1', 'BCC-bcc-csm1-1-m', 'BNU-BNU-ESM']")]: 1824

From the output it is clear that the value given was 'ERAINT' and the list of expected values included 'ECMWF-ERAINT'.
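The check itself is just a controlled-vocabulary membership test. A minimal shell sketch of that test, using the values from the error message above (this is an illustration of the idea, not CEDA-CC's code):

```shell
# Sketch: check a global-attribute value against a controlled vocabulary.
# Value and vocabulary are taken from the C4.002.006 error message above.
value="ERAINT"
allowed="ECMWF-ERAINT BCC-bcc-csm1-1 BCC-bcc-csm1-1-m BNU-BNU-ESM"

case " $allowed " in
  *" $value "*) result="ok" ;;
  *)            result="mismatch" ;;
esac
echo "driving_model_id=$value: $result"
```

Here "ERAINT" fails because only the exact string "ECMWF-ERAINT" is in the vocabulary; a substring match is deliberately not good enough.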


The resolution was simply to ask the Data Provider to modify the files (using NCO) so that the global attribute 'driving_model_id' had the value 'ECMWF-ERAINT'. This fixed the problem.

4. Inconsistent file metadata

In one CORDEX case some files had global attributes that did not match the corresponding filename attributes. The summary said:

---  [filename_filemetadata_consistency] :File name segments do not match corresponding global attributes:[(2, 'model_id'), (4, 'startdate'), (5, '@ensemble'), (6, '@forecast_reference_time:4:')]

Looking at the file attribute model_id:

ncdump -h | grep model_id
                :model_id = "CNRM-CM5-LRA-LRO" ;

We can see that the model_id in the filename and in the global attribute do not match: "-LRA-" in the attribute should be "-HRA-". The attributes 'startdate', 'forecast_reference_time' and 'associated_experiment' (which maps to @ensemble) are checked in the same way.


The resolution was simply to ask the Data Provider to modify the files (using NCO) so that the global attributes 'model_id', 'startdate', 'forecast_reference_time' and 'associated_experiment' had values consistent with the filename. This fixed the problem.
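The consistency check splits the filename on underscores and compares each segment with its corresponding global attribute. A minimal sketch of one such comparison, using an illustrative filename (the real SPECS/CORDEX filename layout has more segments; the exact positions here are assumptions for the example):

```shell
# Sketch: compare one filename segment with the corresponding global attribute.
# Filename and attribute value are illustrative, following the example above,
# with the model_id assumed to sit in the third underscore-separated segment.
fname="hus_Amon_CNRM-CM5-HRA-LRO_seaIceInit_S19790501_r1i1p1.nc"
model_id_attr="CNRM-CM5-LRA-LRO"   # value reported by ncdump

seg=$(echo "${fname%.nc}" | cut -d_ -f3)   # cut fields are 1-based

if [ "$seg" = "$model_id_attr" ]; then
  echo "consistent"
else
  echo "mismatch: filename says $seg, attribute says $model_id_attr"
fi
```

The error tuple in the summary (e.g. `(2, 'model_id')`) names the failing segment index and attribute, so the same comparison pinpoints exactly which field to fix.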
