The CDS and ADS have a regression testing system that can be used to verify that...
Data comparisons are intelligent and, when data is not as expected, the system reports exactly how it differs from what was expected.
The system can be run either from the command line or as part of an ecFlow suite – useful for ad hoc and regular testing respectively. It is designed to bypass the caching system so data retrievals are properly tested.
All the code is in the cds-regression-testing repository.
Login credentials for each stack should be stored in files in ~/.cdsapirc_dir, named cdsapirc_<forms_branch> (e.g. cdsapirc_c3sprod) or cdsapirc_<portal>_<JSON_branch> (e.g. cdsapirc_c3s_prod). When running the regression_test command (below), which takes stack names as input(s), a stack named X on the command line will use the login credentials from ~/.cdsapirc_dir/cdsapirc_X.
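For illustration, a credentials file could be created like this. This is only a sketch: it assumes the files use the standard CDS API client format (a url line and a key line), and the actual url and key values for each stack will differ.

    mkdir -p ~/.cdsapirc_dir
    # Hypothetical credentials for a stack named "c3sprod"
    # (used when "c3sprod" appears on the command line)
    cat > ~/.cdsapirc_dir/cdsapirc_c3sprod <<'EOF'
    url: https://cds.climate.copernicus.eu/api/v2
    key: <UID>:<API-key>
    EOF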
- Run "bin/regression_test c3sprod-c3s_prod" to compare old and new stacks. Result files will be compared and the test will fail if they're different.
- Run "bin/regression_test c3s_prod" to run tests for just one stack. Result files will be checked against information on the expected result provided in the test function.
Tests are defined in the tests/test_<dataset_name>.py files. Alternatively, use the sample.json files as the tests (-k samples), generate random requests with a max size of N as tests (-k random:N) or take real-world user requests from the brokerdb (-k broker:c3sprod).
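For example, these alternative test sources might be selected as follows (a sketch assuming the -k forms above are simply passed alongside the usual stack argument):

    # Use the sample.json files as the tests
    bin/regression_test -k samples c3s_prod
    # Generate random requests with a max size of 5
    bin/regression_test -k random:5 c3s_prod
    # Replay real-world user requests taken from the brokerdb
    bin/regression_test -k broker:c3sprod c3s_prod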
Other useful options:
- Use -d to limit the tests to particular datasets. The argument is interpreted as a regex so only a name fragment is required, e.g. "bin/regression_test -d era5 c3s_prod".
- Use -m to limit the tests to a particular adaptor (e.g. -m mars or -m url).
- Use -a <dataset>:<testname> (or -A <dataset>:<testname>) to restart from (or after) a given test. Useful after investigating a failed test.
- Use -h to see all options.
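As a hypothetical combined example (assuming these options can be mixed in a single invocation, which is not stated explicitly above):

    # Compare old and new stacks, restricted to ERA5 datasets handled by the MARS adaptor
    bin/regression_test -d era5 -m mars c3sprod-c3s_prod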