ZDTM API
Revision as of 13:29, 1 September 2016
This page describes the API one should use to write new tests in ZDTM.
Overview
The ZDTM test suite is an automated suite that launches individual subtests as processes, waits for them to get prepared, dumps the started process(es), restores them, and then asks each test to check whether the restored state is what the test considers correct. To write new subtests, a developer should use the API described below. The tests themselves live in the test/zdtm/ sub-directories. The full-cycle suite is the test/zdtm.py script.
API for test itself
Test body
The test should clearly separate 3 stages:
- Preparations
- These are the actions the test performs before the process(es) get checkpointed.
- Actions
- During this stage the process(es) will be interrupted at an arbitrary place and dumped.
- Checks
- After restore, the test should verify whether everything is OK and report the result.
This is achieved by using the following API calls:
test_init(argc, argv)
- Initializes the test subsystem. After this call the preparations stage starts.
test_daemon()
- Declares that preparations are finished and the test is ready to be dumped at any time, i.e. the actions stage begins.
test_go()
- Checks whether the process has been dumped and restored yet; while it returns true, the actions stage is still going on.
test_waitsig()
- Blocks the task until it is restored. After this the checks stage starts.
pass()
- Marks the test as PASSED.
fail(message)
- Reports test failure. The message argument is written to the logs.
err(message)
- Use this function to mark the test as being unable to work at all.
test_msg(message)
- Use this for logging.
From the code perspective this looks like this:
int main(int argc, char **argv)
{
	test_init(argc, argv);

	/* Preparations */

	test_daemon();

#if test wants to act and get c/r-ed unexpectedly
	while (test_go()) {
		/* Actions go here */
	}
#else /* test just wants to wait for c/r to happen */
	/* Actions go here. */
	test_waitsig();
#endif

	/* Checks */
	if (test_passed())
		pass();
	else
		fail("Something went wrong");

	return 0;
}
Options
Sometimes a test might need an option to work, e.g. a file or directory to work with, or some string to operate on. This can be done by declaring options.
TEST_OPTION(name, type, description);
The name is the name of the option. A variable of the same name must be declared, and the --$name CLI argument will be passed on test start (the option is parsed by the test_init() call, so nothing more needs to be done to make it available).
The type is the data type. Currently supported types are string, bool, int, uint, long and ulong.
The description is just arbitrary text.
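As a sketch, assuming the zdtmtst.h helper header from the ZDTM library and a hypothetical option named message, declaring a string option could look like this:

```c
#include "zdtmtst.h"

/* A variable with the same name as the option must be declared;
 * the --message CLI argument is then parsed by test_init(). */
char *message;
TEST_OPTION(message, string, "a string for the test to operate on");
```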
Building and launching
Then the test should be added to the Makefile. There are several lists that simple tests can be added to.
TST_NOFILE
- Add your test to this list if the test accepts no options.
TST_FILE
- Tests here will be started with the --filename $some_name option. The file will not be created by the launcher; the test should take care of it itself.
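A minimal sketch of how a test from the TST_FILE list might consume the option (assuming the zdtmtst.h header; the error handling is illustrative):

```c
char *filename;
TEST_OPTION(filename, string, "file to work on");

/* In the preparations stage: the launcher passes --filename
 * but does not create the file, so the test creates it. */
int fd = open(filename, O_RDWR | O_CREAT | O_EXCL, 0600);
if (fd < 0)
	err("Can't create test file");
```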
TST_DIR
- These will be started with the --dirname $some_name option. The directory will not be created by the launcher; the test should take care of it itself.
TST
- These are tests with custom options. For these, one should also declare the start rule in the Makefile, like this:
test.pid: test
	$(<D)/$(<F) --pidfile=$@ --outfile=$<.out --OPTIONS-HERE
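For example, a hypothetical test named env00 taking a custom --envname option (both the test name and the option are assumptions, not part of the suite) could declare its start rule like this:

```
env00.pid: env00
	$(<D)/$(<F) --pidfile=$@ --outfile=$<.out --envname=TEST_VAR
```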
Checking test works
To make sure the test itself works, do these steps:

make cleanout     # clean the directory from old stuff
make $test        # build the test
make $test.pid    # start the test
make $test.out    # stop the test and ask it for the "checking" stage and results
All logs from the test, including the pass/fail status, will be in the $test.out file.
Test's .desc file
Put the ${test_name}.desc
next to the test's .c file to configure the test. The .desc file can have the following fields in JSON format:
- 'flavor'
- Add a space-separated list of 'h', 'ns' or 'uns' here to limit the test to the host, namespaces or (namespaces and user namespaces) runs.
- 'flags'
- Space-separated values:
  - suid: the test needs root privileges to run.
  - noauto: the test is excluded from the --all run of zdtm.py.
  - crfail: dump should fail for this test.
  - nouser: the test cannot be checkpointed in user mode.
  - excl: the test should not be run in parallel with any other test.
  - samens: no --join-ns testing.
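For example, a hypothetical test that needs root privileges, must not run in parallel with other tests, and should only be run in the host and namespaces flavors could have a .desc file like this:

```
{ "flavor": "h ns", "flags": "suid excl" }
```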