ZDTM API

This page describes the API one should use to write a new test in ZDTM.

Overview

The ZDTM test suite is an automated suite that launches individual subtests as processes, waits for them to prepare themselves, dumps the started process(es), restores them, and asks each test to check whether the restored state matches what the test considers correct. To write new subtests, a developer should use the API described below. Tests themselves live in one of the test/zdtm/ sub-directories. The full-cycle suite is the test/zdtm.py script.
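
For example, a single test or the whole suite can be run through that script. The test name static/env00 below is just an illustration, and the run subcommand with the -t and --all options is an assumption based on the current in-tree script:

./zdtm.py run -t zdtm/static/env00   # run a single test
./zdtm.py run --all                  # run the whole suite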

API for test itself

Test body

The test should clearly separate three stages:

Preparations
These are the actions the test performs before the process(es) get checkpointed.
Actions
During this stage the process(es) will be interrupted at an arbitrary place and dumped.
Checks
After restore, the test should verify that everything is OK and report the result.

This is achieved by using the following API calls:

test_init(argc, argv)
Initializes the test subsystem. After this call the preparations stage starts.
test_daemon()
Signals that preparations are finished and the test is ready to be dumped at any time, i.e. the actions stage begins.
test_go()
Returns whether the actions stage is still going on, i.e. whether the process has not yet been dumped and restored.
test_waitsig()
Blocks the task until it is restored. After this the checks stage starts.
pass()
Marks the test as PASSED.
fail(message)
Reports a test failure. The message argument is written into the logs.
err(message)
Marks the test as being unable to work at all.
test_msg(message)
Writes a message to the logs.


From the code perspective, a test looks like this:

#include "zdtmtst.h"

int main(int argc, char **argv)
{
	test_init(argc, argv);

	/* Preparations */

	test_daemon();

	/*
	 * Pick one of the two variants below. TEST_WANTS_TO_ACT is
	 * just an illustrative switch, not a real ZDTM macro.
	 */
#ifdef TEST_WANTS_TO_ACT
	/* Variant 1: the test acts and may get C/R-ed at an arbitrary moment */
	while (test_go()) {
		/* Actions go here */
	}
#else
	/* Variant 2: the test just waits for C/R to happen */
	/* Actions go here. */

	test_waitsig();
#endif

	/* Checks: test_passed() stands for whatever check the test performs */

	if (test_passed())
		pass();
	else
		fail("Something went wrong");

	return 0;
}

Options

Sometimes a test might need an option to work, e.g. a file or directory to work with, or some string to operate on. This is done by declaring options:

TEST_OPTION(name, type, description);

The name is the option's name. A variable of the same name should be declared, and a --$name CLI argument will be passed upon test start (the option is parsed by the test_init() call, so nothing more is needed to make it available).

The type is the option's data type. Currently supported types are string, bool, int, uint, long and ulong.

The description is arbitrary help text.
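
As a sketch of how this fits together, here is a hypothetical test declaring a string option. The option name dirname is made up, and the fourth "mandatory" argument mirrors what the in-tree tests pass, so treat the exact signature as an assumption:

#include "zdtmtst.h"

const char *test_doc = "Sketch: a test with a custom option";

/* a variable with the same name as the option must exist */
char *dirname;
TEST_OPTION(dirname, string, "directory to work in", 1);

int main(int argc, char **argv)
{
	test_init(argc, argv);	/* parses --dirname for us */

	test_msg("Working in %s\n", dirname);
	/* ... the usual prepare/act/check stages go here ... */

	return 0;
}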

Building and launching

The test should then be added to the Makefile. There are several lists to which simple tests can be added:

TST_NOFILE
Add your test to this list if it accepts no options.
TST_FILE
Tests in this list are started with the --filename $some_name option. The file is not created by the launcher; the test should take care of that itself.
TST_DIR
These are started with the --dirname $some_name option. The directory is not created by the launcher; the test should take care of that itself.
TST
These are tests with custom options. For these one should also declare the starting rule in the Makefile, like this:
test.pid: test
	$(<D)/$(<F) --pidfile=$@ --outfile=$<.out --OPTIONS-HERE
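
For the list-based cases no extra rule is needed; registering a test is a one-line Makefile change. A sketch, where env00 is a hypothetical test name and += is used purely for illustration:

# register env00 as a test that takes no options
TST_NOFILE += env00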

Checking that the test works

To make sure the test itself works, follow these steps:

make cleanout  # clean the directory from old stuff
make $test     # build the test
make $test.pid # start the test
make $test.out # stop the test and ask it for the "checks" stage and results

All logs from the test, including the pass/fail status, will be in the $test.out file.
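
A quick way to check the verdict is to grep the log; this assumes the in-tree pass() helper, which prints a PASS line:

grep PASS $test.out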

Test's .desc file

Put a ${test_name}.desc file next to the test's .c file to configure the test. The .desc file can have the following fields in JSON format (a sample file is shown after the list):

'flavor'
A space-separated list of 'h', 'ns' and 'uns' values that limits the test to host, namespace, or (namespace and user namespace) runs respectively.
'flags'
Space-separated values. suid: the test needs root privileges to run. noauto: the test is excluded from the --all run of zdtm.py. crfail: dump should fail for this test. nouser: the test cannot be checkpointed in user-mode. excl: the test should not be run in parallel with any other test. samens: no --join-ns testing.
'opts'
Any test-specific options for criu dump/restore actions go here.
'feature'
A feature that should be checked (via criu check --feature=$name) before running the test.
'arch'
The test checks something architecture-specific (regs/TLS/etc.) and should only be run on the specified architecture.
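
As an illustrative sketch (loosely modeled on the in-tree static/tun test; treat the exact values as assumptions), a .desc file might look like this:

{'flavor': 'ns uns', 'feature': 'tun'}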

See also

ZDTM Test Suite