Tempest Coding Guide
====================

- Step 1: Read the OpenStack Style Commandments
  http://docs.openstack.org/developer/hacking/
- Step 2: Read on

Tempest Specific Commandments
------------------------------

- [T102] Cannot import OpenStack python clients in tempest/api &
  tempest/scenario tests
- [T104] Scenario tests require a services decorator
- [T105] Tests cannot use setUpClass/tearDownClass
- [T106] vim configuration should not be kept in source files
- [T107] Check that a service tag isn't in the module path
- [T108] Check no hyphen at the end of rand_name() argument
- [T109] Cannot use testtools.skip decorator; instead use
  decorators.skip_because from tempest-lib
- [T110] Check that service client names of GET should be consistent
- [N322] Method's default argument shouldn't be mutable
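
For instance, the N322 rule guards against Python's shared-mutable-default
pitfall; a minimal illustration (the function names here are made up):

```python
# Violates N322: the default list is created once at function
# definition time and shared by every call that uses the default.
def collect_bad(item, bucket=[]):
    bucket.append(item)
    return bucket


# Compliant: use None as a sentinel and build a fresh list per call.
def collect_good(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket
```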

Test Data/Configuration
-----------------------

- Assume nothing about existing test data
- Tests should be self contained (provide their own data)
- Clean up test data at the completion of each test
- Use configuration files for values that will vary by environment
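
As a sketch of the last point (using stdlib ``configparser`` so the example is
self-contained; Tempest itself defines its options via oslo.config in
``tempest/config.py``, and the section/option names below are only examples):

```python
import configparser

# Environment-specific values live in a config file, not in the test.
SAMPLE_CONF = """
[compute]
image_ref = 155d900f-4e14-4e4c-a73d-069cbf4541e6
flavor_ref = 42
"""

parser = configparser.ConfigParser()
parser.read_string(SAMPLE_CONF)
image_ref = parser.get('compute', 'image_ref')
flavor_ref = parser.get('compute', 'flavor_ref')
```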


Exception Handling
------------------
According to ``The Zen of Python``:
``Errors should never pass silently.``
Tempest usually runs in a special environment (Jenkins gate jobs); in every
error or failure situation we should provide as much error-related
information as possible, because we usually do not have the chance to
investigate the situation after the issue has happened.

In every test case abnormal situations must be explained very verbosely,
by the exception and the log.

In most cases the very first issue is the most important information.

Try to avoid using ``try`` blocks in the test cases, as both the ``except``
and ``finally`` blocks could replace the original exception
when the additional operations lead to another exception.

Simply letting an exception propagate is not a bad idea in a test case
at all.

Try to avoid using any exception handling construct which can hide the
error's origin.

If you really need to use a ``try`` block, please ensure the original
exception is at least logged. When the exception is logged you usually need
to ``raise`` the same or a different exception anyway.

Use of ``self.addCleanup`` is often a good way to avoid having to catch
exceptions and still ensure resources are correctly cleaned up if the
test fails part way through.
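
As a minimal sketch (plain ``unittest`` with stand-in methods rather than real
Tempest clients), the cleanup is registered immediately after the resource is
created:

```python
import unittest


class ServerTest(unittest.TestCase):
    """Sketch: register cleanup right after resource creation."""

    def _create_server(self):
        # Stand-in for a real API call that creates a resource.
        self.server_exists = True
        return 'server-id'

    def _delete_server(self, server_id):
        # Stand-in for the corresponding delete call.
        self.server_exists = False

    def test_server_lifecycle(self):
        server_id = self._create_server()
        # Registered immediately, so the delete runs even if a later
        # assertion or API call raises; cleanups run in reverse order
        # of registration.
        self.addCleanup(self._delete_server, server_id)
        self.assertEqual('server-id', server_id)
```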

Use the ``self.assert*`` methods provided by the unit test framework.
This signals the failures early on.

Avoid using ``self.fail`` alone: its stack trace will signal
the ``self.fail`` line as the origin of the error.

Avoid constructing complex boolean expressions for assertion.
``self.assertTrue`` or ``self.assertFalse`` without a ``msg`` argument
will just tell you the single boolean value, and you will not know anything
about the values used in the formula; the ``msg`` argument might be good
enough for providing more information.

Most other assert methods can include more information by default.
For example ``self.assertIn`` can include the whole set.
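
A small illustration of the difference (plain ``unittest``; the flavor names
are made up):

```python
import unittest


class AssertionStyleTest(unittest.TestCase):
    def test_membership(self):
        flavors = ['m1.tiny', 'm1.small']
        # A failure here would only report "False is not true":
        #     self.assertTrue('m1.large' in flavors)
        # A failure here reports both the missing item and the whole
        # list, which is far more useful when debugging a gate run:
        self.assertIn('m1.tiny', flavors)
```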

It is recommended to use the testtools `matcher`_ framework for the more
tricky assertions. You can implement your own specific `matcher`_ as well.

.. _matcher: http://testtools.readthedocs.org/en/latest/for-test-authors.html#matchers

If the test case fails you can see the related logs and the information
carried by the exception (exception class, backtrace and exception info).
This and the service logs are your only guide to finding the root cause of
flaky issues.
Attila Fazekas | 10fd63d | 2013-07-04 18:38:21 +0200 | [diff] [blame] | 87 | |
Attila Fazekas | 7899d31 | 2013-08-16 09:18:17 +0200 | [diff] [blame] | 88 | Test cases are independent |
| 89 | -------------------------- |
| 90 | Every ``test_method`` must be callable individually and MUST NOT depends on, |
| 91 | any other ``test_method`` or ``test_method`` ordering. |
| 92 | |
| 93 | Test cases MAY depend on commonly initialized resources/facilities, like |
| 94 | credentials management, testresources and so on. These facilities, MUST be able |
Mithil Arun | be067ec | 2014-11-05 15:58:50 +0530 | [diff] [blame] | 95 | to work even if just one ``test_method`` is selected for execution. |

Service Tagging
---------------
Service tagging is used to specify which services are exercised by a
particular test method. You specify the services with the
``tempest.test.services`` decorator. For example::

    @services('compute', 'image')

Valid service tag names are the same as the list of directories in
tempest.api that have tests.
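
A minimal stand-in shows how such a decorator attaches the tags as metadata on
the test method (an illustrative sketch, not Tempest's actual implementation):

```python
def services(*service_list):
    """Record which services a test exercises as an attribute on the
    test method (stand-in for tempest.test.services)."""
    def decorator(func):
        func._services = service_list
        return func
    return decorator


class ImagesTest:
    @services('compute', 'image')
    def test_create_image_from_server(self):
        # The test body would exercise both nova and glance.
        pass
```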

For scenario tests having a service tag is required. For the api tests,
service tags are only needed if the test method makes an api call (either
directly or indirectly through another service) that differs from the parent
directory name. For example, any test in tempest.api.compute that makes an
api call to a service other than nova would require a service tag for that
service, but it does not need to be tagged as compute.

Test fixtures and resources
---------------------------
Test level resources should be cleaned up after the test execution. Clean-up
is best scheduled using ``addCleanup``, which ensures that the resource
cleanup code is always invoked, and in reverse order with respect to the
creation order.

Test class level resources should be defined in the ``resource_setup``
method of the test class, except for any credential obtained from the
credentials provider, which should be set up in the ``setup_credentials``
method.

The test base class ``BaseTestCase`` defines the Tempest framework for class
level fixtures. ``setUpClass`` and ``tearDownClass`` are defined here and
cannot be overwritten by subclasses (enforced via hacking rule T105).

Set-up is split into a series of steps (setup stages), which can be
overwritten by test classes. Set-up stages are:

- ``skip_checks``
- ``setup_credentials``
- ``setup_clients``
- ``resource_setup``

Tear-down is also split into a series of steps (teardown stages), which are
stacked for execution only if the corresponding setup stage had been
reached during the setup phase. Tear-down stages are:

- ``clear_credentials`` (defined in the base test class)
- ``resource_cleanup``
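
The staged setup can be sketched roughly like this (a simplified model for
illustration only, not the real ``BaseTestCase``):

```python
class BaseTestCase:
    """Simplified sketch: subclasses override the stage hooks below,
    never setUpClass/tearDownClass themselves (hacking rule T105)."""

    @classmethod
    def setUpClass(cls):
        # The stages always run in this fixed order.
        cls.skip_checks()
        cls.setup_credentials()
        cls.setup_clients()
        cls.resource_setup()

    @classmethod
    def skip_checks(cls):
        pass

    @classmethod
    def setup_credentials(cls):
        pass

    @classmethod
    def setup_clients(cls):
        pass

    @classmethod
    def resource_setup(cls):
        pass


class VolumesTest(BaseTestCase):
    """Hypothetical subclass overriding two of the stages."""
    stages_run = []

    @classmethod
    def setup_credentials(cls):
        cls.stages_run.append('setup_credentials')

    @classmethod
    def resource_setup(cls):
        cls.stages_run.append('resource_setup')
```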

Skipping Tests
--------------
Skipping tests should be based on configuration only. If that is not
possible, it is likely that either a configuration flag is missing, or the
test should fail rather than be skipped.
Using discovery for skipping tests is generally discouraged.
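
A configuration-driven skip can be sketched like this (a plain boolean stands
in for what would really be a Tempest config option, e.g. something under a
``*_feature_enabled`` config section):

```python
import unittest

# Stand-in for a value read from the Tempest configuration file.
SNAPSHOT_FEATURE_ENABLED = False


class SnapshotTest(unittest.TestCase):
    def test_create_snapshot(self):
        if not SNAPSHOT_FEATURE_ENABLED:
            raise unittest.SkipTest(
                'snapshot feature disabled in test configuration')
        # ... exercise the feature here ...
```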

When running a test that requires a certain "feature" in the target
cloud, if that feature is missing we should fail, because either the test
configuration is invalid, or the cloud is broken and the expected "feature"
is not there even though the cloud was configured with it.

Negative Tests
--------------
Newly added negative tests should use the negative test framework. The first
step is to create an interface description in a python file under
``tempest/api_schema/request/``. These descriptions consist of two important
sections for the test (one of them is mandatory):

- A resource (part of the URL of the request): resources needed for a test
  must be created in ``setUpClass`` and registered with ``set_resource``,
  e.g.: ``cls.set_resource("server", server['id'])``

- A json schema: defines properties for a request.

After that a test class must be added to automatically generate test
scenarios out of the given interface description::

    load_tests = test.NegativeAutoTest.load_tests

    @test.SimpleNegativeAutoTest
    class SampleTestNegativeTestJSON(<your base class>, test.NegativeAutoTest):
        _service = 'compute'
        _schema = <your schema file>

The class decorator ``SimpleNegativeAutoTest`` will automatically generate
test cases out of the given schema in the attribute ``_schema``.

All negative tests should be added into a separate negative test file.
If such a file doesn't exist for the particular resource being tested, a new
test file should be added.

Test skips because of Known Bugs
--------------------------------

If a test is broken because of a bug, it is appropriate to skip the test
until that bug has been fixed. You should use the ``skip_because`` decorator
so that Tempest's skip tracking tool can watch the bug status.

Example::

    @skip_because(bug="980688")
    def test_this_and_that(self):
        ...

Guidelines
----------
- Do not submit changesets containing only test cases which are skipped, as
  they will not be merged.
- Consistently check the status code of responses in test cases. The
  earlier a problem is detected the easier it is to debug, especially
  where there is complicated setup required.

Parallel Test Execution
-----------------------
Tempest by default runs its tests in parallel. This creates the possibility
of interesting interactions between tests which can cause unexpected
failures. Dynamic credentials provide protection from most of the potential
race conditions between tests outside the same class. But there are still a
few things to watch out for to try to avoid issues when running your tests
in parallel.

- Resources outside of a tenant scope still have the potential to conflict.
  This is a larger concern for the admin tests since most resources and
  actions that require admin privileges are outside of tenants.

- Races between methods in the same class are not a problem because
  parallelization in tempest is at the test class level, but if there is a
  json and xml version of the same test class there could still be a race
  between methods.

- The rand_name() function from tempest.common.utils.data_utils should be
  used anywhere a resource is created with a name. Static naming should be
  avoided to prevent resource conflicts.
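
A stand-in for ``rand_name`` illustrates the idea (the real helper lives in
``tempest.common.utils.data_utils``; this sketch only mimics its behavior):

```python
import uuid


def rand_name(name=''):
    """Append a random suffix so parallel workers never collide on
    resource names (stand-in for data_utils.rand_name). Note hacking
    rule T108: don't end the argument itself with a hyphen, since the
    helper adds its own separator."""
    suffix = uuid.uuid4().hex[:8]
    return '%s-%s' % (name, suffix) if name else suffix


server_name = rand_name('tempest-server')
```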

- If the execution of a set of tests is required to be serialized then
  locking can be used to perform this. See AggregatesAdminTest in
  tempest.api.compute.admin for an example of using locking.

Stress Tests in Tempest
-----------------------
Any tempest test case can be flagged as a stress test. With this flag it
will be automatically discovered and used in the stress test runs. The
stress test framework itself is a facility to spawn and control worker
processes in order to find race conditions (see ``tempest/stress/`` for
more information). Please note that these stress tests can't be used for
benchmarking purposes since they don't measure any performance
characteristics.

Example::

    @stresstest(class_setup_per='process')
    def test_this_and_that(self):
        ...

This will flag the test ``test_this_and_that`` as a stress test. The
parameter ``class_setup_per`` controls when the setUpClass function should
be called.

Good candidates for stress tests are:

- Scenario tests
- API tests that have a wide focus

Sample Configuration File
-------------------------
The sample config file is autogenerated using a script. If any changes are
made to the config variables in tempest/config.py then the sample config
file must be regenerated. This can be done by running::

    tox -egenconfig

Unit Tests
----------
Unit tests are a separate class of tests in tempest. They verify tempest
itself, and thus have a different set of guidelines around them:

1. They cannot require anything running externally; all you should need to
   run the unit tests is the git tree, python and the dependencies
   installed. This rules out depending on running services, a config
   file, etc.

2. The unit tests cannot use setUpClass; instead fixtures and testresources
   should be used for shared state between tests.
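
A per-test fixture pattern (sketched here with stdlib ``tempfile`` and
``addCleanup``; Tempest's unit tests typically use the ``fixtures`` library
for the same purpose) keeps shared state out of ``setUpClass``:

```python
import os
import shutil
import tempfile
import unittest


class ConfigParsingTest(unittest.TestCase):
    """Hypothetical unit test: state is built per test in setUp and
    torn down via addCleanup, so no setUpClass is needed."""

    def setUp(self):
        super().setUp()
        # Each test gets its own scratch directory, removed afterwards.
        self.tmp_dir = tempfile.mkdtemp()
        self.addCleanup(shutil.rmtree, self.tmp_dir)

    def test_write_and_read(self):
        path = os.path.join(self.tmp_dir, 'sample.conf')
        with open(path, 'w') as f:
            f.write('[DEFAULT]\n')
        with open(path) as f:
            self.assertEqual('[DEFAULT]\n', f.read())
```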


.. _TestDocumentation:

Test Documentation
------------------
For tests being added we require inline documentation in the form of
docstrings to explain what is being tested. In API tests for a new API, a
class level docstring should be added to an API reference doc. If one
doesn't exist, a TODO comment should be put in indicating that the reference
needs to be added. For individual API test cases a method level docstring
should be used to explain the functionality being tested if the test name
isn't descriptive enough. For example::

    def test_get_role_by_id(self):
        """Get a role by its id."""

the docstring there is superfluous and shouldn't be added. But for a method
like::

    def test_volume_backup_create_get_detailed_list_restore_delete(self):
        pass

a docstring would be useful because while the test title is fairly
descriptive the operations being performed are complex enough that a bit
more explanation will help people figure out the intent of the test.

For scenario tests a class level docstring describing the steps in the
scenario is required. If there is more than one test case in the class,
individual docstrings for the workflow in each test method can be used
instead. A good example of this would be::

    class TestVolumeBootPattern(manager.ScenarioTest):
        """
        This test case attempts to reproduce the following steps:

        * Create in Cinder some bootable volume importing a Glance image
        * Boot an instance from the bootable volume
        * Write content to the volume
        * Delete an instance and Boot a new instance from the volume
        * Check written content in the instance
        * Create a volume snapshot while the instance is running
        * Boot an additional instance from the new snapshot based volume
        * Check written content in the instance booted from snapshot
        """

Test Identification with Idempotent ID
--------------------------------------

Every function that provides a test must have an ``idempotent_id`` decorator
that is a unique ``uuid-4`` instance. This ID is used to complement the
fully qualified test name and track test functionality through refactoring.
The format of the metadata looks like::

    @test.idempotent_id('585e934c-448e-43c4-acbf-d06a9b899997')
    def test_list_servers_with_detail(self):
        # The created server should be in the detailed list of all servers
        ...

Tempest includes a ``check_uuid.py`` tool that will test for the existence
and uniqueness of idempotent_id metadata for every test. By default the
tool runs against the Tempest package by calling::

    python check_uuid.py

It can be invoked against any test suite by passing a package name::

    python check_uuid.py --package <package_name>

Tests without an ``idempotent_id`` can be automatically fixed by running
the command with the ``--fix`` flag, which will modify the source package
by inserting randomly generated uuids for every test that does not have
one::

    python check_uuid.py --fix

The ``check_uuid.py`` tool is used as part of the tempest gate job
to ensure that all tests have an ``idempotent_id`` decorator.

Branchless Tempest Considerations
---------------------------------

Starting with the OpenStack Icehouse release Tempest no longer has any
stable branches. This is to better ensure API consistency between releases,
because the API behavior should not change between releases. This means
that the stable branches are also gated by the Tempest master branch, which
in turn means that proposed commits to Tempest must work against both the
master and all the currently supported stable branches of the projects. As
such there are a few special considerations that have to be accounted for
when pushing new changes to tempest.

1. New Tests for new features
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

When adding tests for new features that were not in previous releases of
the projects, the new test has to be properly skipped with a feature flag.
This can be as simple as using the @test.requires_ext() decorator to check
if the required extension (or discoverable optional API) is enabled, or it
may mean adding a new config option to the appropriate section. If there
isn't a method of selecting the new **feature** from the config file then
there won't be a mechanism to disable the test with older stable releases
and the new test won't be able to merge.
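
A simplified stand-in shows the shape of such a guard (the real decorator is
``tempest.test.requires_ext``; the extension lookup here is deliberately
trivial and the extension names are made up):

```python
import functools
import unittest

# Stand-in for the list of enabled extensions read from the config file.
ENABLED_EXTENSIONS = ['os-volumes']


def requires_ext(extension, service):
    """Skip the decorated test unless the named extension is enabled."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(self, *args, **kwargs):
            if extension not in ENABLED_EXTENSIONS:
                raise unittest.SkipTest(
                    '%s extension of %s is not enabled' %
                    (extension, service))
            return func(self, *args, **kwargs)
        return wrapper
    return decorator


class VolumesExtensionTest(unittest.TestCase):
    @requires_ext(extension='os-volumes', service='compute')
    def test_list_volumes(self):
        self.assertTrue(True)

    @requires_ext(extension='os-shiny-new-api', service='compute')
    def test_shiny_new_api(self):
        # Skipped on clouds where the hypothetical extension is absent.
        self.assertTrue(True)
```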

2. Bug fix on core project needing Tempest changes
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

When trying to land a bug fix which changes a tested API you'll have to use
the following procedure::

    - Propose change to the project, get a +2 on the change even with
      Tempest failing
    - Propose skip on Tempest which will only be approved after the
      corresponding change in the project has a +2 on change
    - Land project change in master and all open stable branches (if required)
    - Land changed test in Tempest

Otherwise the bug fix won't be able to land in the project.

3. New Tests for existing features
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

If a test is being added for a feature that exists in all the current
releases of the projects then the only concern is that the API behavior is
the same across all the versions of the project being tested. If the
behavior is not consistent the test will not be able to merge.

API Stability
-------------

For new tests being added to Tempest the assumption is that the API being
tested is considered stable and adheres to the OpenStack API stability
guidelines. If an API is still considered experimental or in development
then it should not be tested by Tempest until it is considered stable.