Tempest Guide to Scenario tests
===============================


What are these tests?
---------------------

Scenario tests are "through path" tests of OpenStack
function: complicated setups in which one part may depend on the
completion of a previous part. They ideally involve the integration
between multiple OpenStack services to exercise the touch points
between them.

An example would be: start with a blank environment, upload a glance
image, deploy a vm from it, ssh to the guest, make changes, capture
that vm's image back into glance as a snapshot, and launch a second vm
from that snapshot.
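
Sketched with the official python client libraries, the first half of
that flow might look roughly like the following. This is a minimal
sketch, not Tempest's actual helper code: it assumes authenticated
``glance`` and ``nova`` client objects (see the sketch under "Scope of
these tests" below), and the image file, names, and flavor are
illustrative::

    import time

    # Assumes authenticated ``glance`` and ``nova`` clients; the image
    # file, names, and flavor here are purely illustrative.
    image = glance.images.create(name='scenario-base',
                                 disk_format='qcow2',
                                 container_format='bare',
                                 data=open('cirros.qcow2', 'rb'))

    server = nova.servers.create(name='scenario-server-1',
                                 image=image.id,
                                 flavor=nova.flavors.find(name='m1.tiny'))

    # Booting is asynchronous; wait for the guest to become ACTIVE
    # before trying to ssh to it.
    while nova.servers.get(server.id).status != 'ACTIVE':
        time.sleep(1)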


Why are these tests in tempest?
-------------------------------
This is one of Tempest's core purposes: testing the integration
between projects.


Scope of these tests
--------------------
Scenario tests should always test at least 2 services in
interaction. They should use the official python client libraries for
OpenStack, as they provide a more realistic picture of how people
will interact with the services.
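
For example, a test might obtain its clients along these lines. This
is a minimal sketch with placeholder credentials and endpoints; a real
Tempest test gets these values from the tempest configuration rather
than hardcoding them::

    import glanceclient
    import novaclient.client

    # Placeholder credentials and endpoints, for illustration only.
    nova = novaclient.client.Client('2', 'demo', 'secret', 'demo',
                                    'http://keystone:5000/v2.0')
    nova.client.authenticate()

    # Reuse the keystone token nova obtained to talk to glance.
    glance = glanceclient.Client('1', 'http://glance:9292',
                                 token=nova.client.auth_token)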

TODO: once we have service tags, tests should be tagged with which
services they exercise.


Example of a good test
----------------------
While we are looking for the interaction of 2 or more services, be
specific in your interactions. A giant "this is my data center" smoke
test is hard to debug when it goes wrong.

A flow of interactions between glance and nova, like the one in the
introduction, is a good example, especially if it involves a repeated
interaction in which a resource is set up, modified, detached, and
then reused again later.
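
Continuing the sketch from the introduction, the snapshot-and-reuse
half of such a flow might look roughly like this (again a sketch, not
Tempest's actual code, assuming the ``nova`` client and the ``server``
booted in the earlier sketch)::

    import time

    # Capture the modified guest back into glance as a snapshot;
    # create_image returns the new image's id.
    snapshot_id = nova.servers.create_image(server, 'scenario-snapshot')

    # Snapshotting is asynchronous; wait until the image is usable.
    while nova.images.get(snapshot_id).status != 'ACTIVE':
        time.sleep(1)

    # Reuse the snapshot: boot a second server from it.
    server2 = nova.servers.create(name='scenario-server-2',
                                  image=snapshot_id,
                                  flavor=nova.flavors.find(name='m1.tiny'))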