BDD Walkthrough
***************

This is a small adaptation of material by Giorgio Brajnik. For background on how BDD tests work, please read :doc:`bdd-test-context`.

* Identify an SKA requirement or Feature. This may be an existing requirement, such as an interface requirement, or, if you are working on system verification, you may create a new requirement, such as `VTS-221 <https://jira.skatelescope.com/browse/VTS-221/>`_.
   
   * If writing a new requirement, please label it with the PI (Program Increment) in which you plan to implement it. 

* Create a JIRA issue of type "Test Set" in the XTP project. 

   * (optional) Add a fix version corresponding to the relevant PI.
   * Link this issue to the requirement or Feature defined above, using the "tests" relationship; this can be done from the Test Set Create screen using the "link" field. (The link can also be made from the requirement/Feature side, in which case the relationship should be "tested by".)

* Create the tests for the Test Set.

   * Create an issue of type "Test" in the XTP project.
   * (optional) Add a fix version.
   * Click on the "Test Details" tab in the newly-created issue.

.. image:: images/bdd-test-details.png
  :alt: Create Issue dialog box, showing the Test Details tab.
  :align: center

Then provide the test details:

      * Test type: Cucumber
      * Cucumber type: scenario
      * Cucumber scenario: write your Gherkin (given, when, then) steps here (an example scenario is sketched after this list).

   * Link your test to the relevant Test Set or Test Sets. If you wish to link an existing test to a new Test Set, that's encouraged, and you can skip the test creation steps.
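
For illustration, a scenario entered in the "Cucumber scenario" field might look like the following. The step text is only a placeholder, chosen to match the ``pytest-bdd`` example later on this page; your own steps should describe the behaviour being verified.

.. code-block:: gherkin

   Scenario: Example scenario name
      Given I have an SDPSubarray device
      When Test step goes here
      Then Result step goes here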

* Once all the tests for the Test Set are defined, you can export the ``.feature`` file (a sketch of an exported file is shown after the screenshot below).

   * Find the relevant Test Set.
   * Go to the More dropdown menu.
   * Select Export to Cucumber from the menu. You'll need to do this for each Test Set you wish to exercise.

.. image:: images/export-to-cucumber.png
  :alt: XTP JIRA issue showing the More dropdown expanded
  :align: center
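
As a rough sketch, the exported ``.feature`` file has the standard Gherkin structure shown below; the Feature name and scenario are placeholders, and the file exported by JIRA/Xray may also carry extra tags and metadata. The path to this file is what you pass to ``scenarios()`` in the ``pytest-bdd`` example further down.

.. code-block:: gherkin

   Feature: Example Test Set

      Scenario: Example scenario name
         Given I have an SDPSubarray device
         When Test step goes here
         Then Result step goes here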

* Add the ``.feature`` file to the relevant GitLab repository. We recommend keeping it close to your tests: a dedicated directory for ``.feature`` files next to the test code keeps them easy to find without being confused with the tests themselves.
* Implement your tests using ``pytest-bdd``. 

   * Import ``pytest-bdd`` in your test module.
   * Define a pytest fixture. This creates an empty dictionary that is used to communicate data between steps.
   * Annotate the test case with the relevant scenario.
   * Write your test steps, annotating the functions with the Gherkin keywords. These step functions can be shared between tests (e.g. the same "given" step can serve several tests).

.. code-block:: python3

    # import the relevant libraries
    import pytest
    from pytest_bdd import given, parsers, scenarios, then, when

    # Load the scenarios from the .feature file; every scenario in the file
    # becomes a test. To bind a single named scenario to a specific test
    # function, use the @scenario(path, name) decorator instead.
    scenarios("path/to/file.feature")

    # You can create a pytest fixture to allow you to pass data between
    # steps via a dictionary.
    @pytest.fixture
    def result():
        return {}

    # Then write your test steps, annotating them appropriately.
    # target_fixture makes the returned device available to later steps.
    @given('I have an SDPSubarray device', target_fixture='subarray_device')
    def subarray_device(devices):
        # code to get a subarray device ('devices' and DEVICE_NAME come from
        # the test harness of the repository this example is based on)
        return devices.get_device(DEVICE_NAME)

    # Note that this given step can be reused by any test that needs an
    # SDP subarray device.

    @when('Test step goes here')
    def set_device_state(subarray_device, result):
        # more test code goes here: exercise the device and record anything
        # the "then" step needs in the result dictionary
        ...

    @then('Result step goes here')
    def check_result(result):
        # test the result of your when steps here
        ...

This code is loosely based on https://gitlab.com/ska-telescope/sdp/ska-sdp-lmc/-/blob/master/tests/test_subarray.py.

.. note::
   
   We strongly recommend using the JIRA integration only on repositories that do a lot of integration, such as skampi, and only on the main/master branch. If you like the BDD testing style, you can simply use ``pytest-bdd`` and get test outcomes as part of the usual CI/CD pipeline.