Show JUnit test results in the MR widget - Design
Design Proposal
Background Information:
JUnit XML Output Schema: http://llg.cubic.org/docs/junit/
Example JUnit XML output of a failed test case:
```xml
<?xml version="1.0" encoding="UTF-8" ?>
<testsuite tests="3" failures="2" name="SampleTest" time="0.025" errors="0" skipped="0">
  <testcase classname="SampleTest" name="testOne" time="0.025">
    <failure message="expected:<3> but was:<5>" type="java.lang.AssertionError">java.lang.AssertionError: expected:<3> but was:<5>
      at org.junit.Assert.fail(Assert.java:88)
      at org.junit.Assert.failNotEquals(Assert.java:834)
      at org.junit.Assert.assertEquals(Assert.java:645)
      at org.junit.Assert.assertEquals(Assert.java:631)
      at SampleTest.testOne(SampleTest.java:12)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
      at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
      at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
      at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
      at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
      at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
      at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
      at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
      at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
      at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
      at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
      at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
      at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
      at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
      at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
      at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
      at java.lang.reflect.Method.invoke(Method.java:497)
      at org.apache.maven.surefire.util.ReflectionUtils.invokeMethodWithArray(ReflectionUtils.java:189)
      at org.apache.maven.surefire.booter.ProviderFactory$ProviderProxy.invoke(ProviderFactory.java:165)
      at org.apache.maven.surefire.booter.ProviderFactory.invokeProvider(ProviderFactory.java:85)
      at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:115)
      at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:75)
    </failure>
    <system-out>
      Currency: USD
      Country: USA
    </system-out>
    <system-err>
      Sample error output.
      Reason for error output.
    </system-err>
  </testcase>
</testsuite>
```
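For reference, here is a minimal sketch (not GitLab's actual implementation) of how the summary and per-test failure details could be extracted from such a report, using Python's standard-library `xml.etree.ElementTree`. The helper name and the report path are hypothetical:

```python
import xml.etree.ElementTree as ET

def parse_junit_report(path):
    """Hypothetical helper: pull the suite summary and per-test failure
    details out of a JUnit XML report, for illustration only."""
    root = ET.parse(path).getroot()
    # Reports may use a <testsuites> wrapper around one or more <testsuite> elements.
    suites = [root] if root.tag == "testsuite" else root.findall("testsuite")
    total = failed = 0
    failures = []
    for suite in suites:
        total += int(suite.get("tests", "0"))
        failed += int(suite.get("failures", "0")) + int(suite.get("errors", "0"))
        for case in suite.iter("testcase"):
            failure = case.find("failure")
            if failure is None:
                continue  # only failed tests appear in the widget
            stdout = case.find("system-out")
            stderr = case.find("system-err")
            failures.append({
                "classname": case.get("classname"),
                "name": case.get("name"),
                "time": case.get("time"),
                "type": failure.get("type"),
                "message": failure.get("message"),
                "stack_trace": (failure.text or "").strip(),
                "system_out": stdout.text if stdout is not None else None,
                "system_err": stderr.text if stderr is not None else None,
            })
    return total, failed, failures

total, failed, failures = parse_junit_report("TEST-SampleTest.xml")
print(f"Test summary: {failed} out of {total} tests failed")
for case in failures:
    print(f"{case['name']} failed in {case['classname']}.java")
```

Run against the report above, this would print `Test summary: 2 out of 3 tests failed` followed by one line per failed test.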
Feature Specs:
There is an existing mockup for this feature from the product vision demo, which I used as a reference for my proposal: https://framer.cloud/UaofH/index.html. It is linked from this issue: gitlab-com/www-gitlab-com#1802 (closed).
- Show a test summary of X tests failed out of a total of Y tests. These are only the tests that failed in the MR. The summary reads the `testsuite` data from the XML file; for the XML above, it would read: "Test summary: 2 out of 3 tests failed".
- Within the widget, each failed test from the MR should display the test file name and the specific test within that file that failed. In the example above, `testOne` failed in `SampleTest.java`.
- Allow users to hyperlink to the specific file/line in the JUnit test file where the test failure occurred.
- Allow users to dive deeper (possibly in a pop-up modal or an expandable MR widget) to view the runtime, the failure stack trace (containing the failure type and message), and the standard output and standard error for each test.
- When a test fails, show the first job the test failed in (with no success following it) and a direct link to that job; a sketch of this lookup follows this list. This helps users see which commit might be responsible for the failed unit test. Some of the interest in this feature is discussed in the comments on https://gitlab.com/gitlab-org/gitlab-ce/issues/17081. It doesn't show the full build history for each test (although we should consider how we could iterate toward that in the future if needed), but the immediate need to see when the test started failing is important.
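The "first failed job" lookup could work roughly as follows. This is a sketch under the assumption that a test's job results can be obtained in chronological order; the function name and data shape are hypothetical:

```python
def first_failing_job(job_results):
    """Return the job that started the current failure streak, i.e. the
    earliest failure with no success after it (None if currently passing).

    job_results: chronologically ordered list of (job_id, passed) pairs
    for a single test. Hypothetical data shape, for illustration only.
    """
    streak_start = None
    for job_id, passed in job_results:
        if passed:
            streak_start = None  # a later success resets the streak
        elif streak_start is None:
            streak_start = job_id  # first failure of a new streak
    return streak_start

# Example: the test broke in job 102 and has not passed since,
# so job 102 is the one the widget would link to.
history = [("job-100", True), ("job-101", True),
           ("job-102", False), ("job-103", False)]
assert first_failing_job(history) == "job-102"
```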
The Test Scenarios We Capture:
Consider a scenario where an MR is created to merge a feature branch `feature-branch` into `master` at some point, let's say commit `c0`.

- `head` is the HEAD pointer of `feature-branch`, pointing to the commit the pipeline has been running on (normally the latest one)
- `base` is the last common commit where the two branches diverged, `c0`; it could be the latest `master`, but that stops being true as soon as `master` is updated by other MRs

If we consider `head` the report we want to show and `base` the report we want to compare against, the list could look like this:
- a summary, showing how many tests failed out of the total and how many have been fixed
- (red) tests that failed on `head` but not on `base`
- (red) any other tests that failed on `head`
- (green) tests that failed on `base` but not on `head`; these are essentially tests from prior bugs that have been fixed on `head`

If there is no `base`, only the list of failed tests for `head` is shown (these mocks cover the scenario where we don't do any comparison on the diffs).
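The categorisation above boils down to set operations on the failed-test identifiers from the two reports. A minimal sketch, assuming each test is keyed by its classname and name (the helper is hypothetical):

```python
def compare_reports(head_failed, base_failed):
    """Bucket failed tests into the three categories listed above.

    head_failed / base_failed: sets of (classname, name) keys for the
    tests that failed on head and base. Hypothetical helper, for
    illustration only.
    """
    return {
        "new_failures": head_failed - base_failed,       # red: failed on head only
        "existing_failures": head_failed & base_failed,  # red: failed on both
        "resolved_failures": base_failed - head_failed,  # green: fixed on head
    }

head = {("SampleTest", "testOne"), ("SampleTest", "testTwo")}
base = {("SampleTest", "testTwo"), ("SampleTest", "testThree")}
buckets = compare_reports(head, base)
assert buckets["new_failures"] == {("SampleTest", "testOne")}
assert buckets["resolved_failures"] == {("SampleTest", "testThree")}
```

The summary counts follow directly from the bucket sizes; with no `base` report, everything in `head` simply lands in one failed-tests list.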
It's possible each scenario will be broken out visually into subsections, similar to how the Security Scanning report breaks out SAST, DAST, and Container & Dependency Scanning: gitlab-examples/security/security-reports!1 (closed)
Design
| MR widget | Modal dialog | Pipeline page |
| --- | --- | --- |