We have been in the software quality assurance field for over 7 years and understand every aspect of this business. We have grown into a team of more than 120 engineers who work with a wide variety of technology solutions and help improve software quality for our clients, from startups to Fortune 500 companies.

If you haven’t worked with a third-party quality assurance partner before, you might have questions about the deliverables of such cooperation: what will the test reports look like, and what information will they include? These are valid questions, and in this blog post I will walk you through our manual and automated test report examples. From my experience, the success of software product development very often depends on the quality of the test reports provided.

But before getting into the details of specific reports, I would like to mention that our engineers can work with a wide variety of issue tracking software and will gladly track bug information in a system of the client’s choice, in whatever way is convenient. If there is no such system in place, we can help implement one or simply produce test reports of our own.

Manual testing report example

Manual test engineers are tasked with checking all the essential features of the tested application and compiling the test report in a professional format. Our engineers are ISTQB certified, so they know what should be included in a bug report and how it should be formatted. If there is no issue tracking software in place, we use simple spreadsheets to provide our clients with all the necessary information.

For test management purposes, spreadsheet-type test reports include the following tabs:

  • Detailed project information
  • Fault list
  • Summary

Detailed project information

The first tab contains detailed project information: general information about the project and label explanations (in this case, a device list and a legend).

The device list usually consists of multiple columns with information about the device manufacturer, device model, OS version, screen resolution, and screen size. To provide high-quality testing services, at TestDevLab we stick to real devices; we currently have more than 600 devices for testing purposes (including phones, tablets, PCs, VR headsets, etc.).

TestDevLab manual test report example; device list.

The legend shows more detailed information and predefined criteria, such as segments, fault types, fault priorities, and build versions (of the tested software) used in manual tests. The features of the tested application are split into multiple segments, so the QA engineer can clearly indicate in which part of the app an issue has been found. To make the test report more comprehensible for the client and easier for our engineers to use, we have predefined fault types and priorities, as shown in the image below.

TestDevLab manual test report example; detailed information.

Fault list

The second tab is the fault list, which contains all the faults discovered during testing. Each row holds information about the device the test was performed on, the date, build, segment, priority, OS, and the test results. The test results include pre-conditions, steps, actual results, expected results, a link to media demonstrating the issue (screenshots or video), and comments. This is the part of the report that will matter most to developers in their work.

Summary

This tab contains a graphical visualization of the results: faults by type, priority, segment, and screen size proportion. In short, it shows the overall quality of the tested app. This part is usually presented to client representatives, because visualized results are often far easier to grasp than a huge spreadsheet of raw data.
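To illustrate how the summary numbers relate to the fault list, here is a minimal Python sketch that aggregates fault-list rows into the kind of counts a Summary tab would chart. The column names and sample faults are hypothetical, not taken from a real TestDevLab report:

```python
from collections import Counter

# Hypothetical fault-list rows (in a real report these come from the fault list tab)
faults = [
    {"type": "Functional", "priority": "High",   "segment": "Login"},
    {"type": "UI",         "priority": "Low",    "segment": "Settings"},
    {"type": "Functional", "priority": "Medium", "segment": "Login"},
    {"type": "Crash",      "priority": "High",   "segment": "Checkout"},
]

# Aggregate the counts that would feed the summary charts
by_type = Counter(f["type"] for f in faults)
by_priority = Counter(f["priority"] for f in faults)
by_segment = Counter(f["segment"] for f in faults)

print(by_type)      # e.g. Counter({'Functional': 2, 'UI': 1, 'Crash': 1})
print(by_priority)
print(by_segment)
```

The same idea scales to any set of fault attributes; each `Counter` corresponds to one chart on the Summary tab.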

Spreadsheet-type reports are also used for other testing solutions, for example, our mobile app battery & data usage testing solution.

Keep in mind that the sample is intended to help you understand the idea behind our spreadsheet-type reports. The information in the actual report is customizable and will differ depending on the client’s needs and project specifics.

Mobile automation testing report example

One of the main goals of test automation is to decrease the time required for test execution and to improve the quality of testing by excluding the human factor from the process. The mobile automation testing report starts with a detailed schematic of the whole test automation process and its steps.

Parallelized mobile automation structure

As with manual testing, automated testing requires certain tools that allow us to keep track of all the information and create test reports. In the case of automated testing, spreadsheets don’t fit all the requirements, so we generate different reports. For test management there is an abundance of tools, both paid and free, such as JIRA, TestRail, qTest, Zephyr, and others. We can also help with both test management tool implementation and the software testing itself if this is entirely new to you.

For mobile test automation report generation, we use Appium (mobile test automation framework) and Cucumber (BDD tool for test structuring and report generation). Given that test automation can be performed on one or multiple devices simultaneously, we can divide our mobile test automation reports into two separate groups:

  • Single-device testing report
  • Multi-device testing report

Single-device testing report

We use single-device HTML reports very rarely, because tests are usually automated on multiple devices simultaneously. Nevertheless, if such testing is needed, we prepare a single-device HTML report. The report contains general information about the test run (number of executed scenarios, test steps, duration of testing, etc.) and detailed information about each executed scenario and its individual test steps. A report preview is shown below.

TestDevLab mobile automation testing example; single-device HTML report preview. The example test report shows tests for two mobile application features, bus and train schedules. In total, the example includes 2 features with 2 scenarios; both have been executed and 1 of them has failed.
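The headline numbers in such a report can be derived from Cucumber's machine-readable output. Below is a Python sketch that tallies a heavily simplified stand-in for Cucumber's JSON report format (real reports contain many more fields per feature, scenario, and step), mirroring the two-feature example above:

```python
import json

# Simplified stand-in for Cucumber's JSON output; mirrors the example above:
# 2 features, 2 scenarios, 1 failed.
report = json.loads("""
[
  {"name": "Bus schedule", "elements": [
    {"name": "View today's buses", "steps": [
      {"result": {"status": "passed"}},
      {"result": {"status": "passed"}}
    ]}
  ]},
  {"name": "Train schedule", "elements": [
    {"name": "View today's trains", "steps": [
      {"result": {"status": "passed"}},
      {"result": {"status": "failed"}}
    ]}
  ]}
]
""")

# Flatten features -> scenarios -> steps, then count failures
scenarios = [s for feature in report for s in feature["elements"]]
steps = [step for s in scenarios for step in s["steps"]]
failed = [s for s in scenarios
          if any(step["result"]["status"] == "failed" for step in s["steps"])]

print(f"{len(report)} features, {len(scenarios)} scenarios, "
      f"{len(steps)} steps, {len(failed)} failed")
# prints "2 features, 2 scenarios, 4 steps, 1 failed"
```

The HTML report presents exactly these aggregates up front, then drills down into each scenario and step.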

Multi-device testing report

Compared to the preceding example, multi-device automated testing and reporting is more complicated, because it involves parallelizing tests across multiple devices. Parallelization is achieved by dividing the test flow into threads, each of which triggers Cucumber tests on a specific device and generates reports in its own directory. After that, depending on the configuration, we either get an individual HTML report for each device or, for example, JSON reports that can be combined into a single HTML report for clarity and better understanding. To create such a combined report we use additional tools, such as the Jenkins “Cucumber Reports” plugin. The multi-device single HTML report provides detailed test statistics, which include:

  • General project information (project name, number, date, etc.)
  • Overall feature statistics (list of features/test cases, the status of each test case and scenario, total test duration, etc.)
  • Detailed feature report for each of the test cases (includes an attachment with the corresponding screenshot)

A preview of the detailed feature report is shown below.

TestDevLab mobile automation testing example; multi-device report overview with general data.
TestDevLab mobile automation testing example; multi-device report, detailed feature report with an attached screenshot.
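The thread-per-device flow described above can be sketched in a few lines of Python. The device names and the `run_suite` stub are hypothetical placeholders; a real implementation would start Cucumber tests through Appium on each device and write a JSON report to a device-specific directory before merging:

```python
import concurrent.futures

DEVICES = ["Pixel 7", "Galaxy S23", "iPhone 14"]  # hypothetical device pool

def run_suite(device):
    """Stand-in for triggering Cucumber tests on one device.
    A real thread would run the tests via Appium and write a JSON
    report to its own directory; here we just return a summary."""
    failed = 1 if device == "Galaxy S23" else 0  # pretend one device hits a failure
    return {"device": device, "scenarios": 2, "failed": failed}

# One thread per device, mirroring the parallelized test flow
with concurrent.futures.ThreadPoolExecutor(max_workers=len(DEVICES)) as pool:
    results = list(pool.map(run_suite, DEVICES))

# Merge the per-device reports into a single combined summary
combined = {
    "devices": len(results),
    "scenarios": sum(r["scenarios"] for r in results),
    "failed": sum(r["failed"] for r in results),
}
print(combined)  # {'devices': 3, 'scenarios': 6, 'failed': 1}
```

The merge step plays the role of the Jenkins “Cucumber Reports” plugin here: it collapses per-device output into one overview, while the per-device results remain available for drill-down.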

Every project is different, and our extensive experience with testing solutions big and small, across different business verticals, allows us to craft test reports that precisely suit the needs of our clients. We can also help with setting up the whole test process and implementing testing and reporting tools.

Contact us for more information on how we can help in your specific case!