Leveraging Analytics Data to Enhance Software Quality
As QA engineers, when we think of user telemetry monitoring or user data analysis, we often assume it's a job for software developers and IT administrators, right? Well, user data analysis can also be very useful when planning important work for the QA team. For example: which devices or OS versions should QA engineers test on? Which scenarios have the highest priority? Why was there a sudden change in user behavior that might point to a product quality issue? All of this and more can be answered with user data analysis.
The analytics data gathered differs between industries, but some basic data is usually tracked on all projects.
Basic User Device Information
This information is useful when planning which devices the software should be tested on. You should be able to check which device models, OS versions, and browser versions are used.
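Turning a device export into a test-device shortlist can be as simple as counting occurrences. Here is a minimal sketch, assuming telemetry records have been exported from your analytics tool into a list of dictionaries (the sample data and field names are hypothetical):

```python
from collections import Counter

# Hypothetical session records exported from an analytics tool;
# in practice these would come from your own CSV or API export.
sessions = [
    {"device": "Pixel 7", "os": "Android 14", "browser": "Chrome 120"},
    {"device": "iPhone 13", "os": "iOS 17", "browser": "Safari 17"},
    {"device": "Pixel 7", "os": "Android 14", "browser": "Chrome 119"},
    {"device": "Galaxy S21", "os": "Android 13", "browser": "Chrome 120"},
    {"device": "Pixel 7", "os": "Android 13", "browser": "Chrome 120"},
]

def top_targets(records, key, n=2):
    """Return the n most common values for a given field, with counts."""
    return Counter(r[key] for r in records).most_common(n)

print(top_targets(sessions, "device"))  # most common device models
print(top_targets(sessions, "os"))     # most common OS versions
```

The same helper works for browsers or app versions, so one export can drive the whole device-lab shopping list.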
Most Used Features
This information can help you set priorities for bugs and test cases. When you find a new bug in a rarely used feature, it probably won't be a high-priority issue. Knowing which parts of your software are used most also helps you understand the risks of releasing a new feature and the impact if any issues come up.
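One way to make this prioritization repeatable is to map each feature's share of total usage to a rough priority bucket. A minimal sketch, assuming you can export per-feature event counts; the feature names and thresholds are illustrative, not a standard:

```python
# Hypothetical per-feature event counts from an analytics export.
feature_events = {
    "checkout": 52_000,
    "search": 31_000,
    "wishlist": 4_500,
    "gift_cards": 600,
}

total = sum(feature_events.values())

def bug_priority(feature):
    """Map a feature's share of total usage to a rough bug priority."""
    share = feature_events[feature] / total
    if share >= 0.30:
        return "high"
    if share >= 0.05:
        return "medium"
    return "low"

print(bug_priority("checkout"))    # heavily used feature
print(bug_priority("gift_cards"))  # rarely used feature
```

In practice the thresholds should come from the team's own risk discussion, and usage share would be only one input alongside severity.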
What Markets Have the Largest User Base
This information helps prioritize tasks when planning tests. For example, which markets should be covered in every release's regression round, and which ones don't need to be tested every release? It also helps you understand which markets to focus on during exploratory testing sessions.
Crash and Error Reports
This will make your app crash-free. Kidding. But it can help you discover crashes that happen often, and, if your analytics tool supports it, you can also filter by the device models, OS versions, and app versions that experienced a crash. This lets you work out how to reproduce the crash and whether it still occurs. In larger projects with multiple teams working on similar scopes, someone may already have fixed a crash that was reported to your team.
But crashes aren't the only thing analytics and error tracking tools can surface. Depending on how your project is set up, you can also filter for other errors or spikes of unusual user behavior. For example, a drop in incoming user reports could mean the error reporting system is not working as expected. Another example, from recent experience: a spike in sales was discovered for a specific product, and it turned out there was a misconfiguration in the pricing system.
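Both examples above are deviations from a baseline: a drop in reports and a spike in sales. A minimal sketch of that idea, assuming you can export daily event counts from your analytics tool (the counts and the 50% threshold are illustrative assumptions):

```python
from statistics import mean

def flag_anomaly(history, today, threshold=0.5):
    """Flag today's count if it deviates from the recent average
    by more than `threshold` (as a fraction of the baseline)."""
    baseline = mean(history)
    deviation = abs(today - baseline) / baseline
    if deviation <= threshold:
        return "ok"
    return "spike" if today > baseline else "drop"

# Hypothetical daily counts of incoming user error reports.
error_reports = [120, 115, 130, 125, 118]
print(flag_anomaly(error_reports, 10))   # sudden drop in reports
print(flag_anomaly(error_reports, 122))  # within the normal range
```

Real analytics tools usually offer built-in alerting for this, but a quick script like this is handy when checking an exported metric by hand.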
How Fast Users Pick Up New Software Releases
This information helps you understand how many users would be affected after a new feature release, which in turn helps prioritize testing activities.
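Adoption speed boils down to one number: the share of active users already on the latest version. A minimal sketch, assuming per-version user counts can be exported a few days after a release (the version numbers and counts are hypothetical):

```python
# Hypothetical active-user counts per app version shortly after a release.
version_users = {
    "3.2.0": 18_000,  # latest release
    "3.1.4": 55_000,
    "3.0.9": 22_000,
    "2.9.1": 5_000,
}

def adoption_share(versions, latest):
    """Fraction of active users already on the latest version."""
    return versions[latest] / sum(versions.values())

share = adoption_share(version_users, "3.2.0")
print(f"{share:.0%} of users are on 3.2.0")
```

Tracking this number day by day after each release shows how quickly a fix or a regression will actually reach your users.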
Overall, it depends on the project you're working on. Larger projects tend to have their own analytics team monitoring the data and catching most of the spikes. If you're part of a small project, chances are some of this data isn't tracked yet. If that's the case, it's a good idea to encourage your team to implement analytics tracking - at least the basics - so the team understands the priorities and doesn't waste time testing on devices and OS versions the software isn't used on at all. If you don't have access to an analytics tool in your project, or you're working at a start-up where analytics isn't set up yet, here you can read some suggestions on how to decide which devices to choose for testing.
In any case, please contact us if you need help with your testing activities. We have years of experience testing projects of various complexity and scale and will find the most suitable solution for you.