Over the last several years, the popularity of live streaming applications has been on an exponential upward trajectory, and 2021 is no exception. For example, in March and April 2020, an astounding 34% of internet users streamed gaming or social content. Additionally, over the course of only a few months, live streaming events saw an increase of 300%. With the whole world being homebound most of the time, the number of live streaming application consumers – both broadcasters and viewers – has risen significantly, and consequently, the importance of providing a high-quality experience for end users has come under an even larger magnifying glass. Terms such as “Live Stream”, “VOD”, and “Clip” have seen the light of day more often than many of us have over the past year and a half.
In short, live streaming applications provide end users with video hosting platforms where they can easily broadcast their video content to wide audiences in real time. Some platforms allow audiences to catch up with that video content by creating Videos on Demand (or VODs for short), which are recordings of said live streams. There are also clips, which, as the name suggests, are very short videos – clipped content, if you will – taken from the previously mentioned VODs. In order for all of these users to have a pleasant and seamless experience, a lot of functional factors are at play – none of which are dismissed at TestDevLab. This article aims to provide insight into how the quality of live streaming applications is maintained through functional testing.
Functional Testing Process: An Overview
First and foremost, a little reminder of what functional testing is, according to the ISTQB: “Testing performed to evaluate if a component or system satisfies functional requirements”. In layman’s terms, functional testing ensures that applications behave consistently for their consumers and won’t break down even when changes or new features are implemented. The main goal is to provide a harmonious and engaging experience to end users.
From a QA perspective, the magic that allows for such an experience starts with a structured approach to the testing process. While we at TestDevLab are no strangers to ad hoc testing, following a well-defined organizational process makes for a faster and smoother testing process. Some components of such a structured approach include:
- Organized planning of the whole testing process: from start to end
- Prioritization of issues
- Following well-established test processes
- Adopting a good test management tool
- Applying efficient test execution templates
- Using test reporting templates that follow industry standards
- Creating and implementing noteworthy bug reporting templates
- Equipping the team with wiki pages for professional knowledge and advancement
- Utilizing a variety of technical equipment in order to simulate real-life scenarios
Real-life experience has demonstrated that following such an approach leads to successfully providing high-quality services, while at the same time having fun and constantly leveling up the skills acquired in the process.
Core Functionalities and Metrics of Live Streaming Applications
Without further ado, let’s delve deeper into some of the technical terms at the heart of our work at TestDevLab and the correlations between them. Note that this is not an exhaustive list of all the metrics we use for functional testing. Listed below are some of the core metrics used in the functional testing of live streaming applications:
- Video Quality: As its name indicates, video quality refers to the different qualities in which a live stream or video can be watched. From a technical aspect, video quality is represented by the number of horizontal lines of pixels displayed on the screen, each line having the same pixel width. The higher the video quality, the higher the image sharpness. An interesting point worth mentioning is the term ‘subjective video quality’. As its name suggests, subjective quality refers to viewers’ perception of quality and their own personal judgement of it, which can differ from the results of objective algorithms designed to measure video quality. For instance, a video quality of 480p might be perceived as good by viewers, but bad by quality-measuring algorithms. The “p” in 480p stands for progressive scan, while the number counts the horizontal lines of pixels – which leads us to our next point.
- Video Resolution: Each video quality carries its own video resolution. For instance, manually selecting a video quality of 1080p on a 16:9 screen results in a video resolution of 1920×1080, while selecting a quality of 720p results in a resolution of 1280×720. Before the numbers become too many, the main point is this: the lower the quality, the lower the video resolution and image sharpness. And all of this is connected to pixels in the following manner: a 720p video on a 16:9 screen has 720 horizontal lines stacked on top of each other, and each of those lines is 1280 pixels wide.
- Video Bitrate and ABR: Closely connected to both of the above points, video bitrate is the amount of video data transferred per unit of time, and it affects both the quality and the size of the video. Like the previous metrics, video bitrate is directly proportional to video quality: a higher bitrate generally means higher quality. An interesting extension of video bitrate is ABR, which stands for Adaptive Bitrate Streaming. ABR allows for the dynamic adjustment of video quality based on available network bandwidth. ABR is the opposite of fixed bitrate, which, as you might have guessed, means that the video bitrate does not adjust dynamically to the available network bandwidth.
- Network Bandwidth: All of the abovementioned metrics depend on network bandwidth, which represents the capacity of the network to transmit a given amount of data in a given amount of time from one point to another over the internet. In other words, network bandwidth is the capacity with which data is transferred. Network bandwidth is measured in bits per second (bps) – in practice often megabits per second (Mbps, millions of bits per second) or even gigabits per second (Gbps, billions of bits per second), depending on the capacity of the network. Network bandwidth is not to be confused with network speed: bandwidth refers to the capacity of data transfer, whereas speed refers to the rate of said data transfer.
- Video Latency: This metric indicates the delay between the moment the broadcaster sends video content and the moment the viewer sees it. In other words, it shows how far behind viewers’ screens are compared to what is being streamed at that moment. Video latency relates to all of the above metrics in the following way: if end users have low network bandwidth, it is more difficult for them to watch a live stream in high quality. If the live stream uses ABR, the video quality can automatically be adjusted to the available bandwidth, and the latency would not grow too high. However, if the live stream has a fixed video bitrate, the quality cannot be automatically adjusted to the network bandwidth, which in turn can result in higher video latency.
- Video Buffering: Video buffering usually happens when the network speed is too slow to download the necessary amount of data while the user is watching a live stream or a VOD. The video buffers while the necessary data is downloaded so that the live stream or video can continue playing without stalls or freezes. The type of internet connection can also affect video buffering. A wired connection, using an Ethernet cable that connects the streaming or viewing device directly to the router, allows for a more stable network, while a wireless connection can become unstable due to factors such as distance from the router, objects standing between the router and the streaming/viewing device, signals from other devices, and shared network bandwidth, to name a few.
- Video Player Controls: Undoubtedly, controls such as the Play, Pause, and Volume buttons are essential components of every video player. Some video players offer whole sets of additional controls, such as Mute buttons, Quality dropdowns, Max Quality dropdowns, and Speed dropdowns (for VODs and Clips). These controls play a vital role in the functional testing of live streaming applications. For instance, hitting the Pause button while a video is buffering gives it time to download more data and can help avoid freezes. Similarly, switching to a lower video quality can help prevent buffering. And of course, changing the playback speed of a VOD from normal to a higher speed allows for faster watching, if you are in a hurry.
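To see how several of these metrics interact, consider how an ABR player picks a rendition: it typically chooses the highest-quality variant whose bitrate fits within the measured network bandwidth, with a safety margin to absorb jitter. The following is a minimal sketch of that idea; the bitrate ladder values are illustrative assumptions, not figures from any specific platform.

```python
# Minimal sketch of adaptive bitrate (ABR) rendition selection.
# The bitrate ladder below is hypothetical; real platforms tune
# these values per codec and content type.
LADDER = [
    # (label, width, height, bitrate in bits per second)
    ("1080p", 1920, 1080, 5_000_000),
    ("720p", 1280, 720, 2_800_000),
    ("480p", 854, 480, 1_400_000),
    ("360p", 640, 360, 800_000),
]

def select_rendition(bandwidth_bps, safety=0.8):
    """Pick the highest-quality rendition whose bitrate fits the
    measured bandwidth, leaving a safety margin to absorb jitter."""
    budget = bandwidth_bps * safety
    for label, width, height, bitrate in LADDER:
        if bitrate <= budget:
            return label, (width, height)
    # Fall back to the lowest rendition rather than stalling.
    label, width, height, _ = LADDER[-1]
    return label, (width, height)

print(select_rendition(4_000_000))  # 4 Mbps link -> ('720p', (1280, 720))
print(select_rendition(500_000))    # very slow link -> lowest rung
```

With a fixed bitrate there is no such ladder to step down – which is exactly why, as noted above, fixed-bitrate streams are more prone to buffering and latency on constrained networks.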
Testing Process meets Core Functionalities
This section discusses the point at which the testing processes and metrics described above connect.
In order to conduct a successful functional testing process, organization is key. The testing process does not start at the test execution level; rather, it starts at the planning level and builds up from there. Therefore, to be able to test the metrics discussed above, a sound test plan that gathers all key aspects of the test execution process is of fundamental importance. This is where prioritization comes in: prioritizing which elements to test allows the team to obtain a faster proof of concept and, in turn, verify whether the testing should stop or continue. Adhering to well-established organization and prioritization processes leads to a more efficient and effective test execution process. This is where templates come in handy – a test management tool can facilitate the creation of test execution as well as test reporting templates. Having planned, organized, and prioritized the functional tests, the quality engineers can form sound templates which will come in handy come functional testing time. When quality engineers encounter a bug, they can apply the previously established bug reporting template to report said bug – after, of course, having reproduced the bug on a variety of devices and operating systems. In the end, the test execution process is summarized in a test report using the test management tool.
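The bug reporting template mentioned above can be sketched as a simple data structure. The fields below are a common industry baseline, not TestDevLab's actual template – consider it a hypothetical illustration of what such a template captures.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a bug reporting template; the field names
# are a common industry baseline, not any specific team's template.
@dataclass
class BugReport:
    title: str
    steps_to_reproduce: list
    expected_result: str
    actual_result: str
    devices_tested: list = field(default_factory=list)
    priority: str = "medium"

    def summary(self):
        return (f"[{self.priority.upper()}] {self.title} "
                f"(repro on {len(self.devices_tested)} devices)")

report = BugReport(
    title="Video glitches at 1080p during live stream",
    steps_to_reproduce=["Open a live stream", "Select 1080p", "Watch for 60 s"],
    expected_result="Smooth playback",
    actual_result="Intermittent glitches",
    devices_tested=["Pixel 5", "iPhone 12"],
    priority="high",
)
print(report.summary())
```

Keeping the template structured like this makes reports consistent across the team and easy to roll up into the final test report.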
To demonstrate the aforementioned with a practical example, imagine a quality assurance engineer starts a test execution on the features with the highest priority while testing a live streaming application, and comes upon video glitches while testing video quality. The first thing to do would be to try to reproduce the error on various devices in order to see whether it happens on all devices, or whether it is device or OS specific. After determining the reproducibility, the next step would be to examine the possible culprits, which is not always easy. Video glitches can occur due to poor network bandwidth, a browser update, specific device properties, quality transcoding issues, and many other internal and external factors. If the error is not device specific but occurs across different devices and browsers, it would be considered a release blocker. If the error is device specific, other factors come into play, such as the percentage of end users on that device and the deployment risks connected to the error. The quality engineer should include all of this in the test report, which gives deeper insight into the specifics of a given test execution and helps determine whether the tested build should be deployed, considering all risks.
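The triage logic in that example – reproduce across devices, then classify by scope and user impact – can be sketched as follows. The classification labels and the 20% usage threshold are illustrative assumptions; real teams set these policies per project, informed by analytics data.

```python
def classify_issue(repro_results, usage_share=None):
    """Classify a bug based on where it reproduces.

    repro_results: dict mapping device/browser name -> bool (reproduced?)
    usage_share:   optional dict mapping device name -> fraction of end
                   users on that device. The 20% threshold below is an
                   illustrative assumption, not a universal rule.
    """
    affected = [d for d, repro in repro_results.items() if repro]
    if not affected:
        return "not reproducible"
    if len(affected) == len(repro_results):
        # Reproduces everywhere -> treat as a release blocker.
        return "release blocker"
    # Device-specific: weigh how many end users the devices represent.
    if usage_share:
        share = sum(usage_share.get(d, 0) for d in affected)
        return "release blocker" if share > 0.2 else "device-specific, low risk"
    return "device-specific"

results = {"Pixel 5": True, "iPhone 12": True, "Galaxy S21": True}
print(classify_issue(results))  # release blocker
```

The point of the sketch is the shape of the decision, not the exact numbers: reproducibility scope and user impact together drive the deploy/no-deploy call described above.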
Live streaming applications are predicted to keep growing their user base for years to come. Consequently, the quality standards for these applications will become even higher, and the need for them to be tended to by skilled professionals will be of substantial importance. These professionals will have to keep up with the latest trends in technology – a field that is perpetually advancing – and the way they can achieve that is through constant professional growth. Of course, this is the way our quality assurance engineers at TestDevLab operate: continuously cultivating knowledge and expanding their skills in accordance with the standards of the ever-developing tech world.