
SRG talks ‘apples-to-apples’ network comparisons and interactivity testing

How do you approach network benchmark testing so that it represents the user experience in a way that’s as reflective of real-world conditions as possible? And how do you best test the back-and-forth data transfers that determine the mobile gaming experience but aren’t necessarily captured by one-off tests of latency and data speed?

Signals Research Group typically focuses on the Radio Access Network and the real-world capabilities of specific network features, but it recently worked with Rohde & Schwarz to conduct testing in the Dallas, Texas, market that focused on 5G user experience. The testing used ETSI’s published technical recommendations on best practices for benchmarking mobile networks, which outline how to collect and score data as well as which KPIs “really influence the customer experience.”

“Max speeds are nice for marketing purposes, but they have little bearing in determining the user experience with typical applications and use cases,” according to the resulting report. “We prove this point and show why an industry-approved and fully-disclosed approach makes sense for benchmarking purposes.”

How quickly a mobile game responds and how fast a web page loads aren’t necessarily direct reflections of data speed, explained Mike Thelander, CEO of SRG. “It’s not about how fast can I download a gigabyte file, it’s much more so about the smaller data transactions,” Thelander said, adding that the latter better reflect what most consumers actually do on mobile networks.
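As a rough, hypothetical sketch of that distinction (not SRG’s or ETSI’s actual methodology), the Python snippet below times repeated small request/response cycles rather than one bulk download; the URL is a placeholder.

    # Hypothetical sketch: small-transaction responsiveness vs. bulk throughput.
    # The URL is a placeholder; this is not SRG's or ETSI's test methodology.
    import time
    import urllib.request

    URL = "https://example.com/"  # placeholder endpoint

    def time_small_transactions(n=20):
        """Time n small request/response cycles, the pattern typical apps generate."""
        durations = []
        for _ in range(n):
            start = time.perf_counter()
            urllib.request.urlopen(URL).read()
            durations.append(time.perf_counter() - start)
        return durations

    durations = sorted(time_small_transactions())
    # The median transaction time says more about perceived responsiveness
    # than the peak throughput of a single large file download does.
    print(f"median transaction time: {durations[len(durations) // 2]:.3f} s")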

The ETSI-defined approach “really provides a framework that is open and well-defined,” said Emil Olbrich, SRG’s VP of network technology. While SRG can test, and in various reports has tested, things such as RF performance, latency, throughput, video quality and other KPIs, the ETSI standardization also lays out exactly how each factor is weighted, rather than relying on a proprietary secret sauce. Disparate networks can be compared on the basis of the same KPIs, measured and weighted the same way, and their evolution over time can be tracked in an apples-to-apples manner.
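To make the open-weighting point concrete, here is a minimal sketch of a published-weights composite score; the KPI names, values and weights are illustrative placeholders, not the figures from the ETSI recommendation. Because the weights are disclosed rather than proprietary, anyone can reproduce the score from the same measurements.

    # Minimal sketch of a composite score with published weights.
    # KPI scores (0-100) and weights are illustrative, not ETSI's actual figures.
    KPI_WEIGHTS = {
        "web_browsing": 0.30,
        "video_streaming": 0.25,
        "small_file_transfer": 0.25,
        "voice_quality": 0.20,
    }

    def composite_score(kpi_scores: dict) -> float:
        """Weighted sum of per-KPI scores; the same weights apply to every
        network, so results compare across operators and over time."""
        return sum(KPI_WEIGHTS[k] * kpi_scores[k] for k in KPI_WEIGHTS)

    network_a = {"web_browsing": 85, "video_streaming": 90,
                 "small_file_transfer": 80, "voice_quality": 95}
    print(composite_score(network_a))  # 87.0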

As for the results of SRG’s testing: AT&T ranked best in overall network performance, followed by Verizon and then T-Mobile US. Voice performance among the three carriers was practically equal; it was data performance across multiple tasks that differentiated them.

One of the other things being standardized is an approach to interactivity testing of applications such as mobile gaming, which involve a lot of back-and-forth data transfers. Thelander and Olbrich discuss this further in the video interview below.
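As a rough illustration of the request/response pattern interactivity tests try to capture (a sketch only, not the standardized method), this Python snippet ping-pongs small UDP packets against a local echo server that stands in for a game server, reporting per-round round-trip times and their variation:

    # Sketch of back-and-forth interactivity measurement (not the standardized
    # test): repeated small request/response rounds, tracking latency and jitter.
    import socket
    import statistics
    import threading
    import time

    HOST, PORT = "127.0.0.1", 9999  # local echo server stands in for a game server

    def echo_server():
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as srv:
            srv.bind((HOST, PORT))
            for _ in range(50):
                data, addr = srv.recvfrom(1024)
                srv.sendto(data, addr)  # echo each small packet straight back

    threading.Thread(target=echo_server, daemon=True).start()
    time.sleep(0.1)  # give the server a moment to bind

    rtts = []
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as cli:
        cli.settimeout(1.0)
        for _ in range(50):
            start = time.perf_counter()
            cli.sendto(b"x" * 64, (HOST, PORT))  # small payload, like a game update
            cli.recvfrom(1024)
            rtts.append((time.perf_counter() - start) * 1000)

    # For gaming, the consistency of round trips matters as much as the average.
    print(f"mean RTT: {statistics.mean(rtts):.2f} ms, "
          f"jitter (stdev): {statistics.stdev(rtts):.2f} ms")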

Watch the video interview with SRG below:

ABOUT AUTHOR

Kelly Hill
Kelly reports on network test and measurement, as well as the use of big data and analytics. She first covered the wireless industry for RCR Wireless News in 2005, focusing on carriers and mobile virtual network operators, then took a few years’ hiatus and returned to RCR Wireless News to write about heterogeneous networks and network infrastructure. Kelly is an Ohio native with a master’s degree in journalism from the University of California, Berkeley, where she focused on science writing and multimedia. She has written for the San Francisco Chronicle, The Oregonian and The Canton Repository. Follow her on Twitter: @khillrcr