One of the most important parts of small cell testing is interoperability between vendors’ products: establishing the robustness of the standards and identifying and resolving issues before solutions are deployed in real-world networks. The Small Cell Forum and the European Telecommunications Standards Institute have partnered on a series of five plugfests over the past several years to test features and interoperability in small cells, with the next event slated for mid-April (the registration deadline for that plugfest is March 9; more information is available here).
“Until you plug two systems together, you’re never quite sure how the stability may play out, the way messages are exchanged or the choices that are taken based on the messages that have been exchanged,” said Alan Law, chairman of the Small Cell Forum.
The most recent plugfest, held last year at an Orange lab seven floors below Paris, focused on LTE small cells, with particular emphasis on testing multivendor self-optimizing networks (SON) and voice over LTE, as well as circuit-switched fallback. It was the largest event so far, according to Silvia Almagia, plugfest manager for ETSI’s Centre for Testing and Interoperability, with about 65 engineers participating over a two-week testing period.
“There is a very nice collaborative spirit among the engineers,” Almagia said. “We are working together to have the tests passed, and fix the problems both in the specifications and in the implementations.”
Results are anonymized, but the plugfests still provide an interesting window into how far small cells have come toward successful interoperability in a multivendor environment. (You can read the full report as a PDF here.)
The most recent plugfest had an overall 93.6% pass rate among the interoperability tests conducted, with a “very acceptable” failure rate of 6.4% that ETSI said “can be explained by software bugs [and] implementation errors that could not be fixed before the end of the plugfest, as well as some ambiguities or errors in standards.”
Interoperability success varied widely. For instance, while 316 tests were completed for Home eNodeB management with a 98.1% success rate, only 18 tests were completed on SON, which was included for the first time at a plugfest; 12 passed and six did not. Another 25 SON tests ran out of time.
“The main reasons for these relatively low rates [were] the lack of support from macro eNB vendors,” ETSI concluded in its plugfest report on the SON testing. “Test cases and test setup are expected to be refined for future events as companies get familiar with this test group. An improvement in these results can be expected in future events.”
In mobility testing, 54 tests were completed, with 29 passing and 25 failing. Another 26 tests ran out of time. The mobility test group “is one of the most challenging in terms of preparation, configuration, lab setup and tools, which explains the low execution rate (45%) and interoperability level (53%),” ETSI said. As with the SON testing, “the lack of support from macro eNB vendors is causing long ramp-up periods and misconfiguration problems. These issues were largely discussed with participants during and after the Plugfest, and it was agreed to proceed to a thorough review of both the lab requirements and test cases.”
Regression testing, typically used in software development to check that changes haven’t broken existing functionality, was the most frequently conducted test category, with 728 tests completed and a 94.5% success rate.
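As a purely illustrative sketch (not part of the plugfest tooling, and using hypothetical names such as attach_ue), a regression test in this sense simply re-runs a check that passed before a software change and confirms the outcome is unchanged:

```python
# Hypothetical, minimal sketch of a regression test: re-run a previously
# passing check after a change and confirm the result still holds.

def attach_ue(ue_id: str) -> str:
    """Stand-in for an attach procedure on the device under test;
    returns a result code string."""
    # A real harness would drive signaling toward the small cell here.
    return "ATTACH_ACCEPT"

def test_ue_attach_still_succeeds() -> None:
    """Regression check: the attach procedure should keep returning
    the same accepted result after code or configuration changes."""
    assert attach_ue("ue-001") == "ATTACH_ACCEPT"

if __name__ == "__main__":
    test_ue_attach_still_succeeds()
    print("regression check passed")
```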
The upcoming event in April will be conducted entirely remotely for the first time. Almagia said that the interoperability testing will focus on carrier aggregation, local IP access, selected IP traffic offload and closed subscriber groups. There will be particular emphasis on signaling due to the nature of remote testing, she added.
ETSI has developed a best-practices handbook for interoperability, available for free here.