NETWORK-BASED DRIVERS FOR BETTER SPECTRUM AWARENESS
There are multiple reasons why operators and regulators are interested in more comprehensive and efficient approaches to spectrum monitoring and the identification, location and mitigation of interference. These range from enabling new spectrum sharing approaches and the internet of things, to improving cellular network operations. On the network-focused side, these include:
–Complex spectrum holdings to manage and optimize. Operators are juggling increasingly varied spectrum portfolios in order to meet capacity needs and support LTE Advanced and LTE Advanced Pro features such as three-component and four-component carrier aggregation. New spectrum bands, when available, are being brought into operation as quickly as possible: T-Mobile US expects to have some of its newly won 600 MHz spectrum operating in some locations by the end of the year. LTE is a noise-sensitive technology, particularly in the uplink, and features such as higher-order modulation schemes require excellent channel conditions in order to deliver their full benefit.
“It is key, especially with LTE, to really squeeze as many bits per second out of every hertz as you can,” said Bill Myers, vice president of corporate strategy and product management at ISCO International. “To do that, you really do need as good a [signal to noise ratio] as you can get.”
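Myers’ bits-per-hertz point maps directly onto the Shannon capacity bound, which sets the ceiling on spectral efficiency at a given signal-to-noise ratio. The sketch below is illustrative textbook arithmetic only; the modulation-class targets are assumptions, not operator figures:

```python
import math

# Shannon capacity bound: achievable spectral efficiency (bits/s/Hz) grows
# with log2(1 + SNR), so high-order modulation needs a dramatically better
# channel. Values here are textbook arithmetic, not measured link data.

def bits_per_hz(snr_db):
    """Spectral-efficiency ceiling at a given signal-to-noise ratio."""
    return math.log2(1 + 10 ** (snr_db / 10))

def snr_needed_db(bits):
    """Minimum SNR (dB) for a target spectral efficiency (inverts Shannon)."""
    return 10 * math.log10(2 ** bits - 1)

# QPSK-class vs. 256-QAM-class targets, for scale
print(f"2 bits/Hz needs >= {snr_needed_db(2):.1f} dB SNR")   # ~4.8 dB
print(f"8 bits/Hz needs >= {snr_needed_db(8):.1f} dB SNR")   # ~24.1 dB
```

The roughly 19 dB gap between the two targets is why a rising noise floor disproportionately hurts the highest-throughput modes.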
Myers added that ISCO is seeing the emergence of more “self-inflicted” interference as carriers utilize more and more bands at the same sites. Passive intermodulation, or PIM, has historically been intermodulation created on a single band by loose or rusty connections or nearby sources of metal: railings, roof flashing, billboards. Often, those physical sources could be mitigated. Newer PIM problems are more complex, however, because certain combinations of spectrum can themselves create intermodulation, Myers said: some PCS and AWS spectrum, for example, or combinations of 700 MHz bands.
“They’re deploying multiple networks to get more capacity, but if they have to back off the transmit power in order to prevent uplink interference from PIM, it really takes away the value of the spectrum they’ve purchased and are deploying,” he said.
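The band-combination problem Myers describes can be sketched numerically: the third-order intermodulation products of two co-sited downlink carriers can land squarely in an uplink block. The carrier frequencies below are hypothetical illustrations, not any operator’s actual assignments:

```python
# Sketch: third-order passive intermodulation (PIM) products for two
# co-sited downlink carriers. Frequencies (MHz) are illustrative center
# frequencies chosen for the example, not real network assignments.

def third_order_products(f1, f2):
    """Return the classic third-order intermod products 2*f1-f2 and 2*f2-f1."""
    return [2 * f1 - f2, 2 * f2 - f1]

def hits_band(freq, band):
    """True if a product frequency falls inside a (low, high) band edge pair."""
    lo, hi = band
    return lo <= freq <= hi

pcs_dl = 1940.0            # example PCS downlink carrier (MHz)
aws_dl = 2140.0            # example AWS downlink carrier (MHz)
aws_ul = (1710.0, 1755.0)  # AWS-1 uplink block (MHz)

for product in third_order_products(pcs_dl, aws_dl):
    if hits_band(product, aws_ul):
        print(f"IM3 product at {product} MHz lands in the AWS uplink "
              f"({aws_ul[0]}-{aws_ul[1]} MHz)")
```

In this example, 2 × 1940 − 2140 = 1740 MHz, which falls inside the receive band – exactly the kind of self-inflicted uplink interference that forces the power backoff Myers describes.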
–A noisy spectrum environment. In late 2016, the Federal Communications Commission asked for public input as to whether it should pursue a study of changes in the “noise floor,” or the state of noise and unwanted signals in the overall radio frequency environment, given the radical change in the number of RF radiators in the past two decades or more. The answer from industry came back as a fairly urgent “yes”, with many respondents citing specific sources of noise that they commonly deal with such as LED lights, fluorescent light ballasts, large display screens in event venues, baby monitors and many more. An increase in the overall noise floor means that networks operate less efficiently and have worse performance.
“Anecdotally, CTIA believes that the increase in device usage has caused interference to licensed, exclusive use spectrum,” CTIA told the FCC. “For mobile carriers, the reliable coverage area is mathematically defined by the ambient noise floor, the immunity of the receiver, and the transmit power of a given cell site. Growth in the noise floor thus leads to a reduction in a carrier’s reliable service area. If a cell site is at the outer boundary of a carrier’s system, the interference may result in lost coverage. Meanwhile, customers traveling between cells may experience more dropped calls. Moreover, a local rise in the noise floor can reach a point where the required carrier-to-interferer ratio exceeds the design specifications for a mobile device operating at the fringes of a cell site’s reliable service area. In such a case, consumers may experience diminished voice quality, slower data transmission, and decreased coverage.”
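The coverage arithmetic CTIA outlines – transmit power, receiver noise and the required carrier-to-interference ratio together defining the reliable service area – can be sketched with a simple log-distance path-loss model. All parameter values below are assumed for illustration:

```python
import math

# Sketch of the coverage arithmetic CTIA describes: reliable range follows
# from transmit power, receiver noise, and the required SNR at cell edge.
# Every number here is an illustrative assumption, not a measured value.

def noise_floor_dbm(bandwidth_hz, noise_figure_db, ambient_rise_db=0.0):
    """Thermal noise (-174 dBm/Hz) integrated over bandwidth, plus receiver
    noise figure and any rise in the ambient noise floor."""
    return -174.0 + 10 * math.log10(bandwidth_hz) + noise_figure_db + ambient_rise_db

def max_range_km(tx_power_dbm, required_snr_db, noise_dbm,
                 pl_at_1km_db=128.1, pl_exponent=3.76):
    """Invert a log-distance path-loss model (3GPP-style urban-macro
    parameters assumed) to find where SNR falls to the requirement."""
    max_path_loss = tx_power_dbm - required_snr_db - noise_dbm
    return 10 ** ((max_path_loss - pl_at_1km_db) / (10 * pl_exponent))

bw = 10e6       # 10 MHz LTE carrier
nf = 7.0        # receiver noise figure, dB (assumed)
tx = 46.0       # cell-site transmit power, dBm (assumed)
snr_req = 0.0   # cell-edge SNR target, dB (assumed)

quiet = max_range_km(tx, snr_req, noise_floor_dbm(bw, nf))
noisy = max_range_km(tx, snr_req, noise_floor_dbm(bw, nf, ambient_rise_db=6.0))
print(f"Range with quiet floor: {quiet:.2f} km; with a 6 dB rise: {noisy:.2f} km")
```

Under these assumed parameters, a 6 dB rise in the ambient floor shrinks the reliable radius by roughly a third – the mechanism behind the lost fringe coverage CTIA describes.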
–Network densification. Networks continue to densify, which is expected to only increase with the advent of millimeter-wave-based “5G” systems. Densification is a double-edged sword when it comes to the RF environment: getting antennas closer to the end user, with smaller cell radii, can mean that the system inherently has fewer noise sources than a sector that covers kilometers of area.
“If you have a radius of 3 kilometers, there’s a lot of potential interference. In a radius of 30 meters, it’s much less likely,” said Paul Denisowski, applications engineer with Rohde & Schwarz. In addition, he noted, technologies such as beamsteering and beamforming in a 5G context will limit the impact of broadband noise by focusing and concentrating the energy. And at millimeter wave frequencies, the bands are very wide – so narrowband noise is something that the network can likely schedule around, not to mention that there simply isn’t as much natural interference at higher frequencies.
“For LTE, interference will continue to be a problem, but densification may take care of a lot of that,” said Denisowski.
On the other hand, small cell sites — by nature of being located closer to users — are also located closer to a variety of potential interferers.
“As the infrastructure moves more and more to small cells and indoor venues, there’s a lot more antennas located a lot closer to copy machines or light ballasts than there used to be,” Myers of ISCO added. “When you densify, you’re taking a problem for one or two sites and making it a problem for 10 to 20 sites, where they’re all impacted within that area. Things can impact more sites because there are more sites to be impacted.”
–More RF-savvy rule-breakers. The general population may not think much about the RF environment except to check the number of bars on their smartphones, but there are plenty of people who understand just enough about signal blocking to know it can be achieved and either don’t know or don’t care that it’s illegal for them to do it.
In one case in Florida, a man drove around for more than a year with a cell phone jammer in the trunk of his car, because he was frustrated with other drivers using their cell phones and decided to block them from doing so. A Chicago public transit user was arrested in 2016 for bringing a cell phone jammer with him on the train because he was annoyed by fellow commuters talking on their phones. In April of this year, AT&T had network issues in an area of Dallas, Texas, because a business owner there was operating an illegal jammer to keep employees from using their mobile phones at work; in that case, the FCC ultimately was called in and issued a $22,000 fine to the business owner.
–Need to reduce network operating costs. Remote spectrum monitoring is starting to emerge as a supplement to traditional interference mitigation approaches, to provide a more comprehensive and constant view of the spectrum environment that ultimately gives interference hunters better data and more information to start with, so that less time and money is spent resolving RF issues at individual sites. Remote spectrum monitors can also record information over time, so that intermittent signals can be more easily tracked and identified.
“People want to be able to monitor spectrum remotely – they don’t want to have to send someone out. This is the holy grail,” Denisowski said. “I can look at all the spectrum, all the time, and if there’s a problem, I know about it right away and can diagnose it.”
“At the end of the day, they’re all being strapped for resources, so they need to be able to cover a lot more ground and to be smarter, faster,” Myers said.
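The record-over-time workflow described above – logging timestamped exceedances against a rolling baseline so intermittent signals can be correlated later – might look like this in outline. The data structure and synthetic sweep values are assumptions for illustration; a real monitor would ingest scans from receiver hardware:

```python
# Sketch of the record-and-flag logic a remote spectrum monitor could apply:
# keep a rolling baseline per frequency bin and log timestamped exceedances
# so intermittent interferers can be identified after the fact. The sweep
# data below is synthetic, for illustration only.
from dataclasses import dataclass, field

@dataclass
class BinMonitor:
    threshold_db: float = 10.0                    # flag bins this far above baseline
    baseline: dict = field(default_factory=dict)  # freq bin -> running avg (dBm)
    events: list = field(default_factory=list)    # (time, freq, level) log

    def ingest(self, timestamp, sweep):
        """sweep: {freq_bin_mhz: level_dbm} from one spectrum scan."""
        for freq, level in sweep.items():
            avg = self.baseline.get(freq, level)
            if level > avg + self.threshold_db:
                self.events.append((timestamp, freq, level))
            # exponential moving average keeps the baseline adaptive
            self.baseline[freq] = 0.9 * avg + 0.1 * level

mon = BinMonitor()
mon.ingest(0, {1950.0: -100.0, 1955.0: -101.0})   # quiet baseline scan
mon.ingest(60, {1950.0: -100.0, 1955.0: -70.0})   # intermittent burst
print(mon.events)  # the t=60 burst on 1955.0 MHz is logged for later analysis
```

Even a simple exceedance log like this gives interference hunters a starting point – time of day, frequency and level – before anyone is dispatched to a site.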