Learning from the TV white space debacle
Some wireless technologies, such as “4G” and Wi-Fi, have been so successful and have become so much a part of our lives that people might assume all wireless technologies must be similarly triumphant. However, history is littered with wireless experiments that never achieved the widespread use envisioned by their creators. Examples include the Metricom Ricochet network; pre-standard wireless local area networks; cellular digital packet data; 2G and even initial 3G cellular-data services; multiple low-earth orbiting satellite networks; and municipal Wi-Fi. Now, we can add another to the list: TV white space networks.
Understanding why one technology fails while another is wildly successful is crucial for anyone involved in setting U.S. spectrum policy. Such understanding informs how future networks may operate and is particularly relevant as policymakers look to extend the use of Spectrum Access System (SAS) databases.
The idea of white spaces was to make unused TV channels available for unlicensed use, with the expectation that the favorable propagation of sub-700 MHz frequencies would unleash innovative applications not otherwise possible. For example, in its Second Memorandum Opinion and Order in 2010 (available at https://apps.fcc.gov/edocs_public/attachmatch/FCC-10-174A1.pdf), the FCC said: “Access to this spectrum could enable more powerful public Internet connections – super Wi-Fi hot spots – with extended range, fewer dead spots, and improved individual speeds as a result of reduced congestion on existing networks. … The potential uses of this spectrum are limited only by the imagination.” Industry proponents said the technology would “benefit consumers in ways yet unimaginable.” (https://net.educause.edu/ir/library/pdf/EPO0733.pdf) Yet last year, in the U.S., only 600 wireless devices using white space networks were in operation (http://www.wired.com/2015/04/fcc-white-spaces-database/). What went wrong?
As I explained in a 2013 article (https://gigaom.com/2013/03/17/white-spaces-networks-are-not-super-nor-even-wi-fi/), “super Wi-Fi,” as people were calling TV white spaces, was neither super nor Wi-Fi. The available speeds, about 30 megabits per second at best per six-megahertz TV channel, were only a fraction of what users were experiencing with actual Wi-Fi technologies. Moreover, the technologies being touted as “super Wi-Fi” were not compatible with Wi-Fi: users needed entirely new equipment. In fact, the misuse of the term “Wi-Fi” led the Wi-Fi Alliance, which tests for interoperability and promotes Wi-Fi technologies, to ask that the term “super Wi-Fi” not be used to describe white-space networks. (http://www.wi-fi.org/news-events/newsroom/wi-fi-alliance-statement-regarding-super-wi-fi)
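The speed gap follows largely from channel width rather than modulation quality, a point a back-of-the-envelope calculation makes clear. The sketch below uses the article's ~30 Mbps per 6 MHz figure; the 802.11n comparison value (72 Mbps peak for a single stream in a 20 MHz channel) is an assumption added for illustration.

```python
# Illustrative spectral-efficiency comparison. The 30 Mbps / 6 MHz figure
# comes from the article; the 802.11n single-stream figure is an assumed
# peak rate used only for comparison.

def spectral_efficiency(throughput_mbps, channel_mhz):
    """Return peak spectral efficiency in bits/s/Hz."""
    return throughput_mbps / channel_mhz

white_space = spectral_efficiency(30, 6)    # one 6 MHz TV channel
wifi_80211n = spectral_efficiency(72, 20)   # single-stream 802.11n, 20 MHz

print(f"White space: {white_space:.1f} bit/s/Hz over 6 MHz")
print(f"802.11n:     {wifi_80211n:.1f} bit/s/Hz over 20 MHz")
```

The per-hertz efficiencies are comparable; what caps white-space throughput is the narrow 6 MHz TV channel itself, so no modulation improvement could have closed the gap with Wi-Fi's wider channels.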
More concerning was the misunderstanding of so-called “super Wi-Fi’s” “favorable” radio propagation. The propagation characteristics of sub-700 MHz frequencies made it likely that multiple unlicensed users in any overlapping coverage area would interfere with one another. In contrast, the short range of “normal” Wi-Fi means only nearby networks see each other, and because Wi-Fi has sufficient co-existence mechanisms to operate well in such circumstances, interference is minimal.
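The scale of the overlap problem can be sketched with the free-space path loss formula: for the same transmit power and link budget, range scales inversely with frequency. The frequencies below (600 MHz for a TV white space channel, 2.4 GHz for Wi-Fi) are illustrative assumptions, and real-world propagation deviates from free space, but the rough ratio holds.

```python
import math

# Free-space path loss (FSPL) sketch. Frequencies are illustrative:
# 600 MHz stands in for a TV white space channel, 2.4 GHz for Wi-Fi.

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB for a given distance and frequency."""
    c = 3e8  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

f_tvws, f_wifi = 600e6, 2.4e9
range_ratio = f_wifi / f_tvws   # equal-loss range scales as f_wifi / f_tvws
area_ratio = range_ratio ** 2   # coverage area scales with range squared

print(f"At equal path loss, 600 MHz reaches {range_ratio:.0f}x farther than 2.4 GHz")
print(f"That is roughly {area_ratio:.0f}x the coverage area")
```

Under these assumptions a 600 MHz signal travels about four times as far as a 2.4 GHz signal, covering roughly sixteen times the area, so far more unlicensed networks end up inside each other's interference range than in the Wi-Fi case.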
The technical limitations of “super Wi-Fi,” the technology envisioned for TV white spaces, limit the utility of the spectrum, favoring point solutions and rural deployments rather than wide-area networks in densely populated areas that could serve large numbers of users. In addition, the number of available channels varies across markets, which inhibits the build-out of multicity deployments.
The technology does make sense for wide-area networks in countries that have no wireline infrastructure, which explains why we do see white space projects springing up around the world, especially in less-developed countries on the African continent (http://dynamicspectrumalliance.org/wp-content/uploads/2016/01/Pilots-and-Trials-Brochure_Jan-16.pdf).
There are lessons to be learned from this foray into using unlicensed spectrum to support large-scale commercial mobile broadband deployments. They include:
- Database-driven spectrum access systems are still in their infancy and cannot be relied on for near-term spectrum deployments or network configurations.
- Wide-area unlicensed bands are not a practical solution to the problem of spectrum-constrained commercial mobile broadband networks.
- Spectrum sharing remains more of a theory than a solution, especially in dynamic situations.
It is therefore concerning that policymakers are importing the same vulnerabilities and concerns into the 3.5 GHz band, which will use a SAS database to coordinate incumbent, licensed and unlicensed users. As I outline in a recent piece (http://hightechforum.org/scary-experimentation-3-5-ghz/), the Federal Communications Commission is conducting an ambitious experiment with valuable spectrum. And it is acting as if its previous experiment with white space spectrum was successful, when in fact it was a failure.
Eventually, dynamic database-driven systems coupled with spectrum-sensing mechanisms may become the norm, at least for certain usage scenarios. But the technology has not yet matured to that extent. Coordinating access to spectrum for multiple simultaneous users is at the heart of any wireless system, accounting for its complexity. Coordinating such access across disparate systems is exponentially more complicated. Government and industry stakeholders should take measured steps in implementing such systems. (See my report on the complexities of spectrum sharing at http://www.rysavy.com/Articles/2014-04-Spectrum-Sharing-Complexities.pdf).
Instead of taking measured steps, however, policymakers appear to be doubling down on their approach in the TV white spaces context and are now considering the 3.5 GHz SAS for other spectrum bands, such as in the July 2016 Further Notice of Proposed Rulemaking for 5G millimeter wave spectrum. (See https://www.fcc.gov/document/spectrum-frontiers-ro-and-fnprm). The SAS will be the most complex spectrum management system ever developed. We don’t know how well it will work or whether anybody will even use the resulting system. Until it has been deployed, tested, proven to work technically, and most important of all, proven to support effective business models, we should not be designing it into other bands and unnecessarily placing those bands at risk.
Peter Rysavy, president of Rysavy Research, has been analyzing and reporting on wireless technologies for more than 20 years. See www.rysavy.com.
Editor’s Note: Welcome to Analyst Angle. We’ve collected a group of the industry’s leading analysts to give their outlook on the hot topics in the wireless industry.