The digital technology wars that marked the launch of second-generation services in the United States are among the most storied and significant events in the industry’s short history. In its infancy, the U.S. wireless industry adhered to a common analog standard, AMPS. The ensuing popularity of wireless service surprised industry experts and casual observers alike. Late in the 1980s, some carriers began to realize that this popularity would push analog capacity to its limit and that a next-generation technology would be needed for the industry to continue growing.
“In the late ’80s we started to hear that certain cellular markets in the United States were becoming congested,” said John Kaewell, senior vice president, advanced product development, at InterDigital Communications Corp.
“The industry in the United States began like many other parts of the world looking at converting analog networks to digital because people realized that some of the early predictions on the growth rates were going to be obliterated exponentially,” said Chris Pearson, executive vice president of the Universal Wireless Communications Consortium. “This was going to be a technology that was going to go to the mass market.”
Digital flavors
Early on, the industry debated the merits of TDMA and FDMA technologies, eventually settling on TDMA as the standard to which the industry would migrate. The industry began setting requirements, and some early movers launched the technology. McCaw Cellular was one of the first companies to launch digital on its networks using TDMA technology.
“I think the advantage of being a first mover in digital made us think about multiple standards knitting together divergent spectrum,” said Bill Malloy, a long-time McCaw Cellular/AT&T executive who now is chief executive officer of WorldStream. “If McCaw hadn’t moved as fast as they did in digital and got their nose bloody on it and learned the lessons of field deployment, there’s no way that when they launched PCS networks they could have as quickly knitted them together as they did.”
“AT&T/McCaw was trying to be on the front edge, and at that point AT&T, which included Lucent, thought if it could get its networks up using that technology, everybody else would follow,” said Mike Crawford, president of M/C/C, a marketing communications company.
Instead of following McCaw down the TDMA path, however, many U.S. carriers elected to wait for a different technology option.
“A number of folks began becoming concerned that the TDMA spec was not going to be an appropriate spec because it effectively was not going to meet what they thought were their needs,” said Perry LaForge, executive director of the CDMA Development Group.
At the time, GSM technology, which Europe had adopted as its technology standard, was not a choice for cellular carriers in the United States because GSM equipment was not available for the 850 MHz band. Meanwhile, a small San Diego upstart was trying to find a terrestrial niche for a digital technology it had been working on for the satellite voice industry.
Qualcomm demonstrated CDMA technology to PacTel in 1989, a demonstration that also caught the attention of Ameritech and Nynex. With three significant players interested, Qualcomm continued testing and developing the technology until it was approved as a standard in 1993.
“There were a lot of people who said, ‘We’ve got a standard. Why do we need something else?’” recalled Butch Weaver, executive vice president of engineering at Qualcomm, who has been with the company since 1986. “But there were some people at PacTel, Ameritech and Nynex who wanted something more than the TDMA standard and were interested in getting the 10 to 20 times capacity increase instead of the two to three times capacity increase that TDMA gave, and that significant difference in capacity improvement was enough for them to look at alternatives.
“Making it a standard was important because that gave one level of credibility,” said Weaver. “But there was still the issue that CDMA was a new technology. Where TDMA was more of an evolution, taking CDMA into the cellular world really was more of a revolution.”
“It was difficult for them to win acceptance of a new waveform with a wider bandwidth channel format in an existing scenario that had narrow bandwidth channels,” said InterDigital’s Kaewell. “They built momentum and they built a war chest, and they built a good team of engineers to put it all together.”
“CDMA had to fight hard because it was sort of the up-and-coming standard,” said LaForge. “There are a lot of things that happened, ultimately, that made it happen, but it was really driven by the carrier needs.”
When PCS licenses were auctioned off during the mid-1990s in the United States, GSM technology could finally enter the market in a meaningful way, and a three-horse technology race began.
“One technology was available earlier on, and the camps that decided to go with TDMA migrated to TDMA,” said Daniel McGuire, vice president of sales in charge of CDMA technology at Audiovox Communications Corp. “Other camps had been basically swayed away by Qualcomm.
“Here in the U.S., there were a lot of customers churning,” said McGuire. “It brought the competitive nature in and not only made carriers make a decision based on what the technology offered but also it brought them away with the feeling that the customer would no longer be able to churn (if they had) a distinct technology.”
“Carriers were trying to assess for their markets what made the most sense, and I think it was a combination of their spectrum, their spectrum planning, the economics of upgrading to new systems and what they had currently in legacy equipment in AMPS,” said Pearson. “All these factors went into it, and people came out with different decisions.”
Lessons learned
While U.S. carriers were busy deploying networks using three standards, European nations agreed to standardize on GSM. Many wonder whether the United States ultimately was hurt by or benefited from having competing digital technologies. Some believe customers and carriers alike could have enjoyed greater economies of scale had the United States settled on a single standard.
“There’s only so many engineers and there’s only so much time and money and resources to put together these very sophisticated little phones, and it’s like we’re inventing many different wheels. If we could have consolidated that, I think the consumer would have benefited and maybe the U.S. would have a stronger position overall in the world with cellular technology and products,” said Kaewell.
While conceding that the process may have been smoother along a single-technology path, most still believe that an open-market approach is superior.
“We have always proposed that open-market competition is the best route, and I would still say that’s the best route,” said Pearson. “If you look at any technology or any industry, the more competitors you have attending to the needs of the consumer, the better.”
“In general, I am not one to believe that the U.S. telecom industry in the long run has been hurt by not mandating a specific standard,” said LaForge. “I think in fact what it has led to is innovation and, in the long run, will lead to even greater advancement of the wireless industry, particularly in those areas where people are given options. The marrying of public policy and technology policy always leads, I believe, to adverse consequences.”
“Each proponent of a technology would like to have everybody go their way, but then it’s not competition anymore,” said Qualcomm’s Weaver. “If things become too homogenous, you stifle innovation. If things are too freewheeling, that’s bad for the consumer. So you have to work together to have these systems strike a balance and allow the users flexibility but still allow the new technology to develop.”
However, as the industry moves to third-generation technology, wireless players have again divided themselves into different technology camps.
“If there’s going to be a 3G, there has to be unified resolve from the carriers to the infrastructure players that is unwavering,” said Malloy. “I think people are now starting to add up the cost of this experimentation.”