Editor’s Note: Welcome to our weekly Reader Forum section. In an attempt to broaden our interaction with our readers, we have created this forum for those with something meaningful to say to the wireless industry. We want to keep this as open as possible, but we maintain some editorial control to keep it free of commercials or attacks. Please send along submissions for this section to our editors at: [email protected].
Squeezing the maximum battery life out of a mobile phone is a complex challenge, and most of the technical efforts to date have centered on the hardware and operation of the handsets themselves. However, there are a number of other external factors that can have a significant impact on the power consumption of mobile handsets. Indeed, it could be argued that the second biggest drain on battery life is actually the software engineers working at base station manufacturers.
It might seem hard to believe, but duplexing schemes and network resource allocation strategies in the base station have a direct impact on handset battery life. In LTE, where the network dynamically allocates resources to handsets, the base station scheduler decides whether a data payload is sent as a short burst at maximum power and bandwidth, or trickled out at low bandwidth over multiple time slots. The difference between these strategies, determined more often by base station manufacturers than by operators, could mean as much as a tenfold increase in battery drain.
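As a rough illustration of why the scheduler's choice matters, the sketch below compares the two strategies under a deliberately simple power model. The slot length, transmit powers and idle power are invented for the example, not measured figures; real handsets will differ, but the gap between the strategies is the point.

# Illustrative (not measured) comparison of two scheduling strategies for
# delivering the same payload, seen from the handset's point of view.
# All power figures below are assumptions made up for this example.

SLOT_S = 0.001          # one LTE subframe, 1 ms

def radio_energy_joules(active_slots, idle_slots, tx_power_w, idle_power_w=0.02):
    """Energy spent while the radio is awake: active transmit slots plus
    the idle slots the handset must stay awake through between grants."""
    return (active_slots * tx_power_w + idle_slots * idle_power_w) * SLOT_S

# Strategy A: short burst at maximum power and bandwidth, after which the
# radio can drop back to sleep quickly.
burst = radio_energy_joules(active_slots=10, idle_slots=0, tx_power_w=1.0)

# Strategy B: the same payload trickled out at low bandwidth over many
# slots, keeping the whole RF chain awake far longer.
trickle = radio_energy_joules(active_slots=100, idle_slots=100, tx_power_w=0.5)

print(f"burst:   {burst * 1000:.1f} mJ")
print(f"trickle: {trickle * 1000:.1f} mJ")
print(f"ratio:   {trickle / burst:.1f}x")

Under these invented numbers the burst strategy comes out several times cheaper; with less favourable assumptions the gap can approach the order-of-magnitude figure quoted above. The key point is that the choice sits in base station software, not in the handset.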
Equally, in full-duplex FDD systems, simultaneous receive and transmit creates a lot of complexity and expense in the RF front end, as a high degree of isolation is required between the two signal paths to avoid noise leakage. This is very challenging, and power amplifiers in handsets have traditionally had to compromise on efficiency to meet these worst-case noise requirements. That compromise increases the drain on battery life from the RF front end.
However, for network operators, the deployment of full-duplex FDD represented a more natural transition from the voice-oriented architecture of earlier 2G and 3G networks. As such, it remains the dominant duplexing scheme in the majority of current LTE networks.
FDD vs. TDD
The dominance of full-duplex FDD is also great for filter suppliers: filters are a growing cost center in handsets as original equipment manufacturers attempt to minimize noise leakage. But there are alternatives.
TDD is gaining more traction as a standard for LTE networks – most obviously in China, but also in the United States, Japan and other markets such as India and Australia. TDD systems do not allow handsets to receive and transmit simultaneously; instead, the transmit and receive channels take turns on the same frequency band. The frame structure is usually configured asymmetrically, reflecting the typical four-to-one ratio of downlink-to-uplink data traffic.
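To make the asymmetry concrete, the small sketch below walks through an invented frame pattern and its duty cycle. The pattern is illustrative only, chosen to reflect the rough four-to-one split described above, and is not a specific 3GPP uplink-downlink configuration.

# Illustrative asymmetric TDD frame: each letter is one subframe on the
# shared carrier ("D" = downlink, "U" = uplink). The pattern is invented
# for this example, not a specific 3GPP uplink-downlink configuration.
frame = ["D", "D", "D", "D", "U", "D", "D", "D", "D", "U"]

downlink_share = frame.count("D") / len(frame)
uplink_share = frame.count("U") / len(frame)

# The handset only transmits during "U" subframes, so its power amplifier
# is active for a small fraction of the frame and never while the
# receiver is listening.
print(f"downlink share: {downlink_share:.0%}")   # 80%
print(f"uplink share:   {uplink_share:.0%}")     # 20%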
Because a handset can only be transmitting or receiving at any given moment, TDD systems do not impose such strict isolation requirements on the RF front end, and power amplifiers do not need to compromise as much on efficiency to avoid leakage. As a consequence, battery drain is somewhat reduced, although filters are still required for co-existence with other systems such as Wi-Fi and GPS.
Embracing half duplexing in FDD
There is a third way that can reduce the strain on handset battery life by combining some of the best of both FDD and TDD: half-duplex FDD.
In a half-duplex FDD system, the transmit and receive bandwidth for a particular handset is scheduled in alternate time slots of the FDD frame structure. As a result, the base station can avoid asking a handset to transmit at high power while it is also trying to receive a weak signal from a distant base station. Unlike TDD networks, where all handsets have the same fixed time slots for transmit and receive, half-duplex FDD achieves half-duplex operation at each handset while maintaining full capacity at the network level: half of the handsets transmit in one time slot and the other half transmit in the next.
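A minimal sketch of that scheduling idea follows, assuming a toy base station that simply splits its handsets into two alternating groups; the function name, the even/odd split and the slot bookkeeping are invented for illustration and do not reflect any particular vendor's scheduler.

# Minimal sketch of half-duplex FDD scheduling as described above.
# Group assignment and slot counts are illustrative assumptions only.

def half_duplex_fdd_schedule(handsets, num_slots):
    """Split handsets into two groups that alternate between the uplink
    and downlink carriers, so each handset operates half-duplex while
    both FDD carriers stay occupied in every slot."""
    group_a = handsets[0::2]   # transmit in even slots, receive in odd slots
    group_b = handsets[1::2]   # the opposite pattern
    schedule = []
    for slot in range(num_slots):
        if slot % 2 == 0:
            schedule.append({"slot": slot, "uplink": group_a, "downlink": group_b})
        else:
            schedule.append({"slot": slot, "uplink": group_b, "downlink": group_a})
    return schedule

for entry in half_duplex_fdd_schedule(["UE1", "UE2", "UE3", "UE4"], num_slots=4):
    print(entry)

In every slot both the uplink and downlink carriers are in use, so the cell gives up no aggregate capacity, yet no individual handset ever transmits and receives at the same time.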
This brings some of the benefits of TDD into FDD systems. Base station manufacturers could easily implement half-duplex FDD in software today, without requiring operators to rip out and replace existing FDD infrastructure.
Combined with envelope-tracking-enabled PAs in the handset, which allow power consumption and RF performance to be traded off in software rather than fixed in the hardware design, shifting to half-duplex FDD could enable highly efficient RF front ends that transmit at full power with minimal noise interference. Unlocking this combination will simplify handset design, enhance wireless performance and increase battery life for users.
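As a rough illustration of why envelope tracking helps, the sketch below compares a fixed PA supply voltage, sized for the signal's peaks, against a supply that follows the instantaneous envelope. The test waveform and the simple "wasted headroom" efficiency model are invented for the example and stand in for a real PA characterisation.

# Idealised illustration of envelope tracking (ET): instead of holding
# the PA supply at a fixed voltage sized for the signal's peaks, the
# supply follows the instantaneous envelope, so less headroom is burned
# as heat. The waveform and the headroom model are invented for this example.
import numpy as np

t = np.linspace(0.0, 1.0, 1000)
# Toy modulated signal with a fluctuating envelope (high peak-to-average ratio).
envelope = 0.3 + 0.7 * np.abs(np.sin(2 * np.pi * 3 * t) * np.sin(2 * np.pi * 17 * t))

fixed_supply = np.full_like(envelope, envelope.max())      # sized for the peaks
et_supply = np.clip(envelope + 0.05, 0.0, envelope.max())  # tracks envelope + margin

wasted_fixed = np.mean(fixed_supply - envelope)   # average headroom lost as heat
wasted_et = np.mean(et_supply - envelope)

print(f"average wasted headroom, fixed supply: {wasted_fixed:.2f}")
print(f"average wasted headroom, ET supply:    {wasted_et:.2f}")

Because the tracking supply sits just above the envelope rather than at the worst-case peak, far less of the battery's energy is dissipated in the amplifier, and that margin is something software can tune per network rather than something fixed at design time.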