Will Millimeter Wave Work for 5G?

By Jaime Fink

With the recent hype surrounding the practicalities of delivering 5G, and the push to crown millimeter wave (mmWave) wireless as the de facto approach, the natural question many are asking is “will it work?”

While there can be no definitive answer until the trials are completed and the costs are understood, it’s a safe bet mmWave will have a role to play in 5G infrastructure. The vast amounts of new spectrum that the mmWave bands offer are, in themselves, an attractive proposition.

Plus, mmWave is already delivering fiber-like speeds in line-of-sight, short-range, point-to-point backhaul networks between buildings and towers, and recent cost and capacity improvements will surely continue. However, when it comes to consumer fixed wireless, it is not that simple.

Today, over 81 percent of the United States’ population resides in suburban and urban areas. These regions contain the vast majority of underserved residences, despite the better-publicized digital divide in rural areas. To date, fiber has failed to cost-effectively span the ‘last mile’ connection to homes.

Unfortunately, while numerous obstacles limit the viability of fiber to the home, new mmWave solutions face considerable obstacles of their own in urban and suburban environments. This may make them less cost effective than they first appear.

Although mmWave fixed wireless is significantly faster to deploy than fiber, its costs are still nearly equivalent.

The move towards 5G fixed wireless is progressing at a frantic pace. Just three years ago, New York University research by Professor Ted Rappaport in urban New York City explained that “future 28 GHz base stations will be pico cells of 100-200 meters of range, and phased antenna arrays capable of beam steering will be needed to overcome the high path loss and improve link quality.” Yet Lowell McAdam, CEO of Verizon, recently projected cell sites spaced 1 km apart for the company’s ‘wireless fiber’ 5G project.

This reveals a significant discrepancy in how much coverage mmWave base stations can offer, and, of course, how much they will cost. Much has been made of how mmWave can work in urban environments, and as the New York analysis demonstrated, at 200 meters in a reflective urban environment, modern phased-array and massive MIMO technologies can thrive. But in real-world suburban residential areas, foliage is widespread, and it impairs high-frequency mmWave signals far more than their sub-6 GHz counterparts, the workhorse bands of the mobile and Wi-Fi industries.
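The scale of the frequency penalty can be sketched with the Friis free-space path-loss formula. The 200 m distance and 5.8 GHz reference band below are illustrative choices, and real suburban links add foliage and building losses on top of this baseline:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Friis free-space path loss in dB."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# The same 200 m link at a sub-6 GHz band vs a mmWave band
loss_5ghz = fspl_db(200, 5.8e9)
loss_28ghz = fspl_db(200, 28e9)
print(f"5.8 GHz: {loss_5ghz:.1f} dB, 28 GHz: {loss_28ghz:.1f} dB")
print(f"extra loss at 28 GHz: {loss_28ghz - loss_5ghz:.1f} dB")  # ~13.7 dB
```

That roughly 14 dB gap exists before any obstruction is considered, which is why the 28 GHz pico cells in the NYU study lean so heavily on beam steering to close the link.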

For mmWave bands, this will significantly limit coverage in typical single-family-home areas and reduce the number of homes served per base station, which in turn significantly increases the cost of covering an area. At longer distances, with no ability to penetrate foliage and limited ability to generate multipath, line of sight becomes a virtual necessity, limiting the number of homes that can be connected.
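The cost impact of shrinking cell radius follows directly from geometry: homes served per base station scale with the square of the usable range. A back-of-the-envelope sketch, using a purely hypothetical suburban density of 800 homes per square kilometer (not a figure from this article):

```python
import math

def homes_per_cell(radius_m: float, homes_per_km2: float) -> float:
    """Homes inside one circular cell, ignoring line-of-sight blockage."""
    area_km2 = math.pi * (radius_m / 1000.0) ** 2
    return area_km2 * homes_per_km2

DENSITY = 800  # homes per km^2 -- an illustrative assumption
# Rappaport's 200 m pico-cell range vs Verizon's projected 1 km spacing
for radius in (200, 1000):
    print(f"{radius} m cell: ~{homes_per_cell(radius, DENSITY):.0f} homes")
# A 1 km cell covers 25x the area (and homes) of a 200 m cell
```

Whatever the true density, the 25x area ratio between the two projected cell sizes stands, which is the crux of the coverage-cost discrepancy described above.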

The only viable solution is to shorten links to overcome these environmental challenges, resulting in a much denser constellation of base stations closer to households. This naturally raises the question of whether sub-6 GHz frequencies are a better fit for high-density suburban environments, improving coverage while reducing costs.

Over 14,000 wireless ISPs globally have quietly proven in rural areas that the sub-6 GHz bands are extremely effective at delivering fixed wireless services over long distances. Today, the latest generation of 5 GHz technology already delivers over 1.2 Gbps of bandwidth.

Now, with an influx of technology innovations familiar to the 4G/5G community, including low-cost 8x8 massive MIMO, beamforming, and new TDMA+GPS synchronization techniques, this wireless architecture is poised, for the first time, for large-scale multipoint deployments in high-density neighborhoods.
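One way to picture how GPS-synchronized TDMA enables aggressive spectrum reuse: when every base station transmits and receives in the same GPS-disciplined time slots, co-located sectors only need to alternate between two channels to avoid interfering with one another. A minimal sketch, where the four 90-degree sectors and channel names are assumptions for illustration, not a vendor's actual plan:

```python
CHANNELS = ("ch_A", "ch_B")

def sector_channel(azimuth_deg: float) -> str:
    """Alternate two channels around four 90-degree sectors (A, B, A, B)
    so no two adjacent sectors ever share a channel; GPS timing keeps
    all sectors' transmit/receive slots aligned."""
    return CHANNELS[int(azimuth_deg // 90) % 2]

for azimuth in (0, 90, 180, 270):
    print(f"sector at {azimuth:3d} deg -> {sector_channel(azimuth)}")
```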

Unlike the previous tower-based model, a new ‘MicroPoP’ model places base stations at lower-elevation rooftop or utility-pole locations in high-density residential neighborhoods. In real-world suburban deployments, subscriber homes within 300 meters reliably achieve full-speed connectivity of 300 Mbps+, and homes out to 500 meters achieve 200 Mbps+. And unlike relatively high-cost 28 GHz equipment, sub-6 GHz equipment comes in below $100 per subscriber, making it a cost-effective alternative.

The unlicensed 5 GHz spectrum offers many advantages. Its signal propagation characteristics give it a better ability to penetrate obstructions than higher frequencies. It also reduces costs, thanks to license-free spectrum, fewer PoP (Point of Presence) locations, and less base station equipment. However, unlike protected licensed spectrum, the 5 GHz band is subject to potential interference from Wi-Fi. At first glance this may seem concerning, but with advanced spectrum-reuse techniques leveraging TDMA+GPS synchronization, it is possible to run an entire network on only two operating channels, a fraction of the available unlicensed spectrum. Furthermore, because interfering low-power Wi-Fi router signals are attenuated by house walls and roofing materials, it is easy to attain the 10-25 dB of signal-to-noise ratio needed to deliver a quality service.
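The attenuation argument can be made concrete with a rough interference budget. Every level below is an illustrative assumption, not a measured value:

```python
# Illustrative signal-to-interference budget (all levels are assumptions)
serving_dbm = -60      # assumed received level from a nearby MicroPoP base station
wifi_indoor_dbm = -65  # assumed Wi-Fi router level measured just inside the house
wall_loss_db = 15      # assumed attenuation through walls and roofing materials

interference_dbm = wifi_indoor_dbm - wall_loss_db  # what actually leaks outdoors
sir_db = serving_dbm - interference_dbm
print(f"signal-to-interference ratio ~ {sir_db} dB")  # 20 dB: inside the 10-25 dB range cited
```

Even with these rough numbers, the wall loss alone moves an otherwise comparable Wi-Fi signal well below the serving signal, which is why indoor routers rarely degrade an outdoor fixed wireless link.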

It’s abundantly clear the entire wireless industry is making enormous technological strides, and that the ‘wireless first’ approach is gaining ground over fiber and DSL networks. By carefully deploying and comparing different models, we’re now able to witness the impact of different spectrum in varying deployment environments, applications, and resulting costs. This will enable the industry to make the most of each technology and improve the quality of global internet access.