The popular Internet of Things (IoT) vision of the future includes billions of communicating devices, everything from cars to wearable health monitors to vending machines. While some of these devices will use wired connections, most will rely on wireless connectivity. Today, most people connect their personal devices to 4G mobile networks, supplemented by Wi-Fi offload (shifting 3G/4G mobile users onto a Wi-Fi network to carry some of the load). By the end of the decade, however, 4G will no longer be able to deliver the quality of experience mobile device users expect.
Pin blame on the world’s appetite for the Internet: estimates are that every minute more than 5 million videos are seen on YouTube, almost half a million tweets appear on Twitter and nearly 70,000 photos are posted on Instagram. The data explosion will require a revolutionary new network backbone. Called 5G, it will be—as the moniker suggests—the fifth generation of wireless networks.
At their core, 5G networks will provide greater bandwidth and lower latency (the time it takes one device to send a packet of data to another device), along with the ability to handle data much more efficiently than current 4G networks.
Is it too soon to be discussing 5G? The answer is no. From 2G in the early 1990s to the 4G LTE networks that did not emerge until late 2009, a new generation has appeared approximately every 10 years. Therefore, 5G fits a timeline that calls for a new generation network to begin to roll out in 2020.
5G will integrate networking, computing and storage resources into a unified infrastructure. To achieve this, the 5G Infrastructure Public Private Partnership (5G PPP) has been created by the European Union Commission and industry manufacturers, telecommunications operators, service providers and researchers. 5G PPP is tasked with delivering solutions, architectures, technologies and standards for the ubiquitous next-generation communication infrastructure of the coming decade.
Already, 5G PPP has outlined several goals for the future wireless network. These include:
- The ability to handle traffic from more than 100 billion devices. This will require a network with a capacity several thousand times that of today's networks.
- The capacity to provide peak end-user data rates of at least 10 Gbps, with generally available end-user data rates of at least 100 Mbps, regardless of a user's location. Applications that require these speeds include Ultra-High Definition (UHD) multimedia streaming. 5G should also accelerate delivery of less data-intensive high-definition (HD) video; a movie, for example, should be accessible and ready for viewing within seconds after the user presses play.
- The possibility of latency as low as 1 millisecond when needed. Low latency is needed to support emergency services, cloud-based data storage and retrieval, remote surgery, online gaming, and safety in autonomous vehicle applications. With 4G, latency is around 50 milliseconds.
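To put the 1-millisecond target in perspective, here is a back-of-the-envelope sketch (an illustration, not from the article): even ignoring all processing and queuing delays, light-speed propagation alone bounds how far a packet can travel within a latency budget.

```python
# Illustrative sketch: propagation-only bound on distance for a latency budget.
# Real networks add processing, queuing, and retransmission delays on top.

C = 299_792_458  # speed of light in vacuum, m/s

def max_one_way_distance_km(latency_s: float) -> float:
    """Upper bound on one-way propagation distance (km) within a latency budget."""
    return C * latency_s / 1000.0

# A 1 ms budget allows at most ~300 km of pure propagation;
# a 50 ms budget (typical of 4G) allows ~15,000 km.
budget_5g_km = max_one_way_distance_km(0.001)
budget_4g_km = max_one_way_distance_km(0.050)
```

This is why the lowest-latency 5G applications are expected to be served by nearby infrastructure rather than distant data centers.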
Expectations are that 5G networks will include new local area radio access technologies, which will co‐exist with current 4G/LTE systems. The 5G network will be a conglomeration of different networks optimized for specific purposes. For example, one network could be in place to handle mobile phone calls, a second network with a higher data rate could target transmission of UHD video, a third network could deal with car-to-car communication, and a fourth network, one with a lower data rate and low energy use, could be designed for sensor networks. All of these networks would have different quality-of-service requirements and each will use the wireless technology best suited for a particular application.
Enabling technologies, including higher-frequency millimeter wave (mmWave) solutions, are expected to play an important role in delivering cost-effective gigabit data rates to users throughout an operator's mobile network. Millimeter-wave transmission is so called because radio waves in the 30-to-300-Gigahertz (GHz) portion of the electromagnetic spectrum have wavelengths of 1 to 10 millimeters.
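The naming follows directly from the relationship between frequency and wavelength, which a short sketch (not from the article) makes concrete:

```python
# Illustrative check of the frequency-to-wavelength relationship behind
# the "millimeter wave" name: wavelength = c / frequency.

C = 299_792_458  # speed of light in vacuum, m/s

def wavelength_mm(freq_hz: float) -> float:
    """Free-space wavelength in millimeters for a given frequency."""
    return C / freq_hz * 1000.0

# 30 GHz -> ~10 mm and 300 GHz -> ~1 mm, hence "millimeter wave";
# a 73 GHz carrier works out to roughly 4.1 mm.
```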
At National Instruments' (NI) Week 2015 in Austin, Texas, in early August, Nokia Networks demonstrated a prototype mmWave system capable of transmitting 10 Gbps over 200 meters using a 73 GHz carrier.
While millimeter wave bands are expected to provide 10 times more bandwidth than 4G cellular bands, drawbacks do exist: above 3 GHz, signals begin to dissipate quickly and become more directional, resulting in a shorter effective range. Millimeter-wave frequencies also do not pass well through solid objects, and the signals can be blocked by buildings. What is more, they are adversely affected by inclement weather, so external conditions such as rainfall, snowfall, and fog must be taken into consideration.
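The range penalty can be sketched with the standard Friis free-space path loss formula (an illustration under idealized assumptions; it omits the rain, foliage, and obstruction losses described above, which make real mmWave conditions worse):

```python
import math

# Free-space path loss (Friis) sketch, illustrating why higher-frequency
# carriers have shorter effective range at the same transmit power:
#   FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi / c)

C = 299_792_458  # speed of light in vacuum, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB between isotropic antennas."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

# At the same 200 m distance, a 73 GHz carrier suffers roughly 26 dB
# more free-space loss than a 3.5 GHz carrier -- a factor of more than
# 400 in received power -- before any weather or blockage effects.
extra_loss_db = fspl_db(200, 73e9) - fspl_db(200, 3.5e9)
```

That power gap is a large part of why mmWave deployments lean on dense small cells and high-gain directional antennas.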
As a result, high-frequency communication is not suitable for providing umbrella coverage and 5G networks operating in the millimeter wave region will not fully replace the microwave transmission band allocations currently in use. However, millimeter wave spectrum can help carriers cope with large volumes of small cell traffic. 5G networks adopting this technology will most likely use small base stations positioned closely together.
Advanced multiple input-multiple output (MIMO) techniques, which use multiple antennas and receivers on base stations and mobile units to carry several independent streams of data in parallel, potentially could overcome millimeter wave issues. MIMO first made its way into the consumer mindset with the release of the Wi-Fi standard 802.11n, in which up to four separate physical transmit and receive antennas carry independent data that are then aggregated in the modulation/demodulation process. As a result, 802.11n improved the maximum single-channel data rate to over 100 Mbps.
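A rough way to see why parallel spatial streams multiply throughput is an idealized Shannon-capacity sketch (an illustration assuming independent, equal-quality spatial channels, which real antennas and channels only approximate):

```python
import math

def ideal_mimo_capacity_bps(n_streams: int, bandwidth_hz: float,
                            snr_linear: float) -> float:
    """Aggregate capacity of n ideal, independent spatial streams,
    each treated as a Shannon channel: n * B * log2(1 + SNR).
    Real systems fall short of this due to channel correlation,
    interference, and hardware limits."""
    return n_streams * bandwidth_hz * math.log2(1 + snr_linear)

# One 20 MHz stream at 20 dB SNR (snr_linear = 100) tops out near
# 133 Mbps in this model; four such streams approach ~533 Mbps.
single = ideal_mimo_capacity_bps(1, 20e6, 100)
quad = ideal_mimo_capacity_bps(4, 20e6, 100)
```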
Today's MIMO solutions configure antennas to form beams in one direction—horizontally—allowing multiple users to each receive a signal from the antenna. This effectively allows a cell's capacity to increase because users in the cell no longer compete with each other. With the future introduction of Full Dimension MIMO (FD-MIMO), wireless signals can be adaptively fed to an array of transmission antennas to form virtual beams that can lock in on multiple receivers in three dimensions. This mitigates interference from overlapping simultaneous transmissions and increases the power of the signal that reaches the target.
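The beam-steering idea behind these techniques can be sketched for a simple uniform linear array (an illustration only; the element count and spacing here are assumptions, not any vendor's design): applying a per-element phase shift makes the array's emissions add coherently in one chosen direction.

```python
import cmath
import math

def steering_weights(n_elems: int, spacing_wl: float, theta_rad: float):
    """Per-element phase weights that point a uniform linear array's
    main beam toward angle theta (spacing given in wavelengths)."""
    return [cmath.exp(-2j * math.pi * i * spacing_wl * math.sin(theta_rad))
            for i in range(n_elems)]

def array_factor(weights, spacing_wl: float, theta_rad: float) -> float:
    """Magnitude of the array's combined response in direction theta."""
    return abs(sum(w * cmath.exp(2j * math.pi * i * spacing_wl * math.sin(theta_rad))
                   for i, w in enumerate(weights)))

# With 8 half-wavelength-spaced elements steered toward 30 degrees,
# all elements add in phase at 30 degrees (coherent gain of 8),
# while the response toward other angles is much weaker.
w = steering_weights(8, 0.5, math.radians(30))
```

FD-MIMO extends the same principle to two-dimensional antenna panels, so beams can be steered in both azimuth and elevation.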
At NI Week, Samsung demonstrated a base station that transmitted simultaneously at different data rates to four separate receivers. For the demo, the base station transmitted at 3.5 GHz, although production transmitters will likely use tens-of-gigahertz carriers. Samsung’s 3D beamforming algorithms increased data throughput from 2 Mbps to more than 25 Mbps per user.
By using evolving 4G/LTE technology, small cells, and well-integrated Wi-Fi networks, engineers believe sufficient 4G capacity exists through the end of this decade to achieve a minimum downlink user data rate of 10 Mbps. Beyond 2020, however, 4G will begin to run out of steam. The first large-scale tests of networks running at the high frequencies intended for the 5G wireless era are expected to take place at the 2020 Olympics in Tokyo.