The perfect pair: Edge computing + 5G
by Blogs & Opinions
What was once believed to be the foreseeable future is happening right now, whether we are ready for it or not. With the emerging 5G ecosystem, we are in the middle of a data revolution that has operators scrambling to overcome the challenges associated with having a centralized cloud computing environment. Edge computing promises to remedy these challenges (addressed later in this blog) by creating a distributed network architecture to offload tasks from the core.
There are many facets to edge computing; however, when we discuss digital transformation today, the hot topic, as in real estate, is location, location, location. Many of my recent conversations about the growth of 5G and the Internet of Things (IoT) focus on the geographic location where applications will live, and the edge of the network is where everything is going.
The birth of edge computing
Cloud computing architecture has been around far longer than edge. Historically, all data was sent to the cloud to be processed and analyzed in one centralized location. Although cloud computing brings many benefits, such as flexibility and scalability, large amounts of data cannot always be migrated or moved quickly to the cloud, so alternate strategies are needed. Thus edge computing was born: an efficient way to handle the continual increase in data volume.
Why is location so important?
Organizations need a more seamless way to access data, interact with it, and make faster decisions in real time. Enterprises, for example, require more intelligent placement of workloads and division of tasks among multiple computers and access points connected via a single network to speed up those tasks. A distributed cloud in your network infrastructure puts less strain on the core network and provides the means to deploy ultra-low-latency applications in remote locations.
The optimal time to adopt a distributed model, including applications at the far edge of the network, is when endpoint devices generate and consume more and more data. The emergence of 5G brings with it surges in data traffic and the speed demands of next-generation networks. Edge computing allows service providers to process much of that data locally, at a cell tower or as close to end users as a street cabinet. In addition, edge computing absorbs traffic during peak times; information is sent to the central cloud only when the edge becomes overly congested.
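The offloading logic described above can be sketched in a few lines. This is a minimal illustration, not a real operator's system: the function name, the congestion threshold, and the load metric are all assumptions made for the example.

```python
# Illustrative sketch of an edge node's offloading decision:
# process traffic locally while the site has headroom, and fall
# back to the central cloud only when the edge is overly congested.
# The threshold below is a made-up assumption, not a real default.

CONGESTION_THRESHOLD = 0.8  # fraction of edge-site capacity in use

def choose_site(local_load: float) -> str:
    """Decide where a payload arriving at an edge node is processed.

    local_load: current utilization of the edge site (0.0 to 1.0).
    Returns "edge" when the site has spare capacity, otherwise
    "central-cloud" to offload the work to the core network.
    """
    if local_load < CONGESTION_THRESHOLD:
        return "edge"          # handle it near the cell tower
    return "central-cloud"     # edge is congested; send it upstream

print(choose_site(0.4))   # edge
print(choose_site(0.95))  # central-cloud
```

Real deployments would factor in payload size, latency budgets, and backhaul conditions, but the core idea is the same: keep work local by default and treat the central cloud as the overflow path.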
Why 5G and edge computing are perfect together
The shift from 4G to 5G networks is in its early stages, yet it is becoming a reality faster than we can imagine with the help of new technologies. Thanks to 5G, people have warmed up to edge computing. The reason is that most 5G use cases require far more computing power at the network edge. With 5G come more smartphones, more data, and more network congestion, and unfortunately, centralized cloud computing may be too slow for devices that need data processed in milliseconds.
Take online gaming, for example. Edge computing promises much better in-game experiences by reducing latency. Since low latency is so fundamental to online games, AR, and VR, the future of gaming is where you think it is: at the edge.
The potential for edge computing to transform the telecommunications industry by leveraging existing cellular networks is also revolutionary. As 5G networks roll out more fully worldwide, and with implementations by big players like AT&T and Verizon already underway, mobile edge computing will play a critical role in a telecommunication company's next-generation infrastructure strategy, regardless of business size.
Although the idea may seem like a huge leap, edge computing promises substantial economic benefits by reducing data center costs. Data center cost reduction alone is enough to drive many organizations to move compute towards the edge as they transition to a network architecture that supports 5G.
Edge and increased cost savings
Having a centralized architecture model means storage is located close to compute resources. That works well for companies looking for pay-per-use billing, even if it means lower performance for end users.
However, when vast loads of data are transferred on or off the cloud, data transfer becomes very expensive and, as a result, much less appealing. Costs associated with bandwidth, the distance data travels, and the resources needed to monitor and configure transfers all add up quickly. The "edge" equation is quite simple: reduce the data volume sent to the cloud and you automatically lower your costs.
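The "edge" equation above can be made concrete with simple arithmetic. The per-gigabyte rate and data volumes here are invented for illustration, not any provider's actual pricing: the point is only that transfer cost scales with the volume shipped to the cloud, so filtering at the edge cuts it proportionally.

```python
# Back-of-the-envelope sketch of the "edge equation": the less data
# you send to the cloud, the less you pay for transfer.
# RATE_PER_GB is a hypothetical figure, not a real provider's price.

RATE_PER_GB = 0.05  # assumed cloud transfer cost, USD per GB

def monthly_transfer_cost(raw_gb: float, edge_filter_ratio: float) -> float:
    """Cost of shipping data to the cloud after edge-side filtering.

    edge_filter_ratio: fraction of raw data aggregated or discarded
    at the edge before upload (0.0 = send everything, 0.9 = send 10%).
    """
    shipped_gb = raw_gb * (1.0 - edge_filter_ratio)
    return shipped_gb * RATE_PER_GB

print(monthly_transfer_cost(10_000, 0.0))  # no edge filtering
print(monthly_transfer_cost(10_000, 0.9))  # 90% handled at the edge
```

With these assumed numbers, pre-processing 90% of the data at the edge cuts the transfer bill from roughly $500 to roughly $50 a month; the exact figures matter less than the linear relationship.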
These cost savings are pushing companies to rethink their IT infrastructure approach and incorporate edge into their traditional data processing strategies. With edge server hardware, part of the data processing can be done close to the users rather than automatically sent to the cloud for analysis. This distributed offloading reduces capital expenses as well as bandwidth costs. And by shortening the distance data travels before it is processed, operators free up money to spend on modernizing legacy systems that were previously considered barriers to infrastructure innovation.
What does this mean for cloud?
Finally, this brings me to the question of whether the cloud is disappearing with the rise of edge computing. I believe the answer is 'no.' It simply means the cloud is coming closer to us, the consumers of the data. An exciting challenge for forward-thinking companies will be how to deploy cloud and edge computing technologies together.