Edge computing has grown over the past several years into one of the most important trends in IT. It is both a continuation of digital transformation and a precursor to further digitalization. Other trends such as cloud computing, the Internet of Things (IoT), and real-time analytics all depend on the development and evolution of edge computing.
Gartner, the leading global research and advisory firm providing information, advice, and tools for leaders in IT, defines edge computing as “a part of a distributed computing topology in which information processing is located close to the edge where things and people produce or consume that information.” But, as with most formal definitions, this one is fairly abstract, and despite the hype there is no single precise definition of what edge computing actually is.
A basic definition of edge computing is that it is the practice of processing data near the edge of the network, where the data is generated, instead of in a centralized datacenter or in the cloud. This runs counter to the once widely held notion that all information would eventually live in the cloud. In today’s understanding, cloud and edge computing co-exist: according to IDC, a leading analytics firm, the two approaches complement each other and “interact in a smart and intelligent way”.

What edge computing does at a basic level is bring computation and data storage much closer to the devices where data is being gathered. Since the speed of data is ultimately limited by the speed of light (the book Flash Boys: A Wall Street Revolt by Michael Lewis goes into much more detail on this), the round trip to a central location hundreds or thousands of miles away adds meaningful delay. By performing computation at the edge, real-time data does not suffer from that latency, and in today’s world, especially in real-time analytics in certain sectors, microseconds can be crucial. Edge computing also reduces costs, because the amount of data that needs to be processed in a centralized or cloud-based location is significantly lower.
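The speed-of-light argument can be made concrete with a back-of-envelope calculation. The sketch below assumes light propagates through optical fiber at roughly two-thirds of its vacuum speed and ignores routing and processing overhead, so these numbers are theoretical floors, not real-world measurements:

```python
# Back-of-envelope: minimum network round-trip time imposed by the
# speed of light in fiber (~2/3 of c), ignoring routing/processing.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s in vacuum
FIBER_FACTOR = 2 / 3            # light travels at roughly 2/3 c in fiber

def min_round_trip_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time in milliseconds."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# A datacenter 2,000 km away vs. an edge node 10 km away:
print(f"{min_round_trip_ms(2000):.1f} ms")   # ≈ 20.0 ms
print(f"{min_round_trip_ms(10):.3f} ms")     # ≈ 0.100 ms
```

Even before queuing and processing delays, a distant datacenter costs tens of milliseconds per round trip, while a nearby edge node stays in the sub-millisecond range.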
What are the use cases?
Edge computing was developed in response to the exponential growth of IoT devices and the need for real-time analytics where performance and latency matter. Emerging technologies like the Internet of Things (IoT), autonomous vehicles, and augmented and virtual reality (AR/VR) require large amounts of data to be processed with low latency. Centralized cloud systems have limitations, and legacy CDNs have outdated architectures that cannot provide real-time visibility and control. In autonomous cars, software components are system-critical from both an operational and a security standpoint. Vehicles such as Teslas manage adequately today by updating their navigation or ADAS software overnight via cloud computing. However, as autonomous driving advances to Levels 4 and 5, vehicles will be driven by Artificial Intelligence (AI) and will need to respond to real-world data points generated by sensors. At these levels, the AI may have to decide whether to steer left or right to avoid hitting a pedestrian or animal, a real-time action that must happen regardless of internet access; when making such a decision, microseconds are crucial. In this sense, traditional cloud computing has its limitations. Today, it is no surprise that world-leading OEMs such as GM, Tesla, Ford, and Toyota, and leading technology firms such as Apple, are among the companies pouring significant research and development capital into edge AI.
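The constraint described above can be sketched in code: a safety-critical decision must come from a model running on the vehicle itself, within a hard latency budget, with no dependence on a network. This is a minimal illustrative sketch; the function names, sensor format, and 1 ms budget are assumptions for illustration, not any vendor's actual API:

```python
# Hypothetical sketch: at Level 4/5 autonomy, a steering decision cannot
# wait on the cloud. Inference runs on the vehicle (the edge) and must
# finish within a fixed latency budget regardless of connectivity.
import time

LATENCY_BUDGET_S = 0.001  # assumed 1 ms budget for a steering decision

def local_model_decide(sensor_frame: dict) -> str:
    """Stand-in for an on-board neural network inference."""
    return "steer_left" if sensor_frame.get("obstacle") == "right" else "steer_right"

def decide(sensor_frame: dict) -> str:
    start = time.perf_counter()
    action = local_model_decide(sensor_frame)   # runs on the vehicle itself
    elapsed = time.perf_counter() - start
    # A real system would enforce this deadline with a real-time scheduler.
    assert elapsed < LATENCY_BUDGET_S, "edge inference must meet its deadline"
    return action

print(decide({"obstacle": "right"}))  # steer_left
```

The point of the sketch is architectural: the cloud can still handle non-critical work like overnight model updates, but the decision loop itself lives entirely at the edge.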
Cloud computing vendors Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) are also increasingly taking their services to the edge to remain competitive. Amazon’s Echo is a prime example of edge computing technology in a consumer device. Up-and-coming companies such as Fastly, Cloudflare, and Edgecast are growing phenomenally in this vertical. Fastly’s edge computing platform offers services across content delivery, streaming, cloud security, and application delivery control; its clients include Slack, Twitter, Udemy, reddit, Accenture, the Financial Times, Pinterest, and many more. Fastly estimates the total addressable market for this vertical at approximately $18 billion in 2019, growing to $35.8 billion in 2022 at a rate of 25% annually. NVIDIA has announced NVIDIA EGX, an accelerated computing platform that enables companies to perform low-latency AI at the edge: EGX perceives, understands, and acts in real time on continuous streaming data from 5G base stations, warehouses, retail stores, factories, and beyond.
Having discussed the advantages, it’s easy to think edge computing magically solves every issue cloud computing has, but it brings its own challenges due to the highly distributed nature of edge systems. Each edge node needs to share information with other nodes, and doing so consistently is hard: the system should allow nodes to work independently while at the same time keeping them in sync with one another. This is known as the distribution, consistency, and synchronization problem. Another challenge is that the computers at the edge are not high-powered, so more computing capacity is needed overall, which in turn demands more electricity or better battery solutions at the edge.
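To make the synchronization problem concrete, here is a minimal sketch of one common approach: each node keeps a timestamped key-value store and merges peer state with a last-write-wins rule. Real systems use vector clocks or CRDTs to handle clock skew and concurrent writes; this simplified version is illustrative, not a production design:

```python
# A minimal sketch of eventual consistency between edge nodes:
# each node stores (timestamp, value) pairs and, when merging with a
# peer, keeps whichever entry has the newer timestamp (last-write-wins).
class EdgeNode:
    def __init__(self, name: str):
        self.name = name
        self.store: dict = {}  # key -> (timestamp, value)

    def write(self, key, value, ts):
        self.store[key] = (ts, value)

    def merge(self, peer: "EdgeNode"):
        """Adopt any peer entry that is newer than our own."""
        for key, (ts, value) in peer.store.items():
            if key not in self.store or ts > self.store[key][0]:
                self.store[key] = (ts, value)

a, b = EdgeNode("a"), EdgeNode("b")
a.write("temp", 21.5, ts=1)
b.write("temp", 22.0, ts=2)   # b holds the newer reading
a.merge(b)                     # a converges to b's value
print(a.store["temp"])         # (2, 22.0)
```

The design choice here is the trade-off the text describes: nodes keep working independently (writes never block on the network) at the cost of only eventual, not immediate, agreement.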
To summarize, edge computing is not replacing the cloud; rather, in some crucial instances it brings cloud computing closer to where data is generated, combining it with smaller but more numerous on-premise machines. For latency-sensitive, compute-heavy workloads, having many computers close to the data is far preferable to relying on large centralized ones. The main advantages include speed, reduced costs, and increased security. While this brings its own set of challenges, edge computing will be a crucial necessity for the new technological trends, and the companies that adopt it early will reap the rewards.