
What Is Edge Computing: The Network Edge Explained


Written by Mauricio Preuss (CEO & Co-Founder)

Reviewed by Jackie Leavitt (Chief Editor)

Last Updated: February 11, 2024


We are firmly in the cloud computing era, but more and more computing is being pushed to the “edge.” If you’re unfamiliar with the jargon, you may be wondering what edge computing is and how it is reshaping data and networks. In this article, we’ll explain this next trend in big data and break down what edge computing means in practice.

Like the metaphorical cloud and the Internet of Things, the edge is a buzzword meaning everything and nothing. Over the years, we’ve seen paradigm shifts in computing workloads, going from data centers to the cloud and from the cloud to the logical edge of networks.

At its simplest, that’s what edge computing is: processing and analyzing data at the network edge, as close as possible to the point of collection, so that the data becomes actionable.

It’s how a Nest thermostat knows to adjust the temperature at a certain time, how an Alexa device tells you the weather forecast for the day or how a self-driving car doesn’t run over a pedestrian or hop a curb Grand Theft Auto style.

We’re going to explain edge computing as it relates to cloud computing, as well as the similar fog computing, and give a few examples.

What Is Edge Computing?

Edge computing isn’t a new concept, but the trends of cloud computing and machine learning have helped propel it. Edge computing is a form of distributed computing, which dates back to the 1960s. Distributed computing covers a broad range of technologies, but its earliest success stories include local area networks and ARPANET, the precursor to the modern internet.

The edge computing we know today dates back to the late 1990s, when Akamai launched its content delivery network aimed at resolving web congestion. Akamai, like Cloudflare, operates an edge network with edge servers to deliver content closer to the source of the request.

Edge computing is still considered a new paradigm, despite its history. That said, it continues to address the same problem: proximity. Moving the computing workload closer to the consumer reduces latency, bandwidth usage and overhead for the centralized data center, which is why it is a growing trend in big data.

The world wide web and peer-to-peer networks can be considered distributed computing applications. A decentralized, distributed computing paradigm is also fundamental to torrenting and blockchain. To learn more about blockchain and its underlying principle, cryptography, read our bitcoin guide and our description of encryption.

Benefits of Edge Computing

By solving the proximity problem, you solve the latency problem. The on-device processing approach ensures that only non-critical data is sent over the network and that critical data can be acted upon immediately. That is important for latency-sensitive applications, such as autonomous vehicles, where having to wait milliseconds may be untenable.

The decentralized approach of edge computing also reduces bandwidth consumption. Data processing starts at the point of collection, and only the data that needs to be stored is sent to the cloud. That makes edge computing more efficient and scalable and reduces network load.
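To make that concrete, here’s a minimal sketch of the pattern in Python. The threshold, alarm and readings are hypothetical stand-ins; the point is that raw readings are handled where they’re collected, and only a compact summary ever crosses the network.

```python
import statistics

TEMP_ALERT_THRESHOLD = 40.0  # hypothetical limit, degrees Celsius


def process_on_edge(readings: list[float]) -> dict:
    """Runs on the edge device: act locally, summarize for the cloud."""
    # Critical path: react immediately, no network round trip required.
    if max(readings) > TEMP_ALERT_THRESHOLD:
        trigger_local_alarm()

    # Non-critical path: only this small aggregate is sent upstream.
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
    }


def trigger_local_alarm() -> None:
    print("Over-temperature detected: responding locally, not via the cloud")


# Thousands of raw samples can stay on the device; the cloud sees three numbers.
print(process_on_edge([21.4, 22.0, 21.8, 41.2, 22.1]))
# prints the alarm line, then {'count': 5, 'mean': 25.7, 'max': 41.2}
```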

For instance, if you have multiple security cameras streaming 1080p video to the cloud around the clock, that becomes a bandwidth problem. It’s impractical and cost prohibitive.
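Some rough, back-of-the-envelope numbers show why. Assuming eight cameras and a typical 1080p stream of around 5 Mbps (both figures are illustrative assumptions, not measurements):

```python
cameras = 8                  # hypothetical deployment
mbps_per_camera = 5          # assumed bitrate for a 1080p H.264 stream
seconds_per_month = 60 * 60 * 24 * 30

total_mbps = cameras * mbps_per_camera                    # 40 Mbps, around the clock
tb_per_month = total_mbps * seconds_per_month / 8 / 1e6   # megabits -> terabytes

print(f"{total_mbps} Mbps sustained, ~{tb_per_month:.0f} TB uploaded per month")
# -> 40 Mbps sustained, ~13 TB uploaded per month
```

Processed at the edge, most of that footage never needs to leave the building; only the events of interest do.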

Edge computing also improves resilience to outages and intermittent connectivity because processing doesn’t rely solely on the cloud. That helps avoid unplanned downtime and keeps operations running reliably in remote locations.

Theoretically, there is an additional layer of security with edge computing, too, because much of the data from IoT devices never traverses the network. Instead, it stays at its point of creation. Less data in the cloud means less data exposed in a breach or leak.

That said, there is concern about edge devices themselves being vulnerable. There is a history of less-than-secure IoT devices, and that says nothing of potential privacy concerns. Many consider such devices little more than expensive wiretaps.

Encryption and VPN tunneling will be increasingly important as edge computing grows. To learn more, read our explanation of virtual private networks and our guide to cybercrime.

Examples of Edge Computing

Edge computing has emerged with the proliferation of IoT devices and has been deployed in a variety of circumstances. The network edge depends on the use case: it could be a cell tower, a smartphone, an IoT device or a self-driving car.

If it helps, you could think of it as a baseball field. At the pitcher’s mound is the centralized cloud and, farther out in the outfield, is where the “edge” resides. Along that edge are edge devices. In between, there may be edge servers, edge gateways or micro data centers that facilitate local processing and reduce round-trip times to the cloud.

Autonomous vehicles may be the best example of why edge computing is important. Self-driving cars are replete with hundreds of sensors collecting data, and for processes such as collision detection, the vehicle can’t wait for a round trip to the cloud. It has to process that data instantly and make a decision.
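A rough calculation puts that in perspective (the speed and round-trip latency here are illustrative assumptions, not measurements):

```python
speed_kmh = 100             # assumed highway speed
cloud_round_trip_ms = 100   # assumed round-trip latency to a distant data center

meters_per_ms = speed_kmh * 1000 / 3600 / 1000  # ~0.028 m per millisecond
blind_distance = meters_per_ms * cloud_round_trip_ms

print(f"~{blind_distance:.1f} m traveled while waiting on the cloud")
# -> ~2.8 m traveled while waiting on the cloud
```

Nearly three meters of “blind” travel per decision is exactly why that processing has to happen on board.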

What’s more, industry giants, such as Intel, estimated that self-driving cars would produce 40TB of data a day by 2020. Toyota predicts that the car-to-cloud data stream will reach 10 exabytes per month by 2025. All that data being beamed to the cloud is expensive and unnecessary, not to mention unsustainable for current networks.

Another example is CDNs, such as Cloudflare and Akamai. They provide an edge network of global data centers and edge servers that store cached versions of content and deliver them to visitors based on their geographic location. That reduces latency and puts an entry point for data closer to the user.
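As a simplified illustration of the geographic routing idea, here’s a sketch that picks the nearest point of presence for a visitor. Real CDNs steer traffic with anycast routing and DNS rather than a loop like this, and the locations below are made up for the example.

```python
import math

# Hypothetical edge locations: (name, latitude, longitude)
EDGE_POPS = [
    ("frankfurt", 50.1, 8.7),
    ("virginia", 38.9, -77.0),
    ("singapore", 1.35, 103.8),
]


def nearest_pop(user_lat: float, user_lon: float) -> str:
    """Euclidean distance on lat/lon: crude, but enough for this sketch."""
    return min(
        EDGE_POPS,
        key=lambda pop: math.hypot(pop[1] - user_lat, pop[2] - user_lon),
    )[0]


# A visitor in Paris is served from the Frankfurt cache, not Singapore.
print(nearest_pop(48.9, 2.35))  # -> frankfurt
```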

There’s also the IoT, which includes devices such as Nest thermostats, security cameras, smart appliances, voice assistants like Google Home or Alexa, and all manner of sensors. Edge computing allows those devices to learn behavior and react. Your Nest thermostat adjusting the temperature when you come home from work is one example of that.

The Cloud vs. The Fog vs. The Edge

It’s been predicted that edge computing will replace cloud computing at some point. While edge computing could theoretically eclipse cloud computing, the cloud isn’t going anywhere. As edge technologies mature, the cloud will grow alongside them, establishing a continuum between the two domains.

That said, we may see a decoupling of functions from the cloud. Gartner predicts that the traditional data center will be dead by 2025, with 80 percent of data centers shutting down in favor of infrastructures better able to deliver services closer to the consumer. That is driven by workload placement, which edge computing is equipped to address.

Edge computing has also given rise to fog computing, which will likely grow in step with it. Like the edge, fog computing moves the workload closer to the network edge, reducing data travel, latency and bandwidth. Whereas edge computing moves processing to the devices themselves, though, fog computing happens across one or more nodes in the network.

In a way, fog is a standard and the edge is a concept built on that standard: fog handles the workload at the network level, across nodes such as gateways and routers, while the edge takes it all the way down to the device level.

Final Thoughts

If edge computing sounds like the computing we used to do on those old, beige boxes with CRT monitors, you’re not far off. The difference is in technologies such as machine learning, AI and a cloud backend. To process data faster and improve the end-user experience, the pendulum has to swing back the other way, so to speak.


Hopefully, we’ve helped distinguish edge computing from cloud computing and made clear why both are important. To learn more about all things cloud, follow our state of the cloud. As usual, drop us a comment or tweet, and thanks for reading.
