What Is Edge Computing, and Why Does It Matter?


Cloud computing is a bigger deal than most people realize. Its influence isn’t just felt in the business world, where it has displaced traditional on-premises servers with nimbler, off-site alternatives. Even the layperson shuffles bits to the ethereal data center in the sky, thanks to services like Google Photos and Netflix. But is another revolution afoot?

We’re talking about edge computing. This emerging paradigm in IT aims to bring distant data centers closer to the people who actually use them. It’s particularly well suited to time-critical applications for which low latency is a must. Here’s what you need to know.

In the Beginning, There Was the Server


For edge computing to make sense, it helps to put it in historical context, so we’ll start at the very beginning.

Corporate IT used to be a static affair. People worked in vast cubicle farms, toiling under the harsh glare of halogen light. It made sense for their data and business-critical applications to be located nearby. Businesses would shove servers into well-ventilated rooms on the premises, or they’d rent space in a local data center.

Then, things changed. People started working from home more. Businesses grew and opened offices in other cities and countries. Quickly, the on-premises server ceased to make sense, especially when you consider the vast growth in consumer internet usage. It’s hard for tech companies to scale when they’re forced to buy, provision, and deploy new servers every few days.

Cloud computing services, like Microsoft Azure and Amazon Web Services (AWS), solved those problems. Businesses could rent server capacity as needed and scale up as they grew.

The problem with the cloud in its current incarnation is that it’s centralized. Providers like Amazon, Microsoft, and Google have data centers spread across the world, but those facilities are often hundreds, if not thousands, of miles away from their customers.

For example, if you’re in Edinburgh, Scotland, your nearest AWS data center is in London, which is around 330 miles away. Meanwhile, if you’re in Lagos, Nigeria, your closest continental AWS location is in Cape Town, South Africa, which is nearly 3,000 miles away.  

The farther the distance, the higher the latency. Remember, data is ultimately light traveling through fiber-optic cable at roughly two-thirds the speed of light in a vacuum, so it’s bound by the laws of physics.
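To put rough numbers on that, here’s a back-of-the-envelope sketch in Python. It assumes a signal speed in fiber of about 200,000 km per second (roughly two-thirds of light speed in a vacuum) and ignores routing, queuing, and processing overhead, so real-world figures will be noticeably higher:

    # Best-case round-trip propagation delay over fiber.
    # Assumption: light in fiber covers roughly 200,000 km per second.
    SPEED_IN_FIBER_KM_PER_S = 200_000

    def round_trip_ms(distance_km: float) -> float:
        # There and back again, converted to milliseconds.
        return 2 * distance_km / SPEED_IN_FIBER_KM_PER_S * 1_000

    print(round_trip_ms(530))    # Edinburgh to London (~330 miles): ~5.3 ms
    print(round_trip_ms(4_800))  # Lagos to Cape Town (~3,000 miles): ~48 ms

Even in this idealized model, moving the server from Cape Town to somewhere near Lagos would shave tens of milliseconds off every round trip, and that gap is precisely what edge computing targets.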

So, what’s the solution? Well, arguably, the answer lies in history repeating itself: bringing the servers closer to the people who use them.

Life on the Edge


To sum it up, edge computing means bringing applications and data storage closer to the people who use them. For large corporations, this could mean a purpose-built server facility in close proximity to their main offices. On the consumer front, it might be helpful to think of IoT devices performing certain tasks, like facial recognition, with their own local computing resources rather than farming them out to a cloud service.

There are a few advantages to this. First, it reduces the amount of traffic that has to cross the network. When you consider that many large corporations pay steep fees to shuffle bits between data centers, it makes sense to keep that data closer to home.

Second, it reduces latency. Often, a huge portion of the time required to perform a task is devoted to moving traffic across the network. Bringing computational power closer to home can reduce that latency and speed things up.

This could potentially open the door to new forms of computing, for which immediacy is key. One oft-touted example is a “smart city,” in which the local government can collect information on things like utility usage and road traffic patterns in real time and, subsequently, take swift action.

There are also potential uses for edge computing in the industrial sector. For example, it could allow manufacturers to gather data from their equipment and make rapid adjustments, thereby reducing energy use and equipment degradation.

On the consumer side, edge computing has the potential to make things like cloud gaming a more satisfying experience. If graphical number-crunching is closer to players, they’re less likely to experience unpleasant lag, which can be the deciding factor in who wins an online game.

The 5G Factor


Coinciding with the steady rise of edge computing is the introduction of 5G connectivity. Although it’s still in its infancy, 5G promises markedly lower latencies than previous mobile standards. As a result, you can expect it to play a huge role in the evolution of edge computing as a paradigm.

What does this mean? In the logistics sector, you’ll see a greater emphasis on data and analytics, as trucks and vans transmit information to be analyzed and acted on in real time. There’s also the prospect of “smart farming,” which promises to automate vast swathes of agricultural production. Not only will this improve crop yields, but it will also reduce waste.

Then, there’s the consumer side. By bringing the computational “heavy lifting” closer to people’s phones, you unlock new, more immersive entertainment experiences for things like virtual reality (VR), augmented reality (AR), and gaming.

Of course, that is still a long way off. Carriers and developers have to build it first. However, when they do, you can expect the same seismic change that occurred when cloud computing first burst onto the scene.

