By Mahesh Jaishankar March 17, 2022

In a study of site speed by Deloitte Digital it was found that a “0.1s change in load times influences every step of the user journey, ultimately increasing conversion rates. Conversions grew by 8% for retail sites and by 10% for travel sites on average.” This translates into millions of dollars of sales for e-commerce sites. Hence the latency of the content matters, and that latency is determined by how close the content or data sits to the end user.

(Source: Deloitte Digital, Milliseconds Make Millions, Milliseconds_Make_Millions_report.pdf)
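To put the “millions of dollars” claim in rough numbers, here is a back-of-the-envelope illustration. The revenue figure and the assumption that sales scale in proportion to conversions are hypothetical, not taken from the Deloitte report:

```python
# Back-of-the-envelope illustration (hypothetical figures, not from the Deloitte report).
annual_online_revenue = 50_000_000   # USD per year for a hypothetical retailer
conversion_uplift = 0.08             # 8% relative lift in conversions from faster load times

# Assuming revenue scales roughly in proportion to conversions:
additional_revenue = annual_online_revenue * conversion_uplift
print(f"Estimated additional revenue: ${additional_revenue:,.0f} per year")
# Estimated additional revenue: $4,000,000 per year
```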

The article emphasises how critical it now is for any digital presence to have high-speed network connectivity. Edge colocation, where servers sit in data centres closer to the customer rather than in large central data centres, allows lower latencies for customers.

Gartner defines edge computing as “a part of a distributed computing topology in which information processing is located close to the edge—where things and people produce or consume that information.”

Edge data centres are typically smaller facilities located close to the edge of the network. They provide the same services found in conventional central data centres, but in a smaller footprint and closer to end users and devices, delivering cached content and cloud computing resources to those devices. The core principle is a distributed IT architecture in which computing and storage are handled as close to the source and the users as possible. Because edge data centres sit closer to end users and data sources, they deliver faster services with minimal latency.
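As a rough illustration of that “closest wins” principle, the sketch below probes a set of candidate data centres and serves from whichever one answers fastest. The endpoint names are hypothetical placeholders, not real Arc infrastructure, and a production system would typically use DNS- or anycast-based routing rather than client-side probing:

```python
# Minimal sketch of "route to the nearest edge": measure a simple round trip
# to each candidate data centre and pick the fastest. Endpoints are
# hypothetical placeholders.
import socket
import time

CANDIDATE_EDGES = {
    "central-dc": ("example.com", 443),        # distant central data centre (placeholder)
    "edge-dc-1": ("edge1.example.com", 443),   # nearby edge data centre (placeholder)
    "edge-dc-2": ("edge2.example.com", 443),   # another edge site (placeholder)
}

def connect_time(host: str, port: int, timeout: float = 2.0) -> float:
    """Return the TCP connect time in milliseconds, or infinity on failure."""
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000
    except OSError:
        return float("inf")

def pick_nearest_edge() -> str:
    timings = {name: connect_time(host, port)
               for name, (host, port) in CANDIDATE_EDGES.items()}
    for name, ms in sorted(timings.items(), key=lambda kv: kv[1]):
        print(f"{name}: {ms:.1f} ms")
    return min(timings, key=timings.get)

if __name__ == "__main__":
    print(f"Serving from: {pick_nearest_edge()}")
```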

Gartner predicted there would be around 25 billion connected devices by the end of 2021, generating enormous amounts of data. Edge computing seeks to address this data tsunami with a distributed IT architecture that moves data centre resources towards the network periphery. Two benefits stand out:

Latency: By virtue of physical closeness, time-to-action drops, and data is analysed locally rather than at a distant centralised data centre or cloud.
Congestion: Edge computing can also relieve the growing pressure on the wide-area network. Rather than overwhelming the network with an onslaught of relatively irrelevant raw data, edge devices can analyse and process the data locally, which improves efficiency and keeps bandwidth requirements low (a simple sketch of this follows below).
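As a minimal sketch of that local processing, the example below aggregates a window of raw readings at the edge and forwards only a compact summary plus any anomalies upstream. The sensor values, the anomaly threshold and the upstream “send” step are all hypothetical placeholders:

```python
# Minimal sketch of edge-side filtering: summarise raw readings locally and
# send only the summary plus anomalies over the WAN.
import json
import random
import statistics

ANOMALY_THRESHOLD = 80.0  # hypothetical threshold above which a reading is forwarded raw

def summarise(readings: list[float]) -> dict:
    """Reduce a window of raw readings to a compact summary plus anomalies."""
    return {
        "count": len(readings),
        "mean": round(statistics.mean(readings), 2),
        "max": max(readings),
        "anomalies": [r for r in readings if r > ANOMALY_THRESHOLD],
    }

def send_upstream(payload: dict) -> None:
    # Placeholder for the WAN hop to the central data centre or cloud.
    print("Sending", len(json.dumps(payload)), "bytes upstream:", payload)

if __name__ == "__main__":
    raw_window = [random.uniform(20, 90) for _ in range(1_000)]  # simulated sensor data
    raw_bytes = len(json.dumps(raw_window))
    send_upstream(summarise(raw_window))
    print(f"Raw window would have been ~{raw_bytes} bytes over the WAN.")
```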

A 2019 study run by Google in Denmark demonstrated that, on average, consumers were 10% more willing to recommend a web shop if load time was reduced from 13 seconds to 10 seconds. A further reduction from 10 seconds to 3 seconds gave an estimated 26% increase in advocacy.

(Source: Jonas Christensen, Google Denmark, June 2019, Mobile, https://www.thinkwithgoogle.com/intl/en-154/search/#?query=Mobile)

The challenges of the edge

The edge has its benefits, but it also comes with its own challenges.

Security is seen as a key concern: a decentralised architecture makes the network more vulnerable to attack by exposing more possible entry points.
The cost of distributing computing across many sites versus the economies of scale of a centralised facility needs to be weighed in the context of the application.

At Arc we work with our customers to understand their applications and services, and we build solutions that bring the benefits of edge architectures to them.
We connect data centres and understand Middle East network topologies, enabling our clients with low-latency services.