How far will Edge Computing be the biggest driver of new data centres?

How far do you agree that Edge Computing will be the biggest driver of new data centres?

For this month’s Editor’s Question we have put the focus on Edge Computing, following research carried out by Business Critical Solutions (BCS), which found that half of respondents believed Edge Computing would be the biggest driver of new data centres.

The Summer Report, now in its 10th year, was undertaken by independent research house IX Consulting, which captured the views of over 300 senior data centre professionals across Europe, including owners, operators, developers, consultants and end users. It was commissioned by BCS, a specialist services provider to the digital infrastructure industry.

To find out more, we asked industry experts to provide their insight and outline to what extent they thought Edge Computing would be the biggest driver of new data centres.

Here’s what they had to say…

Jonathan Leppard, Director at Future Facilities 

The arrival of 5G is truly transformative. For the data centre (DC) industry, its high speeds and, in particular, Ultra-Reliable Low Latency Communication (URLLC) are set to cause a huge market shift.

While hyperscale, colocation and enterprise DCs will continue to expand as data usage naturally grows, Edge DCs are set to see explosive growth over the next few years thanks to 5G.

Driving this growth is the fact that many of the technologies people are most excited about with 5G, such as autonomous vehicles, VR and AR and the plethora of IoT devices, will only deliver their full potential with Edge DCs.

In fact, one really exciting possibility, for mobile operators in particular, is the chance to use Edge Computing to offer Infrastructure as a Service (IaaS) directly from the network – potentially opening up a whole new revenue stream.

Avoid downtime with digital design

However, despite all the excitement around Edge DCs, there are a number of challenges that will need to be overcome. Chief among these is the challenge of designing a smaller DC to fit within a more confined space.

This can prove difficult for even the most well-resourced business. For starters, packing increasingly high levels of computing, which generate a lot of heat, into a physically smaller space means heat will need to be carefully managed if these facilities are to work efficiently and avoid downtime.

Recent independent research commissioned by Future Facilities found that downtime costs the average DC £122,000 per year.

However, with Edge Computing set to increase the amount of high-level computing work being done, there’s a real chance the cost of downtime will only increase. The good news though is that physics-based digital twins already exist and can help in preventing downtime.

Digital twins have long been used in the electronics and aerospace industries to ensure designs work as efficiently as possible, and they are now a key way of ensuring DCs manage their heat better too.

A digital twin works by creating an exact digital representation of a DC and predicting the outcome of various changes, enabling operators to test and refine designs and deployments over the DC’s lifetime. This significantly lowers the risk of downtime while enabling the facility to run at a highly efficient level.
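To make that concrete, here is a deliberately simplified sketch of the kind of ‘what if’ prediction a physics-based digital twin performs. Real tools model full airflow with computational fluid dynamics; this toy version, with illustrative figures and function names of our own (not Future Facilities’ product), applies only the basic sensible-heat relation to a single rack.

```python
# Toy steady-state check for a proposed rack change.
# A real digital twin runs full CFD; this applies only dT = P / (m_dot * c_p).

AIR_DENSITY = 1.2          # kg/m^3, air at roughly sea level
AIR_SPECIFIC_HEAT = 1005   # J/(kg*K)

def outlet_temp_c(inlet_temp_c: float, it_load_w: float, airflow_m3s: float) -> float:
    """Predict rack outlet air temperature for a given IT load and airflow."""
    mass_flow = airflow_m3s * AIR_DENSITY               # kg of air per second
    delta_t = it_load_w / (mass_flow * AIR_SPECIFIC_HEAT)
    return inlet_temp_c + delta_t

# 'What if' test: add 5 kW of IT load to a rack fed 0.4 m^3/s of 22 C supply air.
predicted = outlet_temp_c(inlet_temp_c=22.0, it_load_w=5000, airflow_m3s=0.4)
print(f"Predicted outlet temperature: {predicted:.1f} C")  # ~32.4 C
```

If the predicted outlet temperature would breach the facility’s safe operating limit, the deployment can be redesigned before any hardware is moved – which is exactly how a digital twin lowers the risk of downtime.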

It’s pretty clear Edge DCs will make a significant contribution to the number of new DCs completed over the next few years.

However, challenges around heat management will always remain. What’s more, while Edge DCs will grow at a fast rate, the more traditional setups of colo, hyperscale and enterprise will not be standing still as we all continue to create and generate more data. Edge DCs may, therefore, make up the bulk of new DCs but ensuring they deliver their full potential will mean overcoming existing DC challenges. 

James van den Berg, Technology Solutions Professional, Applications and Infrastructure at Altron Karabina

Edge Computing is defined as a form of distributed computing in which compute and storage are pushed outside the data centre to the ‘Edge’, close to where data is generated and used, in order to improve response times and reduce network traffic.

This practice has gained significant popularity in the IoT space because these devices generate vast amounts of data, yet the data often only carries value in aggregated form, so there is little to no value in sending it all to a centralised location.

It is much more feasible to place some compute capability closer to the IoT devices, aggregate the data at that point and then transmit only the aggregations.

An additional benefit of this practice is the reduction in network bandwidth required, which can cut costs as well as the amount of centralised storage needed. With cloud, where cost is directly linked to usage, the savings can be even greater.
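As a minimal sketch of this aggregate-then-transmit pattern (the window size, endpoint URL and function names below are hypothetical; a production deployment would use a proper telemetry stack):

```python
import json
import statistics
import urllib.request

WINDOW = 60  # e.g. one aggregate per 60 one-per-second sensor readings

def aggregate(readings: list[float]) -> dict:
    """Reduce a window of raw readings to the summary that is worth shipping."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": statistics.fmean(readings),
    }

def ship(summary: dict, url: str = "https://example.invalid/ingest") -> None:
    """Send only the aggregate upstream instead of every raw reading."""
    body = json.dumps(summary).encode()
    request = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(request)  # one small request per WINDOW readings
```

Collapsing 60 raw readings into a single record of roughly 100 bytes cuts upstream traffic by well over an order of magnitude, which is where both the bandwidth saving and the usage-billed cloud saving come from.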

There are, however, certain shortcomings with Edge Computing, which mainly revolve around the fact that storage and compute management becomes more complex with a distributed model as opposed to a centralised one.

Edge Computing will affect data centres to differing degrees depending on the organisation and its circumstances. Some organisations will be able to leverage compute capacity and storage at the Edge, allowing them to distribute information processing and storage across a wide-reaching network, creating scalability never before seen in a traditional data centre.

However, other organisations which adhere to very strict and restrictive data protection controls will not be able to make use of Edge Computing as this will make control of data dissemination and flow nearly impossible.

Edge Computing allows organisations to extract extra compute and storage capability out of investments they have already made at the Edge – for example, an existing on-site server at a local office – without the need for additional investment in their data centre. This is extremely alluring where it is possible and can be a game-changer for organisations that can unlock additional value from existing assets.

Edge Computing capability is one of the great untapped resources: most organisations have at least one on-site server at local or regional offices that they have not leveraged at all. In future, the ability to use this resource effectively may be one of the major competitive differentiators between organisations that harness the power of the Edge and those that do not.

Simon Bearne, Commercial Director, Next Generation Data

Edge Computing will become an increasingly important driver of new data centres, but the full extent of its impact will not be seen until IoT and 5G are truly ubiquitous. That is still a few years away: currently, Edge Computing is based on specific business cases which then create a particular need for so-called Edge data centres, and right now adoption is still quite patchy.

Meanwhile, and for the foreseeable future, there is no slowing down in the massive demand for and growth in cloud computing – public, private and hybrid – which remains the real engine room of growth for the colocation data centre industry.

The hyperscalers (major cloud services providers such as Amazon, Microsoft and Google) have a huge, seemingly insatiable appetite for consuming data centre capacity globally. According to Synergy Research Group’s latest global cloud infrastructure services report, cloud revenues are now at a run rate of almost US$100 billion per year.

It says Q2 2019 revenues jumped 39% compared with the same quarter in 2018, with no end in sight to such stellar growth.

Added to this, NGD is seeing significant demand coming from the ramping up of High Performance Computing (HPC) activities by both the public and private sectors. As with cloud hosting, HPC requires data centres with high scalability in terms of available space, vast reserves of highly concentrated power, specialist cooling and diverse high-speed connectivity.

In time, Edge Computing will certainly fuel the overall demand for new data centres even further. Data centres located close to the Edge of the network can often serve local applications and services more efficiently by virtue of their proximity to the users or machines concerned, minimising latency, which in certain applications is mission critical.

This is because real-time processing of specific operational and production data enables extremely accurate decision-making. Consider the much-vaunted driverless car or delivery vehicle: to work efficiently and safely, near-zero latency is an absolute must.
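To put illustrative numbers on why proximity matters (a back-of-the-envelope calculation of ours, not a figure from the article): light in optical fibre travels at roughly two-thirds of its vacuum speed, so distance alone sets a hard floor under round-trip time.

```python
# Minimum round-trip propagation delay over optical fibre.
C = 299_792_458            # m/s, speed of light in vacuum
FIBRE_SPEED = C * 2 / 3    # ~200,000 km/s in glass

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time to a site distance_km away, ignoring processing."""
    return 2 * distance_km * 1_000 / FIBRE_SPEED * 1_000

print(f"{round_trip_ms(1500):.2f} ms")  # distant core DC: ~15 ms floor
print(f"{round_trip_ms(15):.2f} ms")    # nearby Edge DC:  ~0.15 ms floor
```

Queuing, routing and processing all add time on top, but the propagation floor alone shows why a latency-critical control loop cannot be served from a data centre a continent away.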

Of course, Edge data centres aren’t necessarily the big beasts we are mostly familiar with today in terms of physical size and power resources. Out of necessity, they are more typically of the micro variety, ranging from a single server or a handful of servers in a rack, through a small containerised solution complete with inbuilt UPS and cooling, up to a small modular facility.

However, now and in the future, as Edge Computing proliferates, facilities like these will still need to interoperate with ‘traditional’, much larger data centres at the core of the network for processing, analysing and archiving much of the very large volume of ‘big data’ generated at the Edge.

Edge devices and their data centres will focus on processing the nuggets of operational data they need for their own immediate purposes and actions.

In summary, alongside cloud and HPC, Edge Computing will add substantially to the mountains of data already being generated for processing and storage, with much of this being done in large fit-for-purpose core facilities such as NGD’s.

As to which of these will be the single largest driver for new data centres in the future, only time can tell.

Jeff Ready, CEO at Scale Computing

According to a study from IDC, 45% of all data created by IoT devices will be stored, processed, analysed and acted upon close to, or at, the edge of a network by 2020.

So, as IoT and the growing global network of sensors add more data than the average cloud has ever had to handle, Edge Computing will increasingly take on workloads that struggle in hosted cloud environments, moving them towards more efficient technologies for local data storage and processing, such as hyperconverged infrastructure (HCI) platforms.

While the roll-out of new data centres will be partly driven by the uptake of Edge Computing, the very concept of the traditional data centre will likely also be rewritten.

This is because Edge Computing allows organisations to process large amounts of raw data onsite before sending it, in a more refined state, efficiently to the central data centre.

Edge Computing brings with it the need to deploy many micro data centres of varying sizes, and a platform that can scale in both directions to accommodate businesses’ digitalisation needs.

Essentially, a good Edge deployment can be looked at as a micro data centre combined with intelligent automation. Data centre functions such as compute, storage, backup, disaster recovery and application virtualisation can be consolidated into a single, integrated platform – this is a hyperconverged infrastructure.

Infrastructure silos that are difficult to manage in a centralised data centre become unmanageable at the Edge and thus convergence of these into a single platform is both efficient and cost effective.

It doesn’t always make sense to send data all the way back to the traditional central data centre only for it to be processed and sent back to the same site where it was generated. Unlike full data centre implementations, Edge deployments are typically too small to warrant dedicated IT staff. Because of this, the infrastructure needs to be easy to implement and manage, and easily connected back to the primary data centre or even the cloud as needed.

There are exceptions to this, such as automated video surveillance use cases. A network of surveillance cameras creates a massive volume of valuable and sensitive data every second. To deliver intelligent automation in real time, Edge Computing devices can process the data collected from the cameras on site.

But, in order to store this data long-term, traditional data centres still remain a big part of the picture, as the data is fed back to the data centre from the Edge device.
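As a hedged sketch of that division of labour – motion events detected and acted on at the Edge, raw footage archived centrally – the snippet below uses OpenCV for a crude frame-difference detector; the threshold, camera index and upload_event callback are hypothetical:

```python
import cv2  # OpenCV; assumes a camera attached at device index 0

MOTION_THRESHOLD = 12.0  # hypothetical tuning value for mean frame difference

def watch(upload_event) -> None:
    """Analyse frames at the Edge and ship only motion events upstream."""
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if not ok:
        raise RuntimeError("camera unavailable")
    prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        score = cv2.absdiff(prev, gray).mean()  # crude motion measure
        if score > MOTION_THRESHOLD:
            upload_event(frame)  # only events cross the network in real time
        prev = gray
    cap.release()
```

Full-resolution footage can then be batched back to the core data centre off-peak for long-term retention, matching the pattern described above.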

Trent Odgers, Cloud and Hosting Manager, Africa at Veeam

Africa has always been considered a mobile-first environment. Now, thanks to the growth of the cloud, the continent is finding increasingly innovative ways to leverage mobile with data availability strategies. 

In certain respects, the data is coming closer together thanks to the public and managed cloud, but in other respects it is also moving further away because of the Internet of Things (IoT) and Edge Computing.

The rise of micro data centres

The use of smaller, distributed, connected data centres that are closer to concentrations of users and generators of content/data is becoming necessary to support modern workloads. These not only augment traditional data centres but improve performance across the entire organisation by reducing pressure on enterprise networking resources. 

Today’s technologies allow Edge environments to be managed from a centralised location with complete visibility into and control of the micro data centres at the remote sites. And, with the growth of virtualisation and solutions such as software-defined architectures and hyper-converged infrastructures, micro data centres are becoming much easier to deploy and manage than ever before. 

Whichever technology you use to build your Edge Computing micro data centres, you must make sure that you focus on availability and deploy modern availability solutions that ensure the reliability, recoverability and effectiveness of your data, wherever it is located. 

Leveraging data

According to the 2019 Veeam Cloud Data Management Report, companies plan to spend close to R600 million on deploying cloud data management technologies to build an intelligent business within the next 12 months. Part of this includes using technologies such as Artificial Intelligence (AI) and IoT to drive organisational success.

Opportunities abound on the continent when it comes to leveraging data in progressively innovative ways. In healthcare, for example, the data captured from pacemakers can enable medical professionals to detect the onset of a heart attack even before a person is showing external symptoms. 

Backup remains fundamental

When it comes to data availability, it shouldn’t matter whether your underlying technology platform comprises a centralised data centre, multiple clouds, an Edge Computing environment or some combination of all the above. Nor should it matter whether you are supporting centralised workers, remote workers, mobile workers or connected devices. They must all have uninterrupted access to the applications and resources they need, at any time, from any location.

Even though businesses are still figuring out how best to generate income from Edge devices and IoT, the fundamentals of backup, recovery and continuity still need to apply. 

It is imperative that the appropriate redundancy measures are in place in the event of a disaster. To do any less risks significant problems, not only from a compliance perspective but from a customer service one as well.
