Infinidat expert on what to expect in data storage this year

Eran Brown, EMEA CTO, Infinidat, gives his predictions for data storage for 2020.

Last year saw significant changes in the cloud, and nothing tells the story better than a quick straw poll. At every event I spoke at over the last year, across Europe, I asked these questions:

  • Do you see your business units (BUs) taking applications to the cloud?
    Almost all raised a hand.
  • How many of you see the BUs doing it to save costs?
    Almost no one raised a hand.
  • How many of you see the BUs doing it to accelerate time-to-market?
    Almost all raised a hand again.

This survey has never failed to yield these results, whether I spoke at a technical event (such as VMworld), a C-level event (Gartner) or a local event (TechUG) – it was the same in the UK and in mainland Europe. We all see a new de facto policy, where businesses compete on speed and pay a premium to the public cloud providers to achieve it. Without saying it, we have created a new exchange rate between time (to market) and money. Time is the more valuable of the two.

While public clouds have benefited businesses in accelerating the launch of new applications, they have created two new challenges: increased cost and the move of a portion of IT spending from CapEx (Capital Expenditure) to OpEx (Operational Expenditure). Since increased OpEx hurts the company financially (reduced EBITDA, for example), this change is not an easy one for CIOs, CFOs and CEOs, who would have preferred to keep more of the expense in the private cloud, where they spend CapEx. This will make 2020 an interesting year, in which we’ll see four main trends or challenges:

Strategic demarcation between public and private cloud data storage

Companies will need to draw a strategic line between how and when they use the public and private cloud for data storage. This decision will require careful evaluation of the agility, flexibility, speed-to-market and cost offered by each approach. It will be impacted by the quantity of legacy data to be stored, as well as what data companies want to ‘own’ and store locally, on premises. Not all data is suitable for storage in public clouds.

We are speaking to senior executives responsible for cloud strategy who tell us they plan to move to cloud solutions in totality and shut down all their data centres, though there isn’t necessarily a timeline in place to support this goal as yet. In parallel, while they have committed some hosting to the cloud, there is no set strategy for what goes where in the long term, which suggests there is still a lot of fine-tuning to be done.

This trend towards cloud solutions in totality is echoed by Sharad Saggar, Managing Director at Core DataCloud, who says: “Market demand for a ‘cloud first’ strategy is gaining momentum.”

In line with this, the company, a UK-based cloud service provider, has already revised its enterprise storage system to support growing customer demand for its cloud-based disaster recovery and backup services.

Increased focus on the economics of data storage

As the volume of data created, shared and stored continues to grow unabated, larger organisations will be under increased pressure to understand more clearly the ‘cost versus benefit’ of different storage models.

Companies will need to re-evaluate their private cloud solutions if they want to offer an alternative to public cloud storage that is not cost-prohibitive. In this new standard, where we convert time to money, private clouds are cheaper, yet they often lose business to the public cloud on agility (time). Any business wanting to improve its IT spending will need to improve its private cloud’s agility.

This is not to say that private cloud needs to be as agile as the public cloud, only agile enough so that it no longer makes sense to pay the public cloud premium. For example, if the public cloud takes 15 minutes to bring up a new environment and the private cloud takes a month, BUs will go to the public cloud. But if the difference becomes 15 minutes versus one day, it’s a completely different discussion.

Private clouds have a cost advantage, but they have to step up their agility to be a true alternative to public ones. Without this change in agility, the BUs will keep spending more money to gain agility. If this happens, it will be hard to define a clear strategy on what should stay on-premises, as the time-to-market consideration will often overrule this strategic decision.

Growing interest in multi-cloud solutions

Data gravity issues and digitalisation may result in increased interest in multi-cloud solutions that are better able to compete on price, availability and resiliency. Flexibility and accessibility may have been the key drivers behind initial decisions about cloud storage, but in hindsight the cloud has not always proved to be the most cost-effective solution.

Hybrid cloud models will continue to provide the most commercially resilient solution, especially for those companies that want to take a ‘cloud first’ approach. Part of defining a cloud strategy is recognising that, inadvertently, you are also defining what does not go to the cloud.

Greater alignment between IT and business units

This still needs to be top of the agenda. Leaders need to make better business decisions about data storage, especially where demand for ‘always on’ service delivery is high.

Jon Nicholas, Cloud and Storage Infrastructure Manager at Pulsant, agrees: “As the business grows, we expect to be given targets of our own within the IT team in order to maintain a system that enables the company to do more and offer more. Part of this will include looking at how we can do more with the leading-edge technologies already in place.”

Choosing between on-premises and public clouds should not be driven by hype, but by what enables the business units at a cost point that is acceptable for the long-term viability of the business, and only after examining the funding alternatives (CapEx vs. OpEx, pay-per-use, etc.).

One thing that made the decision a bit easier for IT leaders last year was the introduction of pay-as-you-grow business models for private clouds. However, these usually came as OpEx only, so they simply replace one problem (time to market) with another (increased cost).

This year, we will see more customers asking their vendors for CapEx-based pay-as-you-grow solutions, and these will solve this issue, leaving only the agility challenge.

To solve the agility challenge, we will need to change the way customers interact with the vendors they rely on to power their private cloud. Customers should not only ask their vendors whether they support the right type of API for automation, but ask them to be enablers of IT transformation, bringing enterprise-grade tools that deliver DevOps-like agility without each customer having to develop these capabilities themselves. In this new era, customers will choose technology not only on performance, cost and so on, but also on what changes the technology allows them to make in their private cloud to better align it with the new needs of the BUs.
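What this agility could look like in practice is illustrated by the short, entirely hypothetical sketch below: a business unit’s pipeline provisioning storage from the private cloud through an automation API instead of raising a ticket. The endpoint, payload fields and token are illustrative assumptions, not any specific vendor’s API.

# A minimal sketch, assuming a hypothetical private cloud storage REST API.
# Nothing here refers to a real product; it only shows the self-service,
# automation-driven consumption model discussed above.
import requests

STORAGE_API = "https://private-cloud.example.com/api/v1"  # hypothetical endpoint
API_TOKEN = "..."  # assumed to be issued to the business unit's CI/CD pipeline

def provision_volume(name: str, size_gb: int, tier: str = "standard") -> dict:
    """Request a new volume from the private cloud and return its metadata."""
    response = requests.post(
        f"{STORAGE_API}/volumes",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        json={"name": name, "size_gb": size_gb, "tier": tier},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Storage for a new environment provisioned in seconds from a pipeline,
    # rather than through a manual change request measured in days or weeks.
    volume = provision_volume("analytics-dev-01", size_gb=500)
    print(volume)

When provisioning becomes a call in a pipeline rather than a ticket, the time-to-market gap discussed above narrows from weeks to minutes, which is the point at which paying the public cloud premium stops being automatic.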
