IT departments must implement strategies that are easily accessible for the whole business, and having the right technology in place to achieve this is crucial. Jeremy Atkins, UKI Sales Director – Enterprise & Public Sector at Commvault, explains why agility and flexibility will be key to success as we prepare to emerge from the pandemic and return to reality.
In the initial stages of COVID-19 and lockdown, there was furious activity to transform the workplace in the UK – enabling people to work from home, changing business processes and mobilising apps and data to work as well at the edge as in the office. Now that this has been in place for several weeks and the dust has settled, there are two key things a senior IT leader should now be asking themselves:
- Can I deliver the right service, securely and efficiently, for as long as the lockdown endures?
- How has this situation impacted the overall organisation’s mid- to long-term strategy and budgets, and do I need to change my IT strategy to re-align with the new world?
Of course, as with any IT function, there will have been a whole range of in-flight projects running when COVID-19 hit. The majority will have been put on hold while it has been ‘all hands to the pumps’ to get through the initial changes – now, weeks later, which projects do you restart?
With many organisations facing reduced income and no clear notion of when that revenue might return to ‘normal’, IT budgets have been hit hard. Furthermore, businesses emerging from the pandemic may have a completely different structure and go-to-market model than when they went in.
In order to decide which projects are still viable and which should be paused or shelved, it is vitally important for businesses to evaluate them by measuring their impact across cost and risk during these uncertain times.
Driving down cost
From a cost-saving perspective, a backup database can provide one of the most complete end-to-end views of a business’ estate: the rate of growth, the rate of change and the location and types of key data. Of course, running multiple backup products may prevent a single source of truth, but consolidating them presents an immediate cost-saving opportunity and an improvement in service. More often than not, consolidation can deliver a 25% reduction in the operational cost of running backup, as well as opening the door to further automation and enhancement.
The next step in driving cost out is to then use that single view to eradicate duplication and identify the static workloads. Servers or applications that are now obsolete must be decommissioned. At this point, a business may find it really effective to implement a suite that interfaces with the backup solution that will not only provide the required GDPR and PII compliance, but also conduct end-to-end file optimisation. This will take the identification of redundant, obsolete and orphaned applications, data and users to the next level. IT teams that remove over 40% of their primary storage estate will see a knock-on effect of driving further server, software and operational savings across the estate. An added bonus is that it will also take a substantial chunk out of potential future cloud utilisation, too.
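To illustrate the kind of duplication analysis such a suite performs, here is a minimal sketch in plain Python – not any vendor’s implementation, and `find_duplicates` is a hypothetical helper name – that groups files by content hash so that duplicate copies can be identified and reclaimed:

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def find_duplicates(root):
    """Group files under `root` by content hash; any group with more
    than one member is duplicate data that can be reclaimed."""
    groups = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    # Only hash groups containing more than one file are duplicates
    return [paths for paths in groups.values() if len(paths) > 1]
```

A production tool would add chunk-level deduplication, metadata on owners and last access, and exclusion rules, but the principle – a single content-addressed view of the estate – is the same.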
Now that a business has identified and optimised its storage needs, the IT team must ask whether it is on the right platform. There is a very simple way to view a storage strategy:
- If the data does hard or specialised work, place it on discrete flash arrays
- If data is accessed and used a lot, put it on hyperconverged or cloud
- If data needs to be stored safely and/or has secondary use, object store is most efficient
- If none of the above apply, delete it!
While the discrete platform is a relatively easy decision to make, migrating data to the cloud, hyperconverged infrastructure or object store is a different matter. With a plethora of hardware- and software-based solutions, and a potential management nightmare, it can be quite a difficult decision for businesses to make.
Choosing the right platform is an article in itself, but at a top level, a platform that removes the barriers between on-premises and cloud gives true data mobility and abstracts the hardware layer, which can take over 60% out of a business’ existing storage cost base.
Reducing risk
The risk profile and attack surface may have materially changed during the current situation, with the change in working practices, the acceleration to cloud and a new set of threats created. This not only creates a security and ransomware protection nightmare, but also presents a new set of challenges around compliance, data sovereignty and information management.
Firstly, with the push to home working, there are now more mobile devices in use in organisations than ever before, often used by people who are not accustomed to having key and critical business information sitting on their kitchen table. So, not only do we have to ensure that the basic hygiene of backing these devices up is followed, but also that they do not become an entry point for ransomware.
Modern backup solutions feature AI-driven pattern analysis on file behaviour. Over the first week in which a file, or group of files, is backed up, its rate of change and pattern of behaviour are measured and assessed. Should there suddenly be a change to that behaviour pattern, an alert is raised to the control centre and passed to the service management platform to indicate a possible early-stage ransomware attack. This is extended to all endpoints, thus hopefully stopping the attack in its tracks at the edge, rather than allowing it access to the data centre.
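A heavily simplified sketch of the idea – not Commvault’s actual algorithm – is to establish a baseline of daily file-change counts during the first week of backups, then flag any day whose count deviates sharply from that baseline. The function name `ransomware_alert` and the z-score threshold are illustrative assumptions:

```python
import statistics

def ransomware_alert(baseline_daily_changes, todays_changes, threshold=3.0):
    """Return True when today's changed-file count sits more than
    `threshold` standard deviations above the baseline week."""
    mean = statistics.mean(baseline_daily_changes)
    stdev = statistics.pstdev(baseline_daily_changes) or 1.0  # avoid divide-by-zero
    z_score = (todays_changes - mean) / stdev
    return z_score > threshold
```

A mass-encryption event touches thousands of files in hours, so it stands out immediately against a baseline of normal working behaviour; real products layer entropy checks and file-type analysis on top of simple rate-of-change signals like this.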
Honeytrap files (files that are particularly attractive to ransomware programmes) are scattered around the estate to lure a ransomware programme into breaking cover before it is able to attack any real, valuable files. If a ransomware attack does get through, then having a backup solution with an immutable database is absolutely key, as is a thoroughly written and tested recovery plan that covers all of the business’ recovery, not just restoring data.
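The honeytrap technique can be illustrated with a short sketch: a decoy file with known contents is planted under an attractive name, then periodically checked; any modification, encryption or deletion is treated as a tamper signal. The file name and helper functions here are hypothetical, not taken from any product:

```python
import hashlib
import os

CANARY_CONTENT = b"confidential-decoy-do-not-modify"
CANARY_DIGEST = hashlib.sha256(CANARY_CONTENT).hexdigest()

def plant_canary(directory: str) -> str:
    """Drop a decoy file with a name attractive to ransomware."""
    path = os.path.join(directory, "passwords_backup.xlsx")
    with open(path, "wb") as f:
        f.write(CANARY_CONTENT)
    return path

def canary_tampered(path: str) -> bool:
    """True if the decoy has been encrypted, modified or deleted."""
    if not os.path.exists(path):
        return True
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest() != CANARY_DIGEST
```

Because no legitimate user or process should ever touch the decoy, any change to it is a high-confidence signal that can trigger an immediate isolation or backup-freeze response.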
Some businesses, especially in the finance industry, are now looking to create air-gapped data vaults to further protect against ransomware. Solutions where data can only pass one way, in a prescribed route at a prescribed time, mean that recovery is then made into a clean-room test environment before being released to the organisation.
We live in an age of ‘Schrödinger’s Data’, where we want to simultaneously have greater access to and use of data while making it more secure, compliant and less accessible. There are tools out there that make this achievable, playing a dual role: driving cost out of the infrastructure and service, as detailed above, while lifting the value of the data and providing solid regulatory compliance. If a business does not have the budget to achieve infrastructure modernisation and meet compliance regulations, then perhaps combining the two projects under a single programme will drive results in both camps at a much reduced expenditure.
A final word
There is a lot for the IT leader to think about as we begin to emerge from this crisis, but at the same time, there is some real opportunity to drive a faster Digital Transformation agenda that helps remove the legacy cost and complexity, drive innovation and efficiency, and prepare IT to step up and lead the lines of business into the new world. We have no idea what the next three, six, 12 months will look like for society or organisations, nationally and globally, so agility and flexibility will be key to success.