As organisations attempt to efficiently store and operationalise data, understanding how to transition to the cloud is a critical initial step. Vincent Gaorekwe, CTO at BITanium, tells Intelligent CIO Africa and Intelligent Data Centres how companies can adapt and succeed in an ever-changing data landscape.
What are key motivators driving organisations’ decisions to shift to the cloud?
Cloud computing is simple and cost-effective.
Scaling up using on-premises infrastructure is an expensive affair. It requires additional investment in hardware, network equipment, software licences and in-house technicians. When you factor in maintenance and installation costs, these expenses go through the roof.
When you move to the cloud, you can eliminate the investment you make in redundant infrastructure. Although cloud computing involves initial setup costs and training, you can achieve economies of scale at a much faster rate compared to on-premises infrastructure. Cost-effectiveness is one of the major reasons businesses across the globe are choosing the cloud over traditional systems.
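The capex-versus-opex trade-off above can be sketched as a simple break-even calculation. All figures here are hypothetical assumptions for illustration, not vendor pricing: a large up-front on-premises investment with low monthly maintenance, versus a small cloud setup cost with higher usage-based monthly spend.

```python
# Illustrative comparison of cumulative cost: on-premises capex vs cloud
# pay-as-you-go. Every figure is a hypothetical assumption.

def on_prem_cost(months: int, capex: float = 120_000, monthly_opex: float = 2_000) -> float:
    """Up-front hardware/licence investment plus ongoing maintenance."""
    return capex + monthly_opex * months

def cloud_cost(months: int, setup: float = 5_000, monthly_usage: float = 6_000) -> float:
    """Smaller setup/training cost plus usage-based monthly spend."""
    return setup + monthly_usage * months

# Break-even point: the month where cloud's cumulative spend catches up.
for m in range(1, 61):
    if cloud_cost(m) >= on_prem_cost(m):
        print(f"Costs converge around month {m}")
        break
```

With these assumed numbers, the cloud stays cheaper for roughly the first two and a half years; the real break-even depends entirely on workload and pricing, which is why the pay-as-you-go model matters.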
The cloud is flexible and scalable.
One of the major benefits of using the cloud is its scalability. Cloud services can scale up or down very easily based on user requirements. If you use only on-premises infrastructure, you need to invest heavily in physical servers, networking equipment and software licenses to scale up your growing business.
The cloud offers better insights from big data.
Businesses generate huge volumes of both structured and unstructured data on an everyday basis, collectively known as big data.
Deriving valuable insights from big data requires cost-effective ways of information processing. On-premises storage systems may not be able to keep up with high-volume data generation in the long run. Even if you try to do everything the traditional way, you need substantial investment in your infrastructure to make this happen.
The cloud drives collaboration efficiency.
Cloud computing can also enable greater efficiency in the work processes of organisations. Cloud technology allows collaboration on a much larger scale among employees within an organisation, letting multiple users from different departments or geographic locations access the required information.
The cloud ensures business continuity and disaster recovery.
The cloud has dramatically changed how businesses store and retrieve data. This especially comes in handy when businesses need to recover quickly following an unforeseen disaster. Cloud backup helps businesses recover their data quickly so they can continue their operations without any downtime.
Why are organisations today increasingly looking at public clouds as a key enabler for executing their enterprise data warehouse (EDW) modernisation strategy?
In today’s world, enterprise data warehouses are a critical component of any organisation’s technology ecosystem. They provide the backbone for a range of use cases such as business intelligence (BI) reporting, dashboarding and machine learning (ML)-based predictive analytics, which enable faster decision-making and insights.
The rise of the cloud has allowed data warehouses to provide new capabilities such as cost-effective data storage at petabyte scale, highly scalable compute and storage, pay-as-you-go pricing and fully managed service delivery. Companies are shifting their investments to cloud software and reducing their spend on legacy infrastructure.
How can enterprises accelerate time-to-value through end-to-end management of transactional, operational and analytical data across any cloud?
Cloud-based data warehouse solutions can offer faster, easier scalability than on-premises environments. You can start with a short-term proof of concept and then scale up to match the requirements of the project when it goes live.
What are the current pain points organisations face in efficiently storing and operationalising their data?
Consolidating Data from Disparate Sources
Consolidating data from disparate sources into a single system is one of the greatest challenges all organisations grapple with.
Getting Real-Time Insights
Another pain point is getting real-time insights. Moving from analysing static data to handling inputs in real time adds a great deal of complexity. While innovations such as fully automated Extract, Transform and Load (ETL) engines make life easier, organisations looking to use real-time insights still have a long way to go.
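The extract-transform-load flow mentioned above can be sketched as an incremental pipeline that processes records as they arrive rather than in one static batch. This is a minimal illustrative sketch, not any particular ETL engine; the record format and cleaning rules are assumptions.

```python
# Minimal sketch of an ETL pipeline over a stream of records.
# Generators let each stage process rows incrementally.
from typing import Iterable, Iterator

def extract(raw_events: Iterable[str]) -> Iterator[dict]:
    """Parse raw 'user,amount' lines into records, skipping malformed input."""
    for line in raw_events:
        parts = line.strip().split(",")
        if len(parts) == 2:
            yield {"user": parts[0], "amount": parts[1]}

def transform(records: Iterator[dict]) -> Iterator[dict]:
    """Normalise types and drop records that fail validation."""
    for r in records:
        try:
            r["amount"] = float(r["amount"])
        except ValueError:
            continue  # discard bad rows instead of failing the whole stream
        yield r

def load(records: Iterator[dict], sink: list) -> None:
    """Append cleaned records to the target store (a list stands in here)."""
    sink.extend(records)

warehouse: list = []
stream = ["alice,19.99", "bob,not-a-number", "carol,5.00", "broken line"]
load(transform(extract(stream)), warehouse)
print(warehouse)  # only the two valid records survive
```

The complexity a real-time system adds is exactly what this sketch glosses over: late or out-of-order events, schema drift and backpressure, which is why fully automated ETL engines still leave work to do.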
Siloed Analytics and Competing Results
If one department of an organisation has access to certain types of information and other departments don’t, this can produce skewed results across the organisation.
Lack of Skills to Interpret and Apply Analytics in Business Context
Even if an organisation has the required technical staff, a lack of skills to interpret and apply analytics among business staff can be another significant barrier to using data effectively and making data-driven decisions.
Data Governance and Security
Companies source data for decision-making from many internal and external data sources. However, the governance of this data is very much an issue. Another challenge for organisations is ensuring data security and integrity: because the data is so critical, even a minor incident can result in enormous losses for the organisation.
What is the next generation of IBM Netezza Performance Server and what is it doing to make it accessible to everyone across cloud, hybrid and on-premises deployments?
IBM Netezza Performance Server is the advanced cloud-native data warehouse designed for unified, scalable analytics and insights available anywhere. Built with data lake integration, Netezza Performance Server empowers you to run highly complex queries and machine learning to support critical business decisions across your organisation. It provides a secure and scalable source for your analytics that is simple to use and flexible enough to run demanding analytical workloads across fully managed cloud, hybrid and on-premises environments.
IBM Netezza Performance Server is available on IBM Cloud®, AWS, Azure and IBM Cloud Pak® for Data Systems.
Can you explain more how Netezza helps predict data warehousing costs, secures collaboration and automates maintenance and operations?
Elastic scaling handles both planned and unplanned spikes in data warehouse requests or demand for computing power and helps accommodate these short-term needs while continuing to meet service-level agreements (SLAs) for downstream applications. Elastic computing also enables elastic pricing—you can scale up and down while only paying for what you use. The Netezza Performance Server is a cloud-based data warehouse that can be brought up and taken down with minimal investment in time and resources. Netezza provides you with the ability to move to the public cloud or augment your on-premises Netezza Performance Server for IBM Cloud Pak for Data System environment with data marts. For example, you can add cloud-based data warehousing resources for those few days each quarter when your regional offices need to run financial analyses.
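The quarter-end example above can be made concrete with a small sketch of elastic scaling: the cluster follows hourly demand, and the bill accrues only for node-hours actually used. The demand figures, node capacity and per-node-hour rate are all hypothetical assumptions, not Netezza pricing.

```python
# Sketch of elastic scaling and elastic pricing: node count follows demand
# each hour; you pay only for node-hours used. All numbers are assumed.

NODE_CAPACITY = 100       # queries per node per hour (assumed)
RATE_PER_NODE_HOUR = 4.0  # currency units per node-hour (assumed)
MIN_NODES, MAX_NODES = 1, 16

def nodes_needed(queries_per_hour: int) -> int:
    """Scale to demand, clamped to the allowed cluster size."""
    needed = -(-queries_per_hour // NODE_CAPACITY)  # ceiling division
    return max(MIN_NODES, min(MAX_NODES, needed))

# Quiet hours, then a spike when regional offices run financial analyses.
hourly_demand = [50, 60, 900, 1200, 80]
bill = sum(nodes_needed(q) * RATE_PER_NODE_HOUR for q in hourly_demand)

# A fixed on-premises cluster must be sized for the peak the whole time.
fixed_cluster_bill = MAX_NODES * RATE_PER_NODE_HOUR * len(hourly_demand)
print(f"elastic: {bill}, fixed-for-peak: {fixed_cluster_bill}")
```

The gap between the two totals is the whole argument for pay-as-you-go: capacity sized for a brief peak sits idle, and paid for, the rest of the time.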
What is the future of running complex analytics on very large data volumes with speed and simplicity?
Data warehouses have a long history in decision support and business intelligence applications, but they were ill-suited to, or prohibitively expensive for, handling unstructured data, semi-structured data, and data with high variety, velocity and volume. Data lakes then emerged to handle raw data in a variety of formats on cheap storage for data science and machine learning, but they lacked critical features of data warehouses: they do not support transactions, they do not enforce data quality, and their lack of consistency and isolation makes it almost impossible to mix appends and reads, or batch and streaming jobs.
A data lakehouse is a new data management architecture that combines the flexibility, cost-efficiency and scale of data lakes with the data management and ACID transactions of data warehouses, enabling business intelligence (BI) and machine learning (ML) on all data.
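One common way lakehouse designs layer transactions over plain files is an append-only commit log: data files are written first, then made visible by atomically committing a log entry, so readers only ever see fully committed data. The sketch below illustrates that idea in miniature using atomic file renames; the layout and naming are invented for illustration and do not reflect any product's actual format.

```python
# Minimal sketch of a lakehouse-style commit log over plain files.
# Data files are written first, then published by atomically renaming a
# log entry into place; readers trust only files the log names, so a
# crashed or concurrent write never exposes partial data.
import json
import os
import tempfile

def commit(table_dir: str, data_filename: str, version: int) -> None:
    """Atomically publish a data file by renaming a log entry into place."""
    entry = {"version": version, "add": data_filename}
    fd, tmp = tempfile.mkstemp(dir=table_dir)
    with os.fdopen(fd, "w") as f:
        json.dump(entry, f)
    # os.rename is atomic on POSIX: the version exists fully or not at all.
    os.rename(tmp, os.path.join(table_dir, f"{version:08d}.json"))

def visible_files(table_dir: str) -> list:
    """A consistent snapshot: only files referenced by committed log entries."""
    files = []
    for name in sorted(os.listdir(table_dir)):
        if name.endswith(".json"):
            with open(os.path.join(table_dir, name)) as f:
                files.append(json.load(f)["add"])
    return files

table = tempfile.mkdtemp()
open(os.path.join(table, "part-0.parquet"), "w").close()  # data written first...
commit(table, "part-0.parquet", version=0)                # ...then made visible
print(visible_files(table))
```

This is the property data lakes alone lack: without the log, a reader scanning the directory mid-write could see half-written data, which is why mixing appends and reads is so fragile there.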
Netezza Performance Server now makes it easy to perform analytics in your data lake using its robust massively parallel execution engine and SQL on Parquet capabilities.
What advice would you give to organisations who want to achieve their data goals while ensuring their warehouse is built for simplicity, sophistication, scale and speed?
One of the biggest challenges facing organisations is having the expertise to cope with the changing data landscape and demands for more data. This is why we have invested so heavily in our team over the years, to the point where we now have the largest certified team of IBM Netezza professionals outside the United States. Regarding advice: whatever technology or architecture works best for you, make sure you have a team capable of supporting your organisation throughout the design, build and run process.