Did cloud kill backup?
No, but it does necessitate a reinvention of data protection for the new normal of the multi-cloud world.
This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter’s approach.
With enterprises rapidly adopting hybrid and multi-cloud infrastructure and migrating traditional workloads to the cloud, distributed architectures have become the de facto standard, but traditional backup and recovery strategies have not kept pace. A new cloud-first approach to data protection is required.
According to IDC, 70% of CIOs have a cloud-first strategy, and it is safe to assume most enterprises have a multi-cloud infrastructure, deploying applications on the best-suited cloud, whether private, public, or managed. This evolution to multi-cloud has created two transformative shifts that are disrupting the application tier of the infrastructure world.
First, next-generation applications born in the cloud are being deployed on distributed, non-relational databases such as Apache Cassandra, MongoDB, Apache HBase, and many others. As non-relational databases, they offer high availability but trade away strict consistency. For analytics applications, businesses are now rapidly deploying either on-premises analytical data stores such as Apache Hadoop/HDFS or cloud-native databases such as Amazon Redshift and Google BigQuery. To further complicate matters, these next-generation applications are deployed both on public cloud infrastructure and on on-premises private clouds.
Second, traditional data center applications are migrating to the cloud. While these applications are still predominantly deployed on relational databases such as Oracle and Microsoft SQL Server, the balance is shifting toward next-generation cloud-native databases such as Amazon DynamoDB. The explosive growth of Amazon Web Services' database business, which reached more than $2B in just three years, is but one example of this shift.
Data protection that keeps pace with your cloud migration
Any enterprise with a multitude of applications and databases is living in a multi-cloud world, and the implications are profound. From a CIO's perspective, there are several strategic takeaways.
First, applications dictate the choice of cloud. For example, if you have applications that leverage Oracle's Exadata platform, you are not going to move them to AWS, but rather to Oracle Cloud. Similarly, applications tied to Microsoft SQL Server will likely move to the Microsoft Azure public cloud or to AWS. Not surprisingly, new applications built on modern, non-relational databases are deployed on cloud-first infrastructure from the get-go.
Second, use cases cross cloud boundaries. In addition to protecting entire applications that have migrated to the cloud, organizations need to move data sets to the cloud for testing, development, or analytics; migrate inactive data to the cloud for cost efficiency; and bring data back on-premises for compliance and governance.
The bottom line is that CIOs need a new backup and recovery strategy, as part of an overall data management strategy, to thrive in the multi-cloud world. CIOs need to proactively plan and execute a data protection strategy that not only protects hyper-scale, distributed applications born in the cloud, but also provides the freedom to best leverage all of their cloud resources as dictated by application requirements.
Data protection in a multi-cloud world requires a fundamentally different approach than traditional data protection. There are a number of key capabilities to look for when choosing a backup and recovery strategy that can keep pace with your overall cloud migration:
· Cloud-first elastic – To fully harness the power of the cloud, data protection needs to be elastic and compute-based, providing seamless scalability.
· Hyper-scale and distributed – The common theme of the multi-cloud world, of next-generation applications born in the cloud and of traditional applications migrating to the cloud, is hyper-scale. Multi-cloud applications are, by definition, hyper-scale and distributed, so any data protection strategy must be grounded in addressing protection at hyper-scale.
· Application-centric – There is no concept of a LUN or an ESX VM in the cloud. All of the underlying infrastructure is exposed as cloud-native services such as Elastic Block Store (EBS) or Elastic Compute Cloud (EC2). In the cloud, the value is moving up the stack toward applications. Therefore, any data protection strategy should be application-centric rather than infrastructure-centric (e.g., LUN or VM), eliminating dependencies on the underlying infrastructure (a minimal sketch follows this list).
· Performance at scale – Multi-cloud data protection must eliminate the inherent shortcomings of legacy media-server-based architectures. Instead, data must move directly, and in parallel, from the source to the destination (see the parallel-transfer sketch after this list).
· Efficiency at scale – Deduplication technologies found in traditional data protection solutions don't work in a multi-cloud environment. Instead, look for next-generation deduplication that is application-centric and can deliver a high degree of backup storage efficiency in the cloud (see the deduplication sketch after this list).
· Global data visibility – Due to the distributed nature of multi-cloud, data protection needs to provide global data visibility, enabling backup-anywhere, recover-anywhere, and migrate-anywhere capabilities.
· Universal data portability – To maintain complete independence from the underlying multi-cloud infrastructure, data protection should provide native-format, always-consistent data versioning, enabling complete data recoverability, portability, and mobility.
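To make the application-centric point above concrete, the sketch below contrasts an infrastructure-level snapshot (an EBS volume snapshot taken through boto3) with an application-level backup of a Cassandra node (a nodetool snapshot invoked from Python). This is a minimal illustration, not any vendor's implementation; the volume ID, keyspace name, and snapshot tag are placeholders, and a production tool would coordinate snapshots across every node in the cluster.

```python
"""Infrastructure-centric vs. application-centric backup (illustrative sketch).

Assumes boto3 credentials are configured and nodetool is on the PATH;
the volume ID, keyspace, and tag below are placeholders.
"""
import subprocess
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def infrastructure_snapshot(volume_id: str) -> str:
    """Snapshot a raw EBS volume -- the backup knows nothing about the application."""
    resp = ec2.create_snapshot(
        VolumeId=volume_id,
        Description="crash-consistent volume snapshot",
    )
    return resp["SnapshotId"]

def application_snapshot(keyspace: str, tag: str) -> None:
    """Take a Cassandra-aware snapshot on this node.

    nodetool flushes memtables and hard-links SSTables, so the copy is
    consistent from the database's point of view, independent of the
    underlying volume, LUN, or VM.
    """
    subprocess.run(["nodetool", "snapshot", "-t", tag, keyspace], check=True)

if __name__ == "__main__":
    print(infrastructure_snapshot("vol-0123456789abcdef0"))
    application_snapshot("orders", "daily-2024-01-01")
```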
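The performance-at-scale bullet argues for moving data directly and in parallel from each source node to the destination rather than funneling it through a media server. The sketch below uploads per-node backup files straight to an object store (S3 via boto3) from a thread pool; the bucket name and directory layout are assumptions made purely for illustration.

```python
"""Parallel, media-server-free transfer: each node's backup files go
straight to object storage. Bucket and paths are illustrative."""
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path
import boto3

s3 = boto3.client("s3")
BUCKET = "example-backup-bucket"  # assumed bucket name

def upload_node_backup(node: str, backup_dir: Path) -> int:
    """Send every file under this node's backup directory directly to S3."""
    count = 0
    for path in backup_dir.rglob("*"):
        if path.is_file():
            key = f"cassandra/{node}/{path.relative_to(backup_dir)}"
            s3.upload_file(str(path), BUCKET, key)  # multipart handled by boto3
            count += 1
    return count

if __name__ == "__main__":
    nodes = {"node1": Path("/var/backups/node1"),
             "node2": Path("/var/backups/node2")}
    # One worker per node: transfers run in parallel, source to destination,
    # with no intermediate media server in the data path.
    with ThreadPoolExecutor(max_workers=len(nodes)) as pool:
        futures = {pool.submit(upload_node_backup, n, d): n for n, d in nodes.items()}
        for fut, node in futures.items():
            print(node, fut.result(), "files uploaded")
```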
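As a rough illustration of the deduplication idea in the efficiency bullet, the sketch below splits backup files into fixed-size chunks, hashes each chunk, and stores only chunks it has not seen before. Real products use far more sophisticated, often content-defined chunking keyed to the application's data format; this only shows the basic principle, with placeholder paths.

```python
"""Minimal chunk-level deduplication sketch (fixed-size chunks, SHA-256)."""
import hashlib
from pathlib import Path

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB; real systems often use variable-size chunks

def dedup_backup(files: list[Path], chunk_store: Path) -> tuple[int, int]:
    """Store each unique chunk once; return (total_chunks, unique_chunks)."""
    chunk_store.mkdir(parents=True, exist_ok=True)
    total = unique = 0
    for f in files:
        with f.open("rb") as fh:
            while chunk := fh.read(CHUNK_SIZE):
                total += 1
                digest = hashlib.sha256(chunk).hexdigest()
                target = chunk_store / digest
                if not target.exists():  # only new content costs storage
                    target.write_bytes(chunk)
                    unique += 1
    return total, unique

if __name__ == "__main__":
    files = [p for p in Path("/var/backups/node1").rglob("*.db") if p.is_file()]
    total, unique = dedup_backup(files, Path("/var/backups/chunks"))
    print(f"{total} chunks scanned, {unique} stored, {total - unique} deduplicated")
```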
When done right, a cloud-first data protection strategy in a multi-cloud environment can unlock benefits previously unattainable within traditional boundaries: complete availability and performance, failure resiliency for the data protection infrastructure itself, and flexible recovery point objectives ranging from minutes to hours. Application-centric deduplication can also provide space-efficient backups, reducing secondary storage cost by up to 70%.
A cloud-first data protection strategy can also facilitate hybrid adoption of the cloud by migrating data to, from, and within the cloud. Ultimately, cloud-first data protection can enable backup anywhere (in one cloud or across multiple clouds), recovery anywhere (on-premises or in the public cloud), and migration anywhere (to the cloud, across clouds, or from the cloud back to on-premises).
For CIOs looking to keep pace with cloud transformation, a cloud-first data protection strategy can also help determine the value of their data. The premise of this monetization is simple: backup or application-consistent versions allow enterprises to meet operational recovery needs, but it is monetizing the secondary data that truly drives business results. For example, running application instances directly from the secondary copy, rather than restoring data back to the primary or production instance and then bringing the application back online, can save time and money (a minimal sketch follows).
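As one possible illustration of running an application directly from a secondary copy, the sketch below pulls an application-consistent MongoDB backup from object storage into scratch space and starts a throwaway mongod against it for test, dev, or analytics, leaving the production instance untouched. The bucket, prefix, scratch path, and port are assumptions, and a real implementation would handle consistency checks, teardown, and security.

```python
"""Spin up a throwaway MongoDB instance from a secondary (backup) copy.

Assumes the AWS CLI and mongod are installed; bucket, prefix, and port
are placeholders for illustration.
"""
import subprocess
from pathlib import Path

def run_from_secondary_copy(bucket: str, prefix: str, scratch: Path, port: int = 27117):
    scratch.mkdir(parents=True, exist_ok=True)
    # Copy the backed-up data files from object storage to local scratch space.
    subprocess.run(
        ["aws", "s3", "sync", f"s3://{bucket}/{prefix}", str(scratch)],
        check=True,
    )
    # Start a separate mongod against the copy; production stays untouched.
    return subprocess.Popen(
        ["mongod", "--dbpath", str(scratch), "--port", str(port)]
    )

if __name__ == "__main__":
    proc = run_from_secondary_copy("example-backup-bucket",
                                   "mongo/daily-2024-01-01",
                                   Path("/tmp/mongo-from-backup"))
    print("test/dev instance running on port 27117, pid", proc.pid)
```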
So, did cloud kill backup? Most certainly not! But it does necessitate a reinvention of data protection for the new normal of the multi-cloud world. Digital transformation is driving mainstream adoption of multi-cloud infrastructure, ushering in a new era of hyper-scale, distributed applications that are becoming the bedrock of organizations' go-forward strategies. To keep pace with this transformation, CIOs need to ensure their data is always available, which requires a fresh look at their requirements and the technologies they use to address these challenges.