Moving your data and business apps to the cloud might seem like an obvious, cheap, hassle-free solution, but it’s not always that way. There are issues with flexibility, costs and security to consider – and in this context I’m impressed with NetApp’s Data Fabric strategy. Here’s why.
As IT infrastructure shifts from privately housed data centres to publicly hosted ones, ensuring data integrity and control across all of these environments has become extremely difficult. Even moving from one cloud to another raises the same problems, and each public cloud provider has its own set of management tools, which can really slow down adoption and make migration between the different platforms harder.
Many see the cloud as a great environment for running different workloads, from development and test right through to production. But what good is that if it proves hard to move your applications between the different cloud platforms?
And there’s another potential problem with cloud hosting: spiralling costs. As an application environment in the public cloud grows over time, so do the costs. And if it is difficult to move those applications out of the public cloud environment, you may face a multitude of problems, all of which add further cost.
Public cloud outages can also hit the availability of applications; in some cases they can mean the complete outage of a service, with little control over your data. You could find yourself at the mercy of the provider.
Data security is also high on the list of things people are weighing up in the ‘to cloud or not to cloud?’ debate, especially in the context of public cloud offerings.
How to regain control?
For all these reasons NetApp’s Data Fabric strategy is making a lot of sense to me.
The Data Fabric strategy brings to the table a single, joined-up ecosystem that puts control back into the hands of the user. It provides a single, familiar interface that makes moving data from private to public cloud much less complicated. In short, it puts you back in control.
It delivers on the following five major design principles:
- Control. Securely retain control and governance of data regardless of its location: on premises, near the cloud, or in the cloud.
- Choice. Choose cloud, application ecosystem, delivery methods, storage platforms, and deployment models, with freedom to change.
- Integration. Enable the components in every layer of the architectural stack to operate as one while extracting the full value of each component.
- Access. Easily get data to where applications need it, when they need it, in a way they can use it.
- Consistency. Manage data across multiple environments using standard tools and processes regardless of where it resides.