“Data never stops. Data’s there, it’s got gravity,” says Cloud + Data Center Transformation (CDCT) SVP and GM Shawn O’Grady in a video of the same name as this article.
Surely, these statements have never felt truer. We generate roughly 2.5 quintillion bytes of data every day, and ninety percent of all data in history was created in just the last two years. Industry analysts forecast that data growth will continue surging at an aggressive pace. And as organizations embrace new technologies such as Artificial Intelligence (AI), machine learning, and GPU computing, the ability to access that data will be paramount to remaining competitive. Access is only one of many data decisions, alongside storage and related costs, data protection, integrated security, and more.
Meanwhile, the cloud is an attractive platform option. Cloud and hybrid cloud models are helping businesses increase the rate and scale of innovation.
For those developing new workloads in the cloud, it’s natural to wonder whether data should reside there as well. But this is a big question given that, currently, most enterprise data lives in private infrastructure, and the majority of applications (58%) are not optimized for the cloud, according to a recent Insight-commissioned IDG survey. As CDCT Chief Architect Juan Orlandini explains in this video, remedying this can be quite a lot of work.
At CDCT, we often find that companies reconsider the location of their data when preparing to make a big data storage investment. The question at hand is whether that storage should be procured in the cloud or on-premises. They are wise to ponder this: enterprise cloud users waste an estimated 27–35% of their cloud spend.
The overall landscape has changed, though, as leading technology makers have disaggregated data management software from the physical layer on which data is stored. Traditional offerings were packaged appliances containing task-driven software that handled networking, storage, data protection, and so on, running on industry-standard hardware. We’re now seeing countless examples where vendor offerings communicate a new paradigm. As O’Grady says, “It’s the software that matters, and the data can be anywhere, including in the cloud.”
So, if data can be anywhere, how do you make sure it’s optimally positioned to support your business goals and outcomes? Here are key considerations to make:
- What kind of data are you evaluating?
  - Data in motion?
  - Data at rest?
  - Long-term retention data?
  - Primary data critical for daily operations?
- In terms of applications and workloads:
  - What are the dependencies involved?
  - Are you envisioning a lift-and-shift move to the cloud? Moving a VM to the cloud? Something else?
  - Do you want to rewrite applications for containers? PaaS? Cloud-native services?
- From a macro perspective:
  - Is your network prepared for the cloud?
  - How will you ensure cloud security?
  - How do you plan to protect your data and workloads in the cloud?
  - What compliance considerations need to be made?
  - Do you have an IT governance program in place? If so, how will you update it to account for the impacts of cloud?
  - What long-term value are you hoping to extract from your data?
  - How are you planning to leverage new technologies and models such as AI, machine learning, and GPU computing?
A logical first step might be an inventory: what data, applications, and other assets do you actually have in your IT estate? Shadow IT is a growing concern, as rogue divisions or teams adopt cloud platforms independently of the IT organization. DevOps can add to this issue, as can the proliferation of monitoring tools and related data sets, as CDCT National Portfolio Director – Consulting Services Peter Kraatz points out in this video.
Clearly, there's much to think about, and each consideration requires input from more stakeholders than simply those in your IT organization. According to a recent IDG survey, determining which workloads should move to the cloud is most often ranked as the number one challenge in executing a cloud strategy. Addressing new tool needs, managing change, and choosing cloud deployment models also pose difficulties.
O’Grady offers up his top takeaway: “More and more clients are asking us where their data should go — that’s a great question. And I don’t think there’s a company in the world — and I’m not exaggerating — that can answer that question better than we can, because of our legacy knowledge of data, data management and storage, combined with newer knowledge around public cloud and cloud workloads, and our relationships with major cloud vendors like Microsoft.” In short, CDCT is a well-qualified resource to help you refine your data strategy and make well-informed platform decisions. Contact us today to start the conversation.