Exploring what’s next in tech – Insights, information, and ideas for today’s IT and business leaders

Managing storage: It's all about the data

Simplifying storage management frees organizations to focus on putting their data to use.

By now, most IT organizations have shifted a massive amount of information to public clouds in an effort to make everything more widely accessible, and to affordably hand off responsibility for those bits and bytes to qualified third parties like Microsoft Azure and Snowflake. For years, cloud has set the standard when it comes to agility. Line-of-business leaders and DevOps teams value the simplicity, self-service, automation, and manage-from-anywhere capabilities the cloud experience gives them. But most enterprises, especially those with an aversion to risk, have continued to keep proprietary and sensitive information on premises.

All of that on-premises data, roughly a third of all workloads, still has to be managed. And as organizations have grown more digital and cloud-centric, effectively managing local data infrastructure has become nearly impossible. In fact, 82 percent of executives in a recent Enterprise Strategy Group (ESG) survey said storage management is a key challenge, and 93 percent said it is impeding their digital transformation.

Please read: Enterprise storage agility: 3 keys for success

"Infrastructure simply cannot keep up with the speed of business today," says Scott Sinclair, a senior analyst who follows storage systems for ESG. "Traditional data and storage management is too complex, it's highly manual, and it requires more expertise than most enterprises have because there are so many knobs and levers that need to be turned and pulled for everything to run smoothly."

That's why a new data sheriff is coming to town: an alternative approach called Unified DataOps, which brings cloud management capabilities to the enterprise through subscription services.

Traditional storage management is so yesterday

"This model makes traditional storage management passé," says Simon Watkins, worldwide product marketing manager for storage at Hewlett Packard Enterprise. "It shifts the paradigm from being about how you manage your vast storage arrays to how you get the most from your data, no matter where it resides."

Unified DataOps works by creating cloud data services and cloud-native infrastructure platforms for accessing and overseeing all of an organization's data and application workloads, from the core network to its edge and out to the public cloud.

These cloud data services and platforms are highly automated with artificial intelligence (AI) and machine learning (ML), which means they can deploy, provision, secure, and adjust storage capacity with minimal human intervention. Importantly, they also abstract the storage management lifecycle, bringing simplicity to an often opaque process. And they are delivered as a service, so organizations pay only for the data and apps they currently manage.

Please read: Cloud backup should put you in charge

Watkins says one of the most attractive aspects of the model is that Unified DataOps collapses not just data silos but data and infrastructure management silos. In other words, instead of chasing the data with multiple management tools, you can manage it wherever it lives from a single destination, a single pane of glass with cloud-native control.

Dealing with data management silos

Most companies are plagued by this kind of complexity; one in four has more than 50 distinct data silos, according to a 451 Research survey. From an operations standpoint, this is troublesome. When it gets in the way of innovation or hampers how companies deliver customer experiences, it can be financially devastating.

Consider this example: Every marketer today knows you have to provide personalized experiences to customers or they will leave you. So savvy companies push out a steady stream of customized offers, promotions, and perks based on consumers' browsing and purchase histories. When they do this well, 60 percent of consumers are more likely to become repeat buyers.

But when they fail, 44 percent of consumers are likely to take their business elsewhere. Customer data is the lifeblood of such efforts, and if internal departments cannot share it freely and in a timely manner, true personalization isn't possible. When these business processes break down, it is often because companies have not unified how they manage their data and data infrastructure, leaving them unable to access the information that would lead to a successful business outcome.

"Unfortunately, too often, organizations don't connect the digital dots to deliver those kinds of experiences or to pursue opportunities that could benefit the business, and that's a shame," Watkins says.

The trouble with tools

Sometimes IT teams avoid going down that path because they are too overwhelmed by the massive amounts of data coming their way to do much more than scramble to manage it. Other times, it's the result of a cultural tendency to cling to old approaches that worked well in the past. As data sprawl intensifies, some leaders reason that they can easily throw more management tools at the challenge and be fine.

Please read: AI-driven storage soars on the pay-as-you-go current

Nothing could be further from the truth, says ESG's Sinclair.

"The average organization today uses 23 different data management tools," he says. "That makes no sense. Who is using them? One person? Multiple people? Are they talking to each other? How much time are they wasting in meetings trying to correlate data collected from these 23 tools? Do they even have enough qualified staff to operate all those tools? This problem is spinning out of control. No wonder 59 percent say data management is increasing in complexity."

With a cloud data services approach like Unified DataOps, however, organizations can think in new ways about data management. Adoption enables a unified data experience that streamlines and simplifies workflows across the data and infrastructure lifecycle. Most notably, organizations can quickly balance or move workloads across on-premises, hybrid, multicloud, and public cloud environments. And because AI and ML are involved, the system can automate and optimize otherwise laborious application deployment: IT teams simply enter the name of an app into a cloud console along with the amount of storage it will need, and the AI generates recommendations for where and how best to deploy it.

"So it reduces provisioning from what used to be a day-long process to mere minutes," says Watkins. "That automation can radically improve application development and deployment lifecycles."

Accelerating decision-making

Not everything needs to be handed over to cloud-based machines, though. When urgent or crisis situations such as cyberattacks or looming equipment failures call for human involvement, the AI can reference historical data and make recommendations to enhance and hasten decision-making. This is valuable because, when time is short, people often can't review all the relevant information, and many will just trust their gut. While that may occasionally pay off, it can also lead to disaster in complex scenarios.
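As a rough illustration of that kind of decision support, the sketch below matches current storage telemetry against a handful of labeled historical incidents and surfaces the closest precedent along with its resolution. Every name, number, and the distance heuristic here is hypothetical; a production system would draw on far richer models and data.

```python
import math

# Hypothetical labeled history of past incidents and how they were resolved.
history = [
    {"latency_ms": 480, "error_rate": 0.30, "label": "failing drive",
     "action": "migrate volumes off array, replace drive"},
    {"latency_ms": 90, "error_rate": 0.02, "label": "normal load spike",
     "action": "no action needed"},
    {"latency_ms": 250, "error_rate": 0.45, "label": "ransomware pattern",
     "action": "isolate host, restore from immutable snapshot"},
]

def recommend(current):
    """Nearest-neighbor lookup: return the historical incident whose
    (scaled) telemetry is closest to the current readings."""
    def distance(event):
        return math.hypot(
            (event["latency_ms"] - current["latency_ms"]) / 100.0,
            (event["error_rate"] - current["error_rate"]) * 10.0,
        )
    return min(history, key=distance)

incident = {"latency_ms": 300, "error_rate": 0.5}
match = recommend(incident)
print(match["label"], "->", match["action"])
# → ransomware pattern -> isolate host, restore from immutable snapshot
```

The human still decides; the system's contribution is compressing "what happened last time this pattern appeared?" from hours of log review into an instant, hedged suggestion.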

Watkins says a sometimes-overlooked advantage of shifting to a cloud or unified data management approach is that it allows IT leads to monitor and manage both critical and non-critical situations anytime and from almost anywhere. With more of the world working remotely because of COVID-19, that could be huge, he says.

Please read: Operationalizing machine learning: The future of practical AI

"If you're an IT manager for a large manufacturing company and you're sitting on a beach somewhere, you could globally monitor and control your entire environment because you've got a 100 percent cloud-managed infrastructure," Watkins says. "Being able to do that from a single console, while having built-in automated intelligence with AI, is a powerful capability that hasn't existed in on-premises storage."

Sinclair agrees.

"The industry and vendor community needs to move in the direction of integrating intelligence in as many places as possible so organizations can extract maximum value from their data," he says. "It isn't easy, and it will take time. But I see things moving in the direction of this Unified DataOps vision."

It's time to take advantage of cloud operations, data-centric policies, and automation and AIOps across the data lifecycle to simplify data management.

This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.