As data storage technology has evolved to offer more options for different use cases (the flavor of today is AI-ready storage), determining the right path for a data storage refresh requires a data-driven approach.

Decisions for new data storage must also factor in user and business needs across performance, availability and security. Forrester found that 83 percent of decision-makers are hampered in their ability to leverage data effectively by challenges such as outdated infrastructure, teams overwhelmed by the sheer volume of data, and a lack of effective data management across on-premises and cloud storage silos. Leveraging cloud storage and cloud computing, where AI and ML technologies are maturing fastest, is another prime consideration.

Given the unprecedented growth in unstructured data and the growing demand to harness this data for analytical insight and AI, the need to get it right has never been more essential. This article provides guidance on that topic by highlighting what not to do when performing a data storage refresh.

Mistake 1: Making Decisions without Holistic Data Visibility

When IT managers discover that they need more storage, it’s tempting to simply buy more than they need. But overprovisioning can lead to waste, or leave them locked into the wrong storage technology later.

A majority (80%) of data is typically cold and not actively used within months of creation, yet it consumes expensive storage and backup resources. Plus, given that you can now purchase additional storage instantly and on demand in the cloud and with storage-as-a-service on-premises, there’s no reason to overprovision.

To avoid this common conundrum, get insights on all your data across all storage environments. Understand data volumes, data growth rates, storage costs and how quickly data ages and becomes suitable for archives or a data lake for future data analytics.
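A minimal sketch of that kind of assessment, assuming a POSIX file share: the script below walks a directory tree and totals how much capacity has not been touched within a chosen window. The /mnt/share path and the 180-day threshold are illustrative placeholders, not recommendations.

```python
import os
import time
from pathlib import Path

# Hypothetical mount point and "cold" threshold -- adjust for your environment.
SHARE_ROOT = Path("/mnt/share")
COLD_AFTER_DAYS = 180

def scan_share(root: Path, cold_after_days: int) -> tuple[int, int]:
    """Return (total_bytes, cold_bytes) for all files under root."""
    cutoff = time.time() - cold_after_days * 86400
    total = cold = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            try:
                st = os.stat(os.path.join(dirpath, name))
            except OSError:
                continue  # skip files we cannot read
            total += st.st_size
            # Count files whose last access predates the cutoff as cold.
            if st.st_atime < cutoff:
                cold += st.st_size
    return total, cold

if __name__ == "__main__":
    total, cold = scan_share(SHARE_ROOT, COLD_AFTER_DAYS)
    if total:
        share = 100 * cold / total
        print(f"{total / 1e12:.2f} TB total, {cold / 1e12:.2f} TB "
              f"({share:.0f}%) untouched for {COLD_AFTER_DAYS}+ days")
```

On volumes mounted with noatime, last-modified time (st_mtime) is a more reliable proxy than last-access time; either way, the goal is to quantify how much data has aged out of active use before sizing new primary capacity.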

These basic metrics can help guide more accurate decisions, especially when combined with a FinOps tool for cost modeling different options. The need to manage increasing volumes of unstructured data across multiple technologies and environments, for many different purposes, is leading to data-centric rather than storage-centric decision-making across IT infrastructure.
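As a rough sketch of the cost modeling such a tool performs, the example below compares keeping all data on primary storage against tiering the cold portion to object storage. The per-GB prices and the 70 percent cold ratio are placeholder assumptions; substitute your own quotes and scan results.

```python
# Hypothetical monthly prices per GB -- replace with real vendor quotes.
PRIMARY_PER_GB = 0.10   # all-flash NAS, including support and maintenance
OBJECT_PER_GB = 0.02    # cloud or on-prem object storage tier

def monthly_cost(total_gb: float, cold_fraction: float) -> dict[str, float]:
    """Compare keeping everything on primary storage vs. tiering cold data."""
    cold_gb = total_gb * cold_fraction
    hot_gb = total_gb - cold_gb
    keep_all = total_gb * PRIMARY_PER_GB
    tiered = hot_gb * PRIMARY_PER_GB + cold_gb * OBJECT_PER_GB
    return {"keep_all": keep_all, "tiered": tiered, "savings": keep_all - tiered}

# Example: a 500 TB estate where the scan found 70% cold data (assumed figures).
print(monthly_cost(500_000, 0.70))
```

A FinOps tool adds egress, retrieval and data protection costs to this picture, but even a back-of-the-envelope model like this makes overprovisioning harder to justify.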

Mistake 2: Choosing One-Size-Fits-All Storage

Storage solutions come in many shapes and forms – from cloud object storage to all-flash NAS, scale-out on-prem systems, SAN arrays and beyond. Each type of storage offers different tradeoffs when it comes to cost, performance and security.

As a result, different workloads are best supported by different types of storage. An on-prem app that processes sensitive data might be easier to secure using on-prem storage, for instance, while an app with highly unpredictable storage requirements might be better suited by cloud-based storage that can scale quickly.

This again points to the need to analyze, segment and understand your data. The ability to search across data assets for file types or metadata tags can help identify which data belongs on which tier and better inform its management. Avoid the one-size-fits-all approach by provisioning multiple types of storage solutions that reflect your different needs.
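A hedged sketch of that kind of segmentation, assuming you already have a file inventory export (from a scan or a data management tool) with path, extension, size and tag columns: the snippet totals capacity per file type for everything carrying a given tag, such as data from a closed project that is a candidate for archiving. The inventory.csv file and its column names are assumptions for illustration.

```python
import csv
from collections import defaultdict

def sizes_matching(inventory_csv: str, tag: str) -> dict[str, int]:
    """Total bytes per extension for inventory rows that carry the given tag."""
    totals: dict[str, int] = defaultdict(int)
    with open(inventory_csv, newline="") as f:
        for row in csv.DictReader(f):
            # Tags are assumed to be semicolon-delimited in the export.
            if tag in row["tags"].split(";"):
                totals[row["extension"]] += int(row["size_bytes"])
    return dict(sorted(totals.items(), key=lambda kv: kv[1], reverse=True))

# Example: capacity of everything tagged "project-closed", a candidate for archive.
for ext, size in sizes_matching("inventory.csv", "project-closed").items():
    print(f"{ext:>10}  {size / 1e9:8.1f} GB")
```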

Also, less than 25% of data costs are in storage: the bulk of the costs are in the ongoing backup, disaster recovery and protection of the data. So, consider the right storage type and tier as well as the appropriate data protection mechanisms through the lifecycle of data.
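A quick worked example with illustrative numbers only: if 1 TB of primary capacity also carries a backup copy, a DR replica and snapshot overhead, the raw storage price is only a fraction of what that terabyte really costs each month.

```python
# Illustrative monthly figures for 1 TB -- substitute your own contract pricing.
PRIMARY = 100.0      # primary storage capacity
BACKUP = 120.0       # backup copies and licensing for that same 1 TB
DR_REPLICA = 90.0    # replicated copy at a DR site
SNAPSHOTS = 40.0     # snapshot / versioning overhead

total = PRIMARY + BACKUP + DR_REPLICA + SNAPSHOTS
print(f"Raw storage share of total cost: {PRIMARY / total:.0%}")  # ~29% here
```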

Mistake 3: Becoming Locked into One Vendor

Acquiring all your storage from one vendor may be the simplest approach, but it’s almost never the most cost-effective or flexible.

You can likely build more cost-effective storage infrastructure if you select from the offerings of multiple vendors. Doing so also helps protect you against risks like a vendor’s decision to raise its prices substantially or to discontinue a storage product you depend on.

If you have other vendors in the mix, you can pivot more easily when unexpected changes occur. Using a data management solution that is independent of any storage technology is also a way to prevent vendor lock-in, by ensuring that you can move data from platform to platform without the need to rehydrate it first.

Mistake 4: Moving Too Fast

A sense of urgency tends to accompany any major IT migration or update, storage refreshes included. Yet, while it’s good to move as efficiently as you can, it’s a mistake to move so fast that you don’t fully prepare for the major changes that accompany a storage refresh.

Instead, take time to collect the data you need to identify the greatest pain points in your current storage strategy and determine which changes to your storage solutions will deliver the greatest business benefits. Be sure, too, to collect the metrics you need to make informed decisions about how to improve your data management capabilities.

Mistake 5: Ignoring Future Storage Needs

You can’t predict the future, but you can prepare for it by anticipating which new requirements your storage solutions may need to support. Trends like AI, sustainability and the growing adoption of data services mean that the storage needs of the typical business are likely to change in the coming year.

To train AI models, for example, you may need storage that can stream data more quickly than traditional solutions. Likewise, implementing data services to support FinOps goals might mean finding ways to consolidate and share storage solutions more efficiently across different business units.
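One hedged way to size that streaming requirement is to estimate the sustained read throughput a training job needs from its dataset size and target epoch time, then compare that with what a given storage tier can actually deliver. All figures below are placeholders.

```python
# Placeholder figures -- replace with your own dataset and training targets.
dataset_tb = 20.0       # size of the training corpus
epoch_minutes = 60.0    # desired time for one full pass over the data

required_gbps = (dataset_tb * 1e12) / (epoch_minutes * 60) / 1e9
print(f"Sustained read throughput needed: ~{required_gbps:.1f} GB/s")
# 20 TB in one hour works out to roughly 5.6 GB/s of sustained reads,
# which is more than many traditional NAS tiers deliver to a single client.
```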

Conclusion: The Importance of a Storage Refresh

As organizations move from storage-centric to data-centric management, IT and storage architects will need to change the way they evaluate and procure new storage technologies.

The ability to analyze data to make nuanced versus one-size-fits-all storage decisions will help IT organizations navigate many changes ahead – be they cloud, edge, AI or something else still on the horizon.
