How to overcome enterprise storage problems


1. Data blindness pain

Many storage systems are not very good at data awareness. They are, for lack of a better word, dumb or mute: most storage systems don't tell you the details of your data or what clients are currently doing with it. It is possible to get answers to these questions in other ways - but all of those approaches add complexity.

Your storage system should be able to answer specific questions about your data, such as: What is consuming all this throughput? Where the heck did my capacity go on Sunday? What's eating up my capacity right now? What do I need to back up? What can I safely archive? When do I need more storage?

Most high-end storage providers offer visibility tools. After all, the storage in your data center is best positioned to tell you about itself and about the clients that access it.

To get a handle on your data, research data visualization tools - do they answer the questions you've been asking of your incumbent solution lately? Can you monitor and control data access and usage in real time?

If you value integration with your management systems, require API access. If your storage vendor doesn't provide it, ask for it.
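If the vendor does expose an API, even a small script can answer the questions above on a schedule instead of waiting for someone to open a dashboard. Here is a minimal Python sketch that assumes a hypothetical REST layout - the /v1/analytics/... endpoints, base URL, and token auth are placeholders, not any particular vendor's API:

    import requests

    STORAGE_API = "https://storage.example.com/v1"      # hypothetical base URL
    HEADERS = {"Authorization": "Bearer <api-token>"}    # placeholder token auth

    def capacity_trend(days=7):
        """Fetch daily used-capacity samples: where did my capacity go last week?"""
        resp = requests.get(f"{STORAGE_API}/analytics/capacity-history",
                            params={"days": days}, headers=HEADERS, timeout=30)
        resp.raise_for_status()
        return resp.json()

    def top_throughput_clients(limit=5):
        """Ask the system which clients are consuming the most throughput right now."""
        resp = requests.get(f"{STORAGE_API}/analytics/activity/clients",
                            params={"sort": "throughput", "limit": limit},
                            headers=HEADERS, timeout=30)
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        for sample in capacity_trend():
            print(sample)
        for client in top_throughput_clients():
            print(client)

Pointed at a real API, a script like this can feed your existing monitoring system - which is exactly why API access is worth insisting on.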

"Moving from no analysis of legacy systems to comprehensive analysis of Qumulo has changed the game. Accurate, real-time trending information improves decisions, which saves lives." - David Emery, Senior Systems Administrator, Riverside County Sheriff's Department

2. Data loss pain

It hurts me to say it. I cringe at the very words "data loss." To state the obvious, data protection is very important. In this industry, data is the thing we actually work on and transform - so lost data is lost time, lost money, and lost jobs. And in the case of a successful ransomware attack, it can mean a loss of business continuity. For these reasons, you should look at systems that help you protect your data storage as part of a holistic security strategy. Here are a few suggestions:

Make sure the "rebuild performance" matches the "drive population." If it's going down, it's moving in the wrong direction. It needs to increase as the drive population increases. There needs to be some sort of parallel rebuild system.

Stay at the lowest practical level of protection - don't jump to a higher level in an attempt to protect against everything; it will cost you capacity and increase the cost of small random writes.
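To make that tradeoff concrete, the sketch below computes the parity capacity overhead and the worst-case read-modify-write cost of a small, sub-stripe write for a few hypothetical erasure-coding layouts (the data/parity combinations are illustrative, not recommendations):

    # Illustrative erasure-coding tradeoff: capacity overhead vs. small-write cost.
    def overhead_pct(data_blocks, parity_blocks):
        # Extra raw capacity spent on parity, as a share of usable capacity.
        return 100.0 * parity_blocks / data_blocks

    def small_write_ios(parity_blocks):
        # Read-modify-write of one data block: read old data + old parity,
        # then write new data + new parity.
        return 2 * (1 + parity_blocks)

    for d, p in [(4, 2), (8, 2), (16, 4)]:
        print(f"{d}+{p}: {overhead_pct(d, p):5.1f}% overhead, "
              f"{small_write_ios(p)} I/Os per small random write")

More parity means both more raw capacity consumed and more I/Os for every small random write, which is why protection beyond what you actually need comes at a real cost.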

Some object and scale-out systems provide per-file data protection. Avoid this if you can: with a small file count it may not be a big deal, but as the file count grows, this strategy will not scale.

Make sure the business continuity strategy includes a disaster recovery plan that adds an extra layer of defense by backing up data storage in the cloud.

"Somewhat NAS solutions performed well on large files, others on small, but Qumulo was good on both. This was the first factor that really counted for us, as it allowed us to do computations directly from the storage array. No more jumping back and forth between the array and the servers, with all the associated risks of data loss." - Vincent Julié, Chief Operating Officer, Portal

3. Data locality pain

Does this pain resonate with you or someone you know? It goes like this: "I have data on-premises that also needs to be in the cloud! Or, I made a mistake moving it to the cloud and now I want to repatriate my data! Or, I have multiple sites and just need to share a namespace between them!"

In this interconnected world, more and more companies of all sizes are opening multiple sites, whether physical or virtual. Users in different locations and with complementary roles need access to the latest data sets to collaborate efficiently.

Ensure that data can be moved easily using replication and/or optimized data transfer.

Ensure that the file system provider can run its software effectively locally and in the cloud.

You should prioritize the flexibility of your storage solution beyond the scalability mentioned above. The ease of moving data through tools such as replication, caching, or tiering can provide great value if you have multiple physical or virtual sites.

4. Data migration to the cloud pain

Knowing "what" makes sense to move to the cloud can help you avoid the unpredictable cost issues, data locality issues (mentioned above) and shared responsibility for security issues. Sounds complicated, but it's really just a matter of common sense. What high-performance workloads make sense in the cloud versus on-premises? Your use case should help you decide whether to run in a private cloud, public cloud or hybrid cloud environment.

For ultimate flexibility, however, you need a file system that supports workloads both on-premises and in the cloud. Ask whether your storage provider has a cloud strategy: Can it run the same software in the cloud? Is it easy to move data to and from the cloud? What file access protocols are supported, and do they cover the apps you need to run in the cloud? Can your file system scale capacity linearly with performance? Can you run your high-performance workloads at the scale you need, or is there a limit? There are plenty of tricks and tradeoffs to explore in a cloud migration - but the key is flexibility.

Just say no to data minimums and caps. You should be able to run in the cloud and store and manage as much or as little file data as you want - whether you're using a managed service or managing your own workloads in the cloud. Choose carefully, and you won't have to worry about data minimums or caps as your workloads change.

Make sure you have the flexibility of a hybrid cloud strategy so you can run in a private cloud, hybrid cloud or multiple public clouds and take advantage of multiple cloud providers, from cloud-native services to managed cloud services to industry-specific virtual environments. Flexibility also means multi-protocol file access to a single namespace (file data lake) that is easily accessible from your data center and multi-clouds.

Insist on controllable, predictable costs by ensuring your file system or managed service offers a fixed price per TB. That way, expenses stay predictable and you avoid surprises.
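"Predictable" here is simple arithmetic: with a flat price per TB, projected capacity growth translates directly into projected spend, with no per-request or egress line items to model. The figures below are made-up assumptions, just to show the shape of the calculation:

    # Hypothetical numbers: projecting spend under a flat price-per-TB model.
    PRICE_PER_TB_MONTH = 30.0   # assumed flat rate, USD per TB per month
    capacity_tb = 500.0         # assumed starting capacity
    growth_rate = 0.03          # assumed 3% capacity growth per month

    for month in range(1, 13):
        print(f"month {month:2d}: {capacity_tb:7.1f} TB -> ${capacity_tb * PRICE_PER_TB_MONTH:,.0f}")
        capacity_tb *= 1 + growth_rate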

Many companies are taking the first step into the cloud by having a backup and disaster recovery second site in the cloud. This allows them to create backups of their most sensitive data outside the data center in case ransomware or a natural disaster compromises their main site. To do this, however, your file system must allow you to move your data to and from the cloud.
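If object storage is the landing zone for that second copy, the move can start as a simple scheduled upload job. Below is a minimal sketch using boto3 (the AWS SDK for Python); the bucket name and source path are placeholders, and in practice your file system's own replication or snapshot tooling may be the better fit:

    import pathlib
    import boto3

    BUCKET = "example-dr-backups"                         # placeholder bucket name
    SOURCE = pathlib.Path("/mnt/storage/critical-share")  # placeholder source path

    s3 = boto3.client("s3")

    def upload_tree(source: pathlib.Path, bucket: str, prefix: str = "dr/") -> None:
        # Copy every file under the source directory, preserving relative paths as object keys.
        for path in source.rglob("*"):
            if path.is_file():
                key = prefix + str(path.relative_to(source))
                s3.upload_file(str(path), bucket, key)
                print(f"uploaded {path} -> s3://{bucket}/{key}")

    if __name__ == "__main__":
        upload_tree(SOURCE, BUCKET)

Whatever the mechanism, the point stands: the file system has to make moving data to and from the cloud easy, or the second site never gets built.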

Before moving to the cloud, find out about the storage provider's track record of customer satisfaction. You shouldn't be left alone to figure it out. Check their NPS scores to see how their customers rated them.

Don't go it alone. Check whether your data storage partner has an experienced ecosystem of integration partners to draw on when creating and executing your cloud migration strategy.
