Storage Tiering

 

What is storage tiering?

Company data growth has vastly exceeded expectations, and IT and storage managers must ensure their organization's data is labeled and stored for ongoing, immediate, or much later access. Most organizations use storage tiering to place their data into defined silos based on business requirements and user demands, usually driven by specific workloads and applications. Defining storage tiers also involves many other variables that determine which data set goes into which tier and when it moves from one tier to another.

The primary goal for most organizations is to use their storage capacity cost-effectively. Storage can be used far more effectively with a different approach: digging deeper into individual workloads and applications rather than treating each one as a whole when making storage decisions.


Intelligent storage tiering

What are you truly trying to achieve? Every company wants to reduce storage costs, but to get real value from any kind of storage tiering, you need some level of analysis of the data itself. That analysis is what makes it possible to automate moving data to the proper tiers and to get the most out of each level of storage. Forklifting data that hasn't been combed through or identified, just to squeeze more data into available space, is not cost effective. It's a false premise, and here's why.

The least expensive storage, typically cloud storage, has inherent value beyond simply holding replicated data that you no longer have to back up. If you're paying a premium for an archiving tool or a dedicated storage appliance to handle that replication, you have to weigh that cost against using cloud storage driven by policies defined for each of your data sets and applications. It's not just the volume of data; it's the actual cost of the physical storage media.

Cost savings come from placing data in storage based on knowing what your data is, not on how much volume it occupies. Consider the time and energy spent moving data around and re-allocating it purely on the basis of volume. You do not need traditional backups if you are effectively moving data according to business requirements, user needs, and accessibility for compliance or litigation purposes.
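To see why placement matters more than volume, here is a rough back-of-the-envelope sketch in Python. The per-GB monthly prices and the 10 TB breakdown are illustrative assumptions, not quotes for any particular platform:

    PRICE_PER_GB_MONTH = {
        "primary": 0.20,   # assumed fast flash/SAN tier
        "nearline": 0.05,  # assumed mid tier
        "archive": 0.01,   # assumed low-cost cloud archive tier
    }

    # Assumed 10 TB profile: only a small slice needs fast, high-access storage.
    data_gb = {"primary": 1_000, "nearline": 3_000, "archive": 6_000}

    all_on_primary = sum(data_gb.values()) * PRICE_PER_GB_MONTH["primary"]
    tiered = sum(gb * PRICE_PER_GB_MONTH[tier] for tier, gb in data_gb.items())

    print(f"Everything on primary: ${all_on_primary:,.2f} per month")  # $2,000.00
    print(f"Tiered by what it is:  ${tiered:,.2f} per month")          # $410.00

The exact numbers will differ for every environment, but the shape of the result is the point: most of the savings come from knowing which data belongs in the cheapest tier.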

Intelligent storage tiering means extracting specific information, not just blocks of data: information that needs to be retained for an extended period and then managed independently of a database application. Another example is email. Email is typically not viewed once it is more than a year old unless it must be searched for legal or compliance reasons. Old email does not need to stay in your inbox and therefore does not need to sit in an Exchange database environment. You can take this old data, which is essentially in a read-only or fixed state, and move it out of its primary application. That minimizes both the backup requirements and the day-to-day management of email. Instead, the accumulated data sits in a massive, lowest-possible-cost repository, with the capability to search and retrieve it when needed.
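As an illustration of the kind of age-based rule described above, here is a minimal Python sketch. The field names, the one-year threshold, and the "on hold" exception are assumptions for the example; this is not the interface of Exchange or of any specific archiving product:

    from datetime import datetime, timedelta, timezone

    ARCHIVE_AFTER = timedelta(days=365)  # assumed policy: archive mail older than a year

    def select_for_archive(messages, now=None):
        """Return messages the age-based policy would move to the archive tier.

        Each message is assumed to be a dict with a 'received' datetime and an
        optional 'on_hold' flag for items kept in place for legal or compliance holds.
        """
        now = now or datetime.now(timezone.utc)
        return [
            m for m in messages
            if not m.get("on_hold") and (now - m["received"]) > ARCHIVE_AFTER
        ]

    # Example: two-year-old mail is selected for the archive tier, last week's is not.
    inbox = [
        {"id": 1, "received": datetime.now(timezone.utc) - timedelta(days=730)},
        {"id": 2, "received": datetime.now(timezone.utc) - timedelta(days=7)},
    ]
    print([m["id"] for m in select_for_archive(inbox)])  # [1]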


Automating storage tiers maximizes capacity

The question now is how to automate the movement of that data to lower-cost repositories. What if you had an intelligent connector that performs the analysis and manages your data? That is the ideal way to store data efficiently and to attach policies to it so you can track and manage it. Bishop offers HubStor, an innovative SaaS solution that manages your file system data before you move it to a cloud archive. It identifies frequently touched data that requires high-access storage and flags data that is no longer relevant to the current state of the business so it can be placed in higher-capacity storage.
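To show what attaching policy to your data can look like in practice, here is a generic sketch of tiering policies expressed as data. The rule fields (data set, age threshold, target tier) are assumptions for illustration only; they are not HubStor's configuration format:

    # Tiering policies written down as data, so they can be reviewed, tracked,
    # and applied automatically. Field names and values are illustrative only.
    TIERING_POLICIES = [
        {"data_set": "file_shares", "min_age_days": 180, "target_tier": "cloud_archive"},
        {"data_set": "email",       "min_age_days": 365, "target_tier": "cloud_archive"},
        {"data_set": "databases",   "min_age_days": 730, "target_tier": "nearline"},
    ]

    def policy_for(data_set):
        """Look up the written policy that governs a given data set."""
        return next((p for p in TIERING_POLICIES if p["data_set"] == data_set), None)

    print(policy_for("email"))
    # {'data_set': 'email', 'min_age_days': 365, 'target_tier': 'cloud_archive'}

Once the policy is written down rather than applied by hand, it becomes something you can audit, adjust, and automate.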


Lowering costs through data analysis, not storage analysis

How do you put the right data in the right storage silo, and how do you do it in an automated fashion? Most organizations do not move their data between storage tiers automatically, and those that do often rely on a fairly antiquated methodology.

The most common exercise is to tier storage by putting different applications onto different storage types. An organization might put its databases on the fastest storage tier, its email on a medium-fast tier, and all its backups on high-capacity storage. Then there's the cloud: many aren't quite sure how to leverage cloud storage, so most use it for backup. That is all very high level. Organizations make the determination based on application type and storage capacity when, in reality, storage requirements should be examined within each application, breaking down the tiers at a deeper level. Applications have variable usage patterns, and within most of them, some stored data needs to be accessed far more frequently than other data in the same application.

Companies need to look within their database or email to identify what requires quick access and what does not in order to make the most of their storage space. Otherwise, if all data is pushed into a storage tier based on very high-level rules and policies, the bottom of that storage never gets touched, and you will keep buying more expensive disks to support the entire application as the database continues to grow. That's where the real cost comes into play.
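As an example of that deeper, per-item look, here is a minimal Python sketch that buckets files in a share by how recently they were accessed. The thresholds and tier names are assumptions, and last-access times are only a rough signal (some systems do not update them), so treat this as a starting point for analysis rather than a finished policy:

    import time
    from pathlib import Path

    # Assumed thresholds; what counts as hot, warm, or cold is a policy decision.
    HOT_DAYS, WARM_DAYS = 30, 180

    def bucket_by_access(root):
        """Walk a directory tree and bucket files by how recently they were read."""
        now = time.time()
        buckets = {"hot": [], "warm": [], "cold": []}
        for path in Path(root).rglob("*"):
            if not path.is_file():
                continue
            age_days = (now - path.stat().st_atime) / 86400
            if age_days <= HOT_DAYS:
                buckets["hot"].append(path)
            elif age_days <= WARM_DAYS:
                buckets["warm"].append(path)
            else:
                buckets["cold"].append(path)  # candidates for the archive tier
        return buckets

    # Example: report how much of a share is actually touched day to day.
    # buckets = bucket_by_access("/srv/file_share")
    # print({tier: len(files) for tier, files in buckets.items()})

Even a simple report like this usually shows that only a small fraction of a share is actively used, which is exactly the data that belongs on the fast, expensive tier.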

 

Leveraging SaaS, cloud platforms and Bishop services for storage efficiencies

The challenge companies face today, with continued rapid data growth and the need to keep storing that growing volume, is finding a way to identify what you have before you label it for a particular storage level. This is a critical step if data storage is to be effective and meet both storage demands and business requirements.

Bishop offers consulting and managed services to support solutions that can manage your organization’s data storage analysis and cloud archiving. 

Get a Quote!


