
Time to manage data, not storage


Ever since IT systems began churning out bits and bytes, businesses have looked for the best ways to house data. But big data, which has forced businesses to view data as hidden gems of information, has turned the world of storage provisioning upside down.

According to the Enterprise Strategy Group (ESG) white paper Optimizing Data Management Through Efficient Storage Infrastructures, authored by Senior Analyst Mark Peters and commissioned by IBM, big data is creating a “data conundrum”: users expect ever more data to be collected through new sources and methods, but the infrastructure needed to retain that data is “not getting cheaper at the same rate as more data is being collected.”

This has prompted many businesses to rethink their approach: “It is about data specifically rather than storage generically,” the white paper said.

Why rethink?
There are many reasons why a rethink is needed. Here are three highlighted in the white paper:

1. Underutilized hardware
IBM research showed that companies purchase 24% additional capacity per year, yet often use less than half of the capacity they already own. This points to widespread underutilization and a lack of tiering and automation features.

2. Budgets are maintenance driven
IT budgets are too often consumed by the upkeep of existing infrastructure. Leading companies tend to do the reverse, focusing their spending on new technology projects that unlock business value from data.

3. Rising complexity
Many storage infrastructures are built up through sporadic, unplanned acquisitions, which increases complexity and management costs. ESG noted that this may have been done purposely, “because [the IT organizations] know that if they purchase something else or try to integrate a heterogeneous mix, they will also be buying themselves months of tedious operational problems.”

IBM’s alternative approach
IBM sees the future in fully separating data from physical storage through a virtualization layer, an approach often referred to as software-defined storage. Determining where data is placed, automatically and non-disruptively, improves the economics of storage while sharpening competitiveness through better insights from data.
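
The kind of automated placement described here can be pictured with a short sketch. The example below is purely illustrative and not IBM code: it assumes hypothetical data extents with measured access counts and ranks the hottest ones onto a limited flash tier, leaving the rest on disk, which is the basic idea behind automated tiering features.

```python
from dataclasses import dataclass


@dataclass
class Extent:
    """A hypothetical chunk of data tracked by the virtualization layer."""
    name: str
    size_gb: int
    io_per_day: int  # measured access frequency


def plan_placement(extents, flash_capacity_gb):
    """Place the most frequently accessed extents on flash until it fills up;
    everything else stays on the HDD tier. Returns {extent name: tier}."""
    placement = {}
    remaining_flash = flash_capacity_gb
    # Hottest data first: rank by I/O density (accesses per GB per day).
    for ext in sorted(extents, key=lambda e: e.io_per_day / e.size_gb, reverse=True):
        if ext.size_gb <= remaining_flash:
            placement[ext.name] = "flash"
            remaining_flash -= ext.size_gb
        else:
            placement[ext.name] = "hdd"
    return placement


if __name__ == "__main__":
    volumes = [
        Extent("sales-db", 200, 90_000),
        Extent("web-logs", 500, 12_000),
        Extent("archive", 2_000, 300),
    ]
    for name, tier in plan_placement(volumes, flash_capacity_gb=400).items():
        print(f"{name}: {tier}")
```

In a real system the virtualization layer re-evaluates these statistics continuously and migrates data without disruption; the point of the sketch is simply that placement becomes a data-level decision rather than a box-level one.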

To enable this, IBM is making storage virtualization the foundation of its entire storage line, which includes:

•    Storwize V7000: Offers greater performance for big data and analytics workloads, reduces acquisition costs by as much as half, simplifies management, reuses existing assets, and offers seamless scalability.
•    SAN Volume Controller (SVC): Dramatically reduces or eliminates storage acquisitions, virtualizes larger configurations for greater consolidation advantages, uses two HDD tiers for more performance/cost options, and supports larger storage and server configurations for operational cost savings.
•    IBM FlashSystem: Makes flash and data virtualization more economical for midmarket and growth markets while lowering entry points, and increases connectivity and tiering options.
•    XIV Storage System: Offers greater enablement and economics for service providers, and enhances security for public, private, and hybrid clouds.
•    IBM DS8870: Boosts performance of database applications by up to 3.2x compared with the prior system, while Easy Tier updates provide new flash options that give the flexibility to meet current and future storage needs.

Original Author: IBM Hong Kong