Hitachi Data Instance Manager Helps Rein in Big Data
By 2015, some estimates suggest the annual global data center IP traffic will reach 4.8 zettabytes—roughly 4 GB of data per connected user, per day—which represents a compound annual growth rate of 33 percent! (In case you're wondering, a zettabyte is a 1 followed by 21 zeros, or 10^21 bytes.)
A recent Wired magazine article, "The Big Data Snowball Effect," highlighted one of the problems with this Big Data growth:
Adding to the avalanche, businesses have to protect these applications [and their data] — by creating remote copies for disaster recovery, local disk copies for fast recovery, backups on external disk systems for longer retention, and archives on cheaper disk or tape media.
If you follow the snowball effect of a single production application, you can often find more than a hundred copies stored across all data repositories in multiple locations, so an application with a ten-terabyte database can result in a petabyte management avalanche.
This constant duplication of data is made worse by the fact that organizations often don't know who made the copies, what tools they used, how long to retain the copies, or even how the organization uses that data. Factor in the variety of possible data formats and the regulatory compliance issues attached to each piece of data, and the growth of Big Data—and the storage it demands—presents companies with some significant challenges.
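The copy-sprawl arithmetic from the Wired example above is easy to verify. A quick back-of-the-envelope sketch (the 10 TB database size and ~100 copies come straight from the article; decimal units are assumed):

```python
# Back-of-the-envelope check of the "snowball effect" example:
# a single 10 TB production database, duplicated roughly 100 times
# across DR sites, local recovery copies, backups, and archives.
db_size_tb = 10    # size of one production database, in terabytes
copies = 100       # copies scattered across all data repositories

total_tb = db_size_tb * copies   # 1,000 TB of managed data
total_pb = total_tb / 1000       # using decimal units: 1,000 TB = 1 PB

print(f"{total_tb} TB ~= {total_pb:.1f} PB to manage")
```

One modest application really does become a petabyte-scale management problem once every protection and retention copy is counted.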
One of our partners, Hitachi Data Systems, has recently announced a new solution that we think will help enterprises solve their data copy nightmare—the Hitachi Data Instance Manager (HDIM). According to a recent Hitachi press release:
Hitachi Data Instance Manager delivers the first step of the company’s data instance management vision by providing a holistic approach to solving customers’ instance management challenges – from laptops to protection, and from remote offices to cloud. With HDIM, organizations can apply the optimum information protection and recovery technologies, resulting in much more efficient and effective backup, versioning, replication, archiving, dedupe, and continuous data protection (CDP).
It looks like they'll be integrating it into their Hitachi Content Platform (HCP) in 2013, and then perhaps into their storage arrays. Stay tuned. We'll be blogging more about Hitachi news as the year unfolds.