I just spent a full day at a meeting of the Active Archive Alliance and as I was flying home it occurred to me that it’s time for data storage managers to rise up from the sleepy status quo of buying more disk arrays to address runaway data growth problems. It’s time to wake up and smell the sweet aroma of freshly made modern data tape (sort of like that new car smell if you don’t know).
Why do that, you ask? Because best practices and some undeniable facts say so. Consider the following:
Data goes through a lifecycle from hot to cold, that is to say from a period of active use to a period of inactivity. This can happen in as little as 30 days.
Inactive data should not stay on primary storage devices. It takes up space on expensive storage media, consumes more energy and adds to the backup burden.
For over five decades, CERN has used tape for its archival storage. In this Fujifilm Summit video, Vladimir Bahyl of CERN explains how they increased the capacity of their tape archive by reformatting certain types of tape cartridges at a higher density.
Previously, Storage Switzerland blogged about the merits of employing a tape storage hierarchy to cut backup storage costs. Tape media can also add value as a tier in a broader disaster recovery strategy.
As Lead Analyst George Crump explained in a recent video, applications are not all created equal when it comes to recovery time objectives, or RTOs: the amount of time it takes to get an application back up and running following an outage.
Check out George’s blog for more details and to view the video.
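The idea that applications deserve different recovery treatment can be made concrete with a simple bucketing rule. The thresholds and tier names below are hypothetical examples, not recommendations from the video; the point is only that RTO should drive which storage and recovery technology backs each application.

```python
def recovery_tier(rto_hours):
    """Bucket an application into a recovery tier by its RTO (in hours).

    Thresholds are illustrative assumptions, not industry standards.
    """
    if rto_hours <= 1:
        return "tier-1: replicated disk / near-instant failover"
    if rto_hours <= 24:
        return "tier-2: disk-based backup"
    return "tier-3: tape or archive restore"
```

An order-entry system with a one-hour RTO would land in tier 1, while a rarely touched reporting database with a multi-day RTO can be protected far more cheaply on tape.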
I often hear from customers sitting on scores of legacy tapes whose contents are unknown beyond a generic “business data” label; 99+ percent of the data is not cataloged at a granular level. As we all know too well, disaster recovery backups have morphed into unintentional data archives over the past 10 to 15 years, thanks to litigation, government regulatory investigations, and general business obligations to retain certain records. The duty to preserve has forced businesses to keep backup tapes if even one file on a tape might be under some form of preservation obligation. IT staff almost never have the equipment or human resources to perform targeted restores of data under preservation and consolidate it with other similar data, so they take the easy way out: buy more tape and retain existing tapes rather than overwrite their contents. Companies also change backup software providers, migrate to newer backup platforms, and get stuck paying maintenance and support for software and hardware they no longer use, but might one day.
Brookhaven National Laboratory (BNL) has grown from 60 PB of data archived in 2015 to 145 PB of data archived in 2018. In this Fujifilm Summit video, David Yu explains how BNL is using tape storage to cost-effectively manage this data growth. In addition, BNL uses an active archive system to provide easy access to data that is frequently needed by the BNL data center and other research institutions.
In this white paper, Brad Johns explains how “a modern tape solution that incorporates StrongLink, a small disk cache and two tape copies of all data, provides a responsive and much lower cost solution while protecting the enterprise’s valuable information.”
The vast volumes of data created daily, coupled with the opportunity to derive value from that data, are making active archives an increasingly important part of organizations’ data management game plans across the globe.
In this Q&A, Active Archive Alliance Chairman Peter Faulhaber of FUJIFILM Recording Media, U.S.A., Inc., shares his perspective on the role of active archives in managing the data deluge.
Q: What are some of the key trends driving the shift to active archive?
A: I would say the relentless rate of data growth and how to manage it. The answer lies in proper data classification and moving data to the right tier of storage at the right time. Analysts say that 60% of data becomes archival after 90 days or less. So there is a need to cost-effectively store, search for and retrieve enormous volumes of rapidly growing archival content.
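The economics behind that answer can be shown with back-of-the-envelope arithmetic. The cost-per-TB figures below are assumptions for illustration only (not vendor quotes); the 60% archival fraction comes from the analyst figure cited above.

```python
TOTAL_TB = 1000            # total data under management (example)
ARCHIVAL_FRACTION = 0.60   # ~60% of data becomes archival after 90 days
DISK_RATE = 25.0           # assumed $/TB/month for primary disk
TAPE_RATE = 2.0            # assumed $/TB/month for a tape-based archive tier

def monthly_cost(total_tb, archival_fraction, disk_rate, tape_rate, tiered):
    """Monthly storage cost, with or without moving cold data to tape."""
    if not tiered:
        return total_tb * disk_rate
    hot = total_tb * (1 - archival_fraction)
    cold = total_tb * archival_fraction
    return hot * disk_rate + cold * tape_rate

all_disk = monthly_cost(TOTAL_TB, ARCHIVAL_FRACTION, DISK_RATE, TAPE_RATE, tiered=False)
with_tape = monthly_cost(TOTAL_TB, ARCHIVAL_FRACTION, DISK_RATE, TAPE_RATE, tiered=True)
```

Under these assumed rates, keeping everything on disk costs $25,000 per month, while tiering the cold 60% to tape drops the bill to $11,200: more than half the cost eliminated simply by putting data on the right tier at the right time.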
Explosive data growth continues to be a top challenge for today’s organizations, and this growth is only going to accelerate. In fact, according to analyst firm IDC, worldwide data will grow 61% annually to reach 175 zettabytes by 2025, with as much data residing in the cloud as in data centers.
New technologies and approaches are continually being created to help address this data storage deluge. Members of the Active Archive Alliance from Fujifilm Recording Media, U.S.A., Inc., Spectra Logic, StrongBox Data and Quantum recently shared their insights into what the future looks like for active archives and data storage in 2019. Here are some of their top predictions:
In this Fujifilm Summit presentation, Molly Presley, founder of the Active Archive Alliance, explains how an active archive can provide visibility into your applications and machine-generated data with actively assigned metadata, no matter what tier it is stored on.
I had the opportunity to attend SC18 last month in Dallas. Every year the Supercomputing Conference brings together the latest in supercomputing technology and the most brilliant minds in HPC. People from all over the world and different backgrounds converged this year for the 30th Supercomputing Conference.
As you can imagine, some of the demonstrations were absolutely mind-blowing and worth sharing. For starters, power consumption in data centers is becoming more of a challenge as data rates continue to surge. Fortunately, 3M was live on the trade show floor tackling this issue by demonstrating immersion cooling for data centers, which has the potential to slash energy use and cost by up to 97%. As this technology continues to evolve, we could see huge gains in performance and in reducing environmental impacts.