While backup remains an active use case for tape due to its value for fast site restores and protection against cybercrime, tape's future growth opportunities lie in many new and emerging areas. With the Internet, cloud, big data, compliance and IoT waves promising unprecedented data growth, the timing for advanced tape functionality couldn't be better.
In this Storage Switzerland video, Tab Butler, Senior Director of MLB’s Media Management and Post Production talks to storage analyst George Crump about how the MLB Network manages all of its data and why the organization views tape as a vital component of the process:
Have you ever watched a TV show featuring some dangerous activity, accompanied by the warning "Do not attempt this yourself"? It's usually good advice, and it reminds me of an experience I had recently.
I had the pleasure of co-presenting at "The Reel Thing," a part of the Association of Moving Image Archivists Conference in Portland, Oregon, with Steve Kochak from Digital Preservations Laboratories. During my part of our presentation, I explained to the audience how an LTO cartridge is made, detailing each of the different components in a cartridge and their functions.
The magnetic tape data storage industry has withstood numerous challenges from its own past performance, from the HDD industry, and mainly from those who are simply uninformed about the major transformation the tape industry has delivered. Early experiences with non-mainframe tape technologies were troublesome and turned many data centers away from tape in favor of HDDs, even though mainframe tape technology was more robust. As a result, many data centers still perceive tape as mired in its legacy past. However, this view is completely out of date.
In this new white paper, Fred Moore, president of Horison Information Strategies, explains why it’s time to take advantage of the many benefits tape can bring to your storage infrastructure.
In a recent Storage Switzerland blog, Lead Analyst George Crump talks about how, because IT is perpetually working to lower both capital and operating expenses associated with backup storage infrastructure, backup workloads are common targets for migration to the cloud. However, this is not necessarily the most effective strategy for optimizing cost efficiencies.
In this video, he talks with IT consultant Brad Johns about why IT organizations should holistically evaluate the total cost of ownership (TCO) of their backup storage infrastructure, as opposed to focusing solely on immediate costs such as upfront infrastructure acquisition.
I just spent a full day at a meeting of the Active Archive Alliance and as I was flying home it occurred to me that it’s time for data storage managers to rise up from the sleepy status quo of buying more disk arrays to address runaway data growth problems. It’s time to wake up and smell the sweet aroma of freshly made modern data tape (sort of like that new car smell if you don’t know).
Why do that, you ask? Because best practices and undeniable facts say so. Consider the following:
Data goes through a lifecycle from hot to cold, that is to say from a period of active use to a period of inactivity. This can happen in 30 days or less.
Inactive data should not stay on primary storage devices. It takes up space on expensive storage media, consumes more energy and adds to the backup burden.
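The lifecycle policy described above can be sketched in a few lines of code. The following is a minimal, hypothetical Python example (the 30-day threshold and the use of last-access time as the "cold" signal are illustrative assumptions, not a prescribed implementation); a production tiering tool would also weigh file size, type, and retention policy before migrating anything off primary storage.

```python
import time
from pathlib import Path

# Assumed threshold: data untouched for 30 days is considered "cold".
INACTIVITY_THRESHOLD_SECONDS = 30 * 24 * 60 * 60


def find_cold_files(root, now=None):
    """Return paths under `root` that have not been accessed within the threshold.

    Uses last-access time (st_atime) as a simple proxy for inactivity.
    These are the candidates to migrate from primary storage to an
    archival tier such as tape.
    """
    now = time.time() if now is None else now
    cold = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            age = now - path.stat().st_atime
            if age > INACTIVITY_THRESHOLD_SECONDS:
                cold.append(str(path))
    return cold
```

A scheduler could run a scan like this nightly and hand the resulting list to whatever archive mover the environment uses, keeping expensive primary media reserved for active data.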
For over five decades, CERN has used tape for its archival storage. In this Fujifilm Summit video, Vladimir Bahyl of CERN explains how they increased the capacity of their tape archive by reformatting certain types of tape cartridges at a higher density.
Previously, Storage Switzerland blogged about the merits of employing a tape storage hierarchy to cut backup storage costs. Tape media can also add value as a tier in a broader disaster recovery strategy.
As Lead Analyst George Crump explained in a recent video, applications are not all created equal when it comes to recovery time objectives (RTOs), the amount of time it takes to get an application back up and running following an outage.
Check out George’s blog for more details and to view the video:
I often hear from customers that are sitting on scores of legacy tapes whose contents are unknown beyond a generic "business data" level, with 99+ percent of the contents unidentified at a granular level. As we all know too well, disaster recovery backups have morphed into unintentional data archives over the past 10 to 15 years, thanks to litigation and government regulatory investigations, along with general business obligations to retain certain records. The duty to preserve has forced businesses to keep backup tapes if even one file on a tape might fall under some form of preservation obligation. The IT staff almost never has the equipment or human resources to perform targeted restores of data under preservation and consolidate it with other similar data, so they take the easy way out: buy more tape and retain existing tapes rather than overwriting their contents. Companies change backup software providers, migrate to newer backup platforms, and get stuck paying maintenance and support for software and hardware they no longer use but might need one day.
Brookhaven National Labs (BNL) has grown from 60 PB of data archived in 2015 to 145 PB of data archived in 2018. In this Fujifilm Summit video, David Yu explains how BNL is using tape storage to cost-effectively manage this data growth. In addition, BNL uses an active archive system to provide easy access to data that is frequently needed by the BNL data center and other research institutions.