In this video, Brendan Sullivan, Fred Moore and Chris Dale discuss the IT environments of the past 30 years that have produced mountains of unstructured data backed up or archived across a plethora of tape and data formats, and explain why the vaulting strategies created at the time do not address the legacy data issues of today.
Watch the full video here.
Modern tape libraries are part of an overall data management lifecycle strategy, offering many benefits including lower cost, energy savings, increased security and long-term shelf life.
We’re excited to partner with Spectra Logic and Iron Mountain on this new Storage Switzerland eBook: Reintroducing Tape to the Modern Data Center. The first chapter debunks some of the common myths of tape storage around reliability, access and operations. Read more about it here.
Stay tuned over the next few weeks as we reveal the next four chapters covering topics such as disaster recovery and backup, performance, cost, and offsite storage.
Interested in learning more on this topic? Register for our webinar, 5 Reasons Modern Data Centers Need Tape, on September 26th at 11:00 am EDT.
Ever wonder if you are getting the best deal on your data storage? Understanding the total cost of ownership (TCO) is critically important to any data storage purchase decision.
Today we introduced our new TCO Calculator, an updated version of our online tool that helps IT professionals assess and compare TCO for automated tape storage, disk-based storage, and cloud-based archive storage. The new TCO Calculator raises the maximum user storage baseline from 10PB to 100PB, integrating the IBM TS4500 enterprise library using LTO-8 drives and media for initial capacities over 10PB. Amazon S3 Glacier Deep Archive and its bulk retrieval service are now also included in cloud storage cost comparisons.
After entering data into the TCO Calculator, users can download a customizable results report which includes an executive summary, key cost assumptions, and TCO by cost category and type (e.g., energy costs, offsite costs, service fees, labor, bandwidth, etc.).
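The cost categories the calculator reports can be illustrated with a minimal sketch. This is not Fujifilm's actual cost model; the function and all dollar figures below are hypothetical assumptions, shown only to make the capex-plus-opex structure of a TCO comparison concrete.

```python
# Minimal TCO comparison sketch. All cost figures are illustrative
# assumptions, not values from the actual TCO Calculator.

def tco(capex, annual_opex, years):
    """Total cost of ownership: upfront cost plus recurring annual costs."""
    return capex + annual_opex * years

# Hypothetical per-petabyte assumptions over a 10-year horizon.
# Annual opex would bundle energy, offsite, service, labor and bandwidth.
scenarios = {
    "tape":  {"capex": 120_000, "annual_opex": 5_000},
    "disk":  {"capex": 250_000, "annual_opex": 40_000},
    "cloud": {"capex": 0,       "annual_opex": 48_000},
}

for name, costs in scenarios.items():
    total = tco(costs["capex"], costs["annual_opex"], years=10)
    print(f"{name:>5}: ${total:,} per PB over 10 years")
```

Even with made-up numbers, the pattern the calculator surfaces is visible: tape's higher share of cost is upfront, while disk and cloud accumulate recurring costs that dominate over long retention periods.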
Find out how you can start saving on your data storage costs now. Access the free TCO Calculator here.
By Carrie Monaco
Sometimes it seems like the green movement fades in and out of focus for many organizations. But at FUJIFILM Recording Media U.S.A., Inc. (FRMU), one of our primary goals is to enhance the quality of our environment and the community in which we work and live. Our sustainability initiatives play an important role in that effort through our reduction in CO2 emissions, our use of recycled and environmentally friendly materials in our own production, and our requirement that our suppliers do the same.
The solar panel installation project at our Bedford, Massachusetts manufacturing facility began in response to a FUJIFILM corporate mission of energy conservation and greenhouse gas reduction to address issues of climate change. With 1,870 solar modules, our solar installation has produced 2,977,000 kWh since its inception in November 2013. That is the equivalent amount of energy used by 4,666 homes during an entire month. It is also equivalent to a reduction of 1,787 metric tons of carbon dioxide per year. Over 20 years, this would equal the carbon sequestered by 7,280 acres of U.S. forest in one year.
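The equivalences above can be cross-checked against one another. The short sketch below simply back-solves the conversion factors implied by the figures in the paragraph (per-home monthly usage, CO2 per kWh, sequestration per forest acre); the factors themselves are derived from this document's numbers, not independently sourced.

```python
# Back-solving the conversion factors implied by the solar figures above.
kwh_total = 2_977_000  # total generation since November 2013
homes     = 4_666      # stated home-month equivalent
tons_co2  = 1_787      # stated metric tons of CO2 avoided
acres     = 7_280      # stated acres of U.S. forest (one year)

print(f"implied usage per home-month: {kwh_total / homes:.0f} kWh")
print(f"implied emissions factor:     {tons_co2 / kwh_total:.6f} t CO2/kWh")
print(f"implied sequestration rate:   {tons_co2 / acres:.3f} t CO2/acre-year")
```

The three ratios are mutually consistent, which is what we'd expect if all three equivalences were computed from the same generation total.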
Archival data is piling up faster than ever as organizations are quickly learning the value of analyzing vast amounts of previously untapped digital data. Industry studies consistently find that the vast majority of all digital data is rarely, if ever, accessed again after it is stored. However, this is changing now with the emergence of big data analytics made possible by Machine Learning (ML) and Artificial Intelligence (AI) tools that bring data back to life and tap its enormous value for improved efficiency and competitive advantage.
The need to securely store, search for, retrieve and analyze massive volumes of archival content is fueling new and more effective advancements in archive solutions. These trends are further compounded as an increasing number of businesses are approaching hyperscale levels with significant archival capacity requirements.
Relentless digital data growth is inevitable as data has become critical to all aspects of human life over the course of the past 30 years. Newly created worldwide digital data is expected to grow at 30% or more annually through 2025, mandating the emergence of an ever smarter and more secure long-term storage infrastructure. Data retention requirements vary widely, but archival data is rapidly piling up. Digital archiving is now a required discipline to comply with government regulations for storing financial, customer, legal and patient information. Most data typically reaches archival status in 90 days or less, and archival data is accumulating at over 50% compounded annually.
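A 50% compound annual growth rate is easy to underestimate. The sketch below works through the arithmetic; the 1PB starting archive is a hypothetical assumption, chosen only to show the shape of the curve.

```python
# Compound growth of an archive at the ~50% annual rate cited above.
# The 1 PB starting size is an illustrative assumption.

def grow(initial_pb, annual_rate, years):
    """Size after compounding: initial * (1 + rate)^years."""
    return initial_pb * (1 + annual_rate) ** years

start_pb = 1.0
for years in (1, 3, 5, 10):
    print(f"year {years:>2}: {grow(start_pb, 0.50, years):6.1f} PB")
```

At 50% compounded annually, the archive roughly doubles every 21 months; a 1PB archive today becomes nearly 58PB in a decade, which is why capacity planning for archival tiers differs so sharply from planning for production storage.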
Many data types are being stored indefinitely in anticipation that their potential value might eventually be unlocked. Industry surveys indicate nearly 60% of businesses plan to retain data in some digital format for 50 years or more, and a growing amount of archival data will never be modified or deleted. For most organizations, facing terabytes, petabytes and even exabytes of archive data for the first time can force the redesign of their entire storage strategy and infrastructure. As businesses, governments, societies, and individuals worldwide increase their dependence on data, archiving and data preservation become a critical practice.
It’s time to develop your game plan! Check out this white paper from Horison Information Strategies to learn more.
The rise of big data, business analytics and stricter compliance regulations necessitate that enterprises retain far more data than ever before. Against this backdrop, disk and cloud-based backup implementations are scaling quickly. In fact, many enterprises are dealing with secondary storage repositories that are ten times larger than their production storage repositories.
Especially for large enterprises with diverse data stores, tiered backup strategies are critical to meeting recovery service level agreements (SLAs) without breaking the bank; not all data is mission-critical to recover immediately in the event of an incident, and such data can live on less expensive storage media.
As discussed in the video below, tape storage media is often perceived as a viable fit only for archive and long-term retention use cases today, but it can also play an important role in the backup hierarchy.
View the video here.
Read the full Storage Switzerland blog here.
While backup remains an active use case for tape due to its value for fast site restores and anti-cybercrime, tape’s future growth opportunities lie in many new and emerging areas. With the Internet, cloud, big data, compliance and IoT waves promising unprecedented data growth, the timing for advanced tape functionality couldn’t be better.
Check out this new white paper from Horison Information Strategies to learn how the tape renaissance is ushering in the era of modern tape.
The selection of data storage technologies has never been more robust. Today's choices range from ultra-high-capacity, low-cost storage at one end of the hierarchy to very high levels of performance and functionality at the other. These choices define the unique levels or tiers of today's storage hierarchy.
The foundations of tiered storage were laid over 30 years ago, when disk, automated tape libraries and advanced policy-based data management software (HSM) combined to effectively migrate less-active data to less-expensive storage devices. At the highest level, tiered storage refers to an infrastructure capable of optimally aligning storage systems with application requirements and their required service levels. The business case for implementing tiered storage is compelling and becomes increasingly so as the storage pools get larger. Tiered storage integrates hardware and storage management software into a seamless operation that lets customers realize the substantial TCO and ROI benefits available today.
A tiered storage environment consists of two or more kinds of storage technologies, delineated by differences in four primary attributes: price, performance, capacity and functionality.
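The HSM-style policy migration described above can be sketched in a few lines. This is not any vendor's actual HSM implementation; the tier names and age thresholds below are illustrative assumptions, shown only to make the idea of policy-driven placement concrete.

```python
# Minimal sketch of HSM-style policy tiering: data is assigned to a tier
# based on how recently it was accessed. Tier names and age thresholds
# are illustrative assumptions.

from datetime import datetime, timedelta

# Policy checked in order: more recently accessed data lands on faster,
# more expensive tiers; everything older falls through to tape.
POLICY = [
    (timedelta(days=30),  "flash"),
    (timedelta(days=90),  "disk"),
    (timedelta(days=365), "object"),
]
DEFAULT_TIER = "tape"

def assign_tier(last_access: datetime, now: datetime) -> str:
    """Return the storage tier for data with the given last-access time."""
    age = now - last_access
    for max_age, tier in POLICY:
        if age <= max_age:
            return tier
    return DEFAULT_TIER

now = datetime(2024, 1, 1)
print(assign_tier(datetime(2023, 12, 20), now))  # accessed recently
print(assign_tier(datetime(2022, 1, 1), now))    # cold data
```

Real HSM software layers migration scheduling, transparent recall and metadata stubs on top of a placement rule like this, but the core trade-off is the same: each step down the policy trades performance for price per terabyte.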
Check out this white paper from Horison Information Strategies to learn more about these different storage tiers and how your organization can more cost-effectively store its data based on various policy requirements.