Press Release: January 19, 2010
As the global economy recovers from recession, merger and acquisition activity is set to pick up (as evidenced by Kraft's recent bid for Cadbury and November's announcement that BA will merge with Iberia), ultimately requiring more businesses to integrate their systems. Additionally, many organisations delayed planned storage upgrades during 2009.
Instead, they added disk space to existing storage infrastructure as a temporary measure while waiting for the economy to recover. But with data volumes growing at a rate of 50-80% annually (International Data Corporation), more companies than ever will need to embark on a data migration project in the next year.
Moving data on a large scale is a complex task that is often underestimated by business leaders. The goal should be to move it transparently from A to B without any change to the profile of the data. For instance, we recently moved a large corporation's data centre from London to Prague without any disruption to business operations. Taking a structured approach avoids costly delays and excessive disruption to business continuity, which can range from downtime during the transfer process to staff struggling to locate files afterwards.
It is not surprising that there is currently no protocol for data migration (DM); it would be impossible to develop a standard that encompasses the full range of hardware and software. However, the industry should have some guidelines.
Currently, organisations are left to build their DM strategies as they go and address issues as they arise. Any IT manager who has ever had to face a department head who can't find a crucial file knows that this situation is far from ideal.
In the absence of an industry handbook, we suggest three steps that apply universally and should provide a general framework for every data migration project:
1. Conduct a data audit
It might be a cliché, but knowledge is power, and as with any other business project, the first step should be for senior management to develop an understanding of what the company is dealing with. In terms of data migration, this means profiling the organisation's data with a file analysis tool and asking the following questions:
What needs to be moved and why?
What is the new storage device and how will the data be read in the new environment?
What is the desired profile of the data post migration?
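As an illustrative sketch only (a stand-in for a commercial file analysis tool, not a specific product), an audit of this kind can begin with a simple script that profiles a directory tree by file type and size:

```python
import os
from collections import defaultdict

def audit_directory(root):
    """Profile a directory tree: file count and total bytes per extension.

    A minimal stand-in for a file analysis tool; a real audit would also
    capture ownership, permissions and last-access times.
    """
    profile = defaultdict(lambda: {"count": 0, "bytes": 0})
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            ext = os.path.splitext(name)[1].lower() or "(none)"
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # skip unreadable files rather than abort the audit
            profile[ext]["count"] += 1
            profile[ext]["bytes"] += size
    return dict(profile)
```

The resulting per-type totals give management a factual baseline for deciding what needs to be moved and what the data should look like after migration.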
2. Choose a data transfer method
Transferring data can be approached in a number of ways, each with its own pros and cons. When assessing each method, IT managers should base their selections on the data and the unique needs of the business: it is far easier to create a package of technologies to suit the data than to try to tailor the data to a specific method. Questions they should be asking are:
What does it copy and how?
Does it simply copy the file, or does it also copy all other attributes, metadata and associated information?
Who needs to give permission to access the information?
A decade ago migration was purely physical, with data copied to disk or tape and transported to its new environment, where it was replicated onto a new setup. Although this approach is relatively unreliable and labour-intensive, some organisations still choose it to keep costs down.
This method evolved into one using wide area links, in which data is transferred down a line, chewing up bandwidth and interrupting real-time working on both sides of the network. As a process that can take weeks, this poses a significant challenge to business continuity.
Now, more organisations use data migration software, which profiles data, replicates it to a consolidated device and then mirrors to both locations. It then creates snapshots of new volumes which can be tested in parallel before final cutover. This process is increasingly popular as it uses fast and secure technologies such as compression, deduplication and encryption.
In reality, most organisations use a combination of tools, with mission-critical data frequently being physically transferred to the new site, while large volumes of less crucial data are moved over the network using software.
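Whichever combination of methods is chosen, the goal of moving data without any change to its profile can be spot-checked after the transfer. A minimal sketch, assuming the source and destination are both reachable as local paths, compares files by cryptographic checksum:

```python
import hashlib
import os

def file_digest(path, chunk_size=1 << 20):
    """SHA-256 of a file, read in chunks so large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_migration(source_root, dest_root):
    """Return the relative paths that are missing or differ at the destination."""
    problems = []
    for dirpath, _dirs, files in os.walk(source_root):
        for name in files:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, source_root)
            dst = os.path.join(dest_root, rel)
            if not os.path.exists(dst) or file_digest(src) != file_digest(dst):
                problems.append(rel)
    return problems
```

An empty result list means every source file arrived intact; in practice this check would be run on a sample or alongside the migration software's own verification.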
3. Decide on an internal network migration strategy
Once the data transfer has been completed, most organisations will re-evaluate how data is stored internally and opt for a cost-effective tiered storage system. In tiering, different categories of data are assigned to different storage media based on factors such as frequency of use or the level of security required. This is an effective way to reduce total storage costs, as more expensive storage systems are reserved for Tier One data, with less critical files saved on cheaper, high-capacity drives. It ensures that organisations make the most of the storage space available to them, but it does require a strategy for moving new data between the different storage environments.
The company needs to do two things: classify data according to access criteria, and decide how much of the migration across the internal network will be manual and how much automated. The transfer itself could then involve host-based applications (including data backup tools) or migration between virtual machines on different physical servers.
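As an illustration of the classification step, a simple access-based rule might assign tiers by days since a file was last read. The thresholds below are arbitrary assumptions for the sketch, not recommendations:

```python
import os
import time

def assign_tier(path, now=None, hot_days=30, warm_days=180):
    """Classify a file into a storage tier by days since last access.

    Thresholds are illustrative; a real policy would also weigh the
    security classification and business criticality of the data.
    """
    now = time.time() if now is None else now
    age_days = (now - os.stat(path).st_atime) / 86400
    if age_days <= hot_days:
        return "tier1"   # fast, expensive storage
    if age_days <= warm_days:
        return "tier2"   # mid-range storage
    return "tier3"       # cheap, high-capacity storage
```

A rule of this shape is what lets the ongoing, automated part of the internal migration run without manual intervention.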
While organisations don't have access to a step-by-step guide to transferring data, if they come prepared and take time to understand their data, they are in a much better position to avoid many of the pitfalls commonly associated with data migration.
Zycko is a value-add distributor of best-in-class convergent IT infrastructure solutions through a channel of resellers, systems integrators and service providers.
Zycko is privately held and has been profitable since inception in 2000, when the company's original charter was to market data networking accessories to resellers as a wholesale distributor. Zycko now employs over 250 staff, serving over 3000 resellers around the world from twelve offices on four continents. The company enjoys an annual turnover of more than $180m.
Zycko's provision of best-in-class IT products and logistics management is supported by true value-add professional services - such as pre-sales expertise, technical support, custom configuration, an industry-leading accredited training program, and in-house marketing support. These vital services and support enable our customers to quickly deliver profits and invest in new market opportunities, allowing them to differentiate in a crowded market. Zycko is the channel partner of choice.
Zycko's strategic partner base includes world-class companies such as Avago, Asigra, Eaton Powerware, Exagrid, Force10, Hitachi Data Systems, Huawei Symantec, Intransa, JDSU, Isilon, Lifesize, Powerdsine, ProLabs, Spectra Logic, Riverbed, USystems and Xsigo.