

SOA World Special: What Makes One Data Migration Work Where Another Fails?

Flexibility Is Key to a Successful Data Migration

Data migration has rapidly risen from distant acquaintance to the CIO's very best friend. In this article, Celona Technologies' Paul Hollingsworth explains why data migration is the new hot topic of 2008 and examines what can be done to ensure migrations are delivered on time, to budget and aligned to business need.

Data migration is almost universally dreaded by IT professionals undertaking complex, application-level consolidation or renewal projects. A recent survey Celona conducted amongst telecoms IT professionals showed that 93% of them were fearful of application-level migration. This might seem a surprisingly high number, but unfortunately there are sound reasons for this level of unease. According to Bloor Research, over 80% of data migration projects are not delivered on time or to budget. Bloor's Phil Howard explains that research has revealed that Forbes 2,000 companies already spend at least $5 billion per year on migrations, and yet cost overruns average 30% and time overruns average 41%.

You might ask how this is possible. The answer is largely that data migration is still not regarded as a valued skill and practice in its own right, but is treated as the final hurdle in a complex project - an afterthought once the functionality has been developed. At a recent British Computer Society (BCS) meeting, BT's CIO Phil Dance bemoaned this fact. "Data migration is the single biggest thing that kills you," he warned. "You spend lots of money and effort on building functionality, but then you don't spend the effort on moving the data across…and now moving data isn't a once-in-a-career event, it's continuous. And it's even worse than that because now [we're a 24x7 company] we have no time to do it in."

So what makes one migration work where another fails to get out of the box? What ensures one comes in on budget, while another is massively overspent? Well, a successful migration is founded on getting three things right: people, process and technology. Getting any one of these wrong will destabilise the entire project. That doesn't sound too difficult, does it? Get the right experience, select a good method and then develop or buy a tool to deliver the data for you. But large migrations are extremely complex on a variety of levels, and their sheer scale soon leads to challenges.

Let's consider one of these three key success factors - migration technology. Analysing different migration projects reveals five main types of data migration, as shown in Figure 1. However, it is unlikely that any one of these will entirely fit the requirements of a complex project. In other words, the chances are that any programme of complex transformation will require a range of approaches to be delivered: a single project may move through a number of approaches over time, or even combine approaches in parallel.

For example, an enterprise might initially decide to go with a 'Don't Migrate' approach in order to get a new customer service up and running without any delays. Some information may then be synchronised with existing systems (e.g., revenues written back to the old accounts receivable system). However, following the launch and trial with new customers, the business may need existing customers who take up the new service to be migrated with their old service information on an event-driven basis. Then, after the new systems have stabilised, the business might decide that incremental or even bulk-load strategies should be used to migrate more customers to the new systems, while at the same time continuing to migrate individual customers when they order a new service.
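To make the event-driven step above concrete, here is a minimal sketch in Python. It uses in-memory SQLite stores and invented table names purely for illustration - it is not any real Celona tooling - but it shows the essential shape: a single customer is copied from the legacy store to the target only when a triggering business event (a new order) occurs.

```python
import sqlite3

# Hypothetical legacy and target stores; real systems would be separate databases.
legacy = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

legacy.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, plan TEXT)")
legacy.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                   [(1, "Alice", "legacy-basic"), (2, "Bob", "legacy-premium")])
target.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, plan TEXT)")

def on_new_order(customer_id):
    """Event-driven migration: move one customer when they order the new service."""
    row = legacy.execute("SELECT id, name, plan FROM customers WHERE id = ?",
                         (customer_id,)).fetchone()
    if row is not None:
        target.execute("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)", row)
        target.commit()

on_new_order(1)  # only Alice is migrated; Bob stays on the legacy system
```

The point of the sketch is that the unit of migration is a single customer, driven by a business event, rather than a scheduled bulk job - which is why this approach can coexist with a still-live legacy system.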

Figure 1 - The five main types of data migration, Celona Technologies 2008

As we have seen, a complex project may require multiple data migration approaches to be used at different times or in combination, in order to deliver the results needed by the business. However, data migration tools - whether proprietary or third party - have tended to be built to deliver a particular type of data migration. For example, although they have been used for other types of migration, extract-transform-load (ETL) tools are designed to bulk load data warehouses. Since most proprietary and third-party tools are designed specifically to handle a migration in a certain way, they are often not able to accommodate business changes that impact the project, or handle the scenario where the business requires more than one data migration approach to be used. Delivering a complex application migration requires flexibility in the data migration tool used, as it is not always possible to predict how things are going to change during the course of the project, or exactly which approaches will be needed at the start of the project.
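To see why ETL is a bulk-load pattern rather than a general migration pattern, consider this minimal sketch (illustrative table and column names, in-memory SQLite standing in for a real warehouse): all rows are extracted, transformed, and loaded in one pass, rather than moved record by record on demand.

```python
import sqlite3

# Illustrative source system and warehouse; schemas are invented for this sketch.
source = sqlite3.connect(":memory:")
warehouse = sqlite3.connect(":memory:")

source.execute("CREATE TABLE orders (id INTEGER, amount_pence INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1250), (2, 3400)])
warehouse.execute("CREATE TABLE fact_orders (id INTEGER, amount_gbp REAL)")

# Extract everything, transform (pence -> pounds), then bulk load in one pass:
# the pattern ETL tools are optimised for, as opposed to event-driven or
# incremental migration of individual records.
rows = source.execute("SELECT id, amount_pence FROM orders").fetchall()
transformed = [(order_id, pence / 100.0) for order_id, pence in rows]
warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?)", transformed)
warehouse.commit()
```

A tool built around this whole-dataset pass has no natural place to hang per-customer, event-driven moves - which is the mismatch the paragraph above describes.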

In response to the demand for flexible, multi-purpose tools, a new generation of data migration tools is emerging, as shown in Figure 2.

Figure 2 - Third-generation data migration tools are more flexible

Progressive migration tools are among the third generation of third-party tools coming onto the market. They employ a flexible strategy for application-level data migration, enabling different migration approaches to be used during the course of the project, or in combination, as needed. The flexibility inherent in progressive migration tools means they support both project and business change, and deliver a lower-risk, faster and business-driven migration. Key benefits of progressive migration include:

  • flexibility - migration approaches are not cemented at the start of the programme and the tool is flexible enough to cope with change
  • business-driven - the business remains in control, deciding which data is moved and when
  • lower risk - progressive migration minimises the risk of failure
  • faster time-to-value - progressive migration supports multiple migration approaches, which means value from the target systems can be seen earlier
  • accelerated return on investment (ROI) - new revenues are delivered earlier and legacy systems consolidated faster, while project risk, costly overruns and rework costs are reduced (by minimising errors and focusing effort on business priorities), and a single tool can support all migration approaches required by the business.
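The flexibility described above can be pictured as a pluggable strategy: the migration approach becomes a parameter chosen per phase or per customer, rather than being fixed at the outset. The function names below are purely illustrative sketches of the pattern, not any vendor's actual API.

```python
# Each strategy decides which customers move in a given run.
def migrate_bulk(customers):
    return list(customers)                      # move everyone at once

def migrate_incremental(customers, batch_size=2):
    return list(customers)[:batch_size]         # move the next small batch

def migrate_event_driven(customers, triggered_ids=()):
    return [c for c in customers if c in triggered_ids]  # move only on events

def run_migration(customers, strategy, **kwargs):
    """The tool stays the same; only the strategy (and its parameters) changes."""
    return strategy(customers, **kwargs)

customers = ["alice", "bob", "carol"]
# The business can switch approach mid-programme without changing the tooling:
moved = run_migration(customers, migrate_event_driven, triggered_ids={"bob"})
```

Because the approach is just an argument, a programme can start event-driven, switch to incremental batches once the new systems stabilise, and finish with a bulk run - the sequence described earlier in this article.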
Progressive migration delivers against one of the three pillars of a successful migration, but it is also essential that the other two pillars - people and process - are not neglected. In the next article in this series, Celona's CTO Tony Sceales and data migration guru Johny Morris will look at other aspects of achieving a successful migration by exploring its golden rules.

More Stories By Paul Hollingsworth

Paul Hollingsworth is Director of Product Marketing at third-generation migration specialists Celona Technologies (www.celona.com).

