How can you improve the rate of success when migrating data to a new system or environment?
Engage the business
Data migration is not solely an IT problem. Active participation and direct involvement from business stakeholders are vital to the success of a data migration project. Everyone is fully committed to their day jobs, and even though it will be painful, the organization must make these essential resources available during the migration process.
Failure to make business resources available for the current state and to-be discussions will lead to an IT-centric project that involves guesswork and assumptions. Most of these assumptions will remain undetected until very late in the project when the data fails to load correctly or when functional testing fails.
Ongoing involvement from business executives and managers will be required to approve the data remediation that addresses data quality issues. Business approval will also be necessary to handle any discrepancies in how the new application or software manages specific business processes and the underlying data, including financial data. Data profiling will provide visibility into these remediation needs.
Assign dedicated resources to the data migration team
Don’t think about the data migration project as a one-off event involving shifting data from one system or platform to another.
Instead, think of setting up a data migration team responsible for the delivery of an end-to-end, repeatable process flow from source to target with an associated methodology and purpose-built data quality engine at the core.
Start working on data profiling and quality months before the implementation project starts.
The best approach to uncovering risk early and fully understanding your data is through data profiling. The data profiling results can be used to engage stakeholders in aspects such as:
- Discussions about the actual state of the data.
- Root cause analysis that can point to business process flaws that create data issues.
- Discovering and understanding, based on the data, discrepancies between the documented and actual states of the environment.
- Engaging in discussions about how “good” the data needs to be, including prioritization of mandatory vs. “nice to have” requirements.
- Understanding the level of effort required to make the data “fit for purpose.”

Without these types of discussions and associated analyses, any estimate of the time and effort needed to migrate the data successfully is simply guesswork.
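As a minimal sketch of what a first profiling pass can surface, the snippet below computes row counts, per-column missing values, and duplicate keys over a small in-memory extract. The table, column names, and sample rows are hypothetical, and a real project would run this against full source extracts:

```python
from collections import Counter

# Hypothetical extract of a "customers" source table, one dict per row.
rows = [
    {"customer_id": "C001", "name": "Acme Corp", "country": "US"},
    {"customer_id": "C002", "name": "",          "country": "US"},
    {"customer_id": "C002", "name": "Beta Ltd",  "country": None},
    {"customer_id": "C003", "name": "Gamma LLC", "country": "DE"},
]

def profile(rows, key_field):
    """Return basic data-quality metrics: row count, per-column
    missing-value counts, and duplicated key values."""
    missing = Counter()
    keys = Counter()
    for row in rows:
        keys[row[key_field]] += 1
        for col, val in row.items():
            if val is None or val == "":
                missing[col] += 1
    duplicates = [k for k, n in keys.items() if n > 1]
    return {"rows": len(rows), "missing": dict(missing),
            "duplicate_keys": duplicates}

report = profile(rows, "customer_id")
print(report)
```

Even a report this simple gives stakeholders something concrete to react to: the duplicate `customer_id` and the missing values become agenda items rather than late-stage surprises.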
Iterate frequently and adapt to changes
Each iteration executed by the data migration team reduces risk and improves the final output quality through enhanced visibility into the data and the introduction of additional cleansing rules per iteration. Hundreds of iterations will be required during the migration process. The team members (and executives) will need to remember that change will be the only constant during the migration process.
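One way to keep the process repeatable across iterations is to model cleansing as an ordered list of rules that is re-applied end-to-end from the source on every run, growing as profiling uncovers new issues. A minimal sketch, with hypothetical rule names and data:

```python
# Each iteration may add rules, but every run re-applies the full rule
# set from the source data, so the pipeline stays repeatable.
def clean(records, rules):
    for rule in rules:
        records = [rule(r) for r in records]
    return records

def strip_whitespace(r):
    return {k: v.strip() if isinstance(v, str) else v for k, v in r.items()}

def normalize_country(r):
    aliases = {"USA": "US", "U.S.": "US"}
    r = dict(r)
    r["country"] = aliases.get(r.get("country"), r.get("country"))
    return r

records = [{"name": " Acme ", "country": "USA"}]

# Iteration 1: whitespace only; profiling then reveals country aliases.
rules = [strip_whitespace]
records_v1 = clean(records, rules)

# Iteration 2: the rule set grows; the pipeline re-runs from the source.
rules.append(normalize_country)
records_v2 = clean(records, rules)
print(records_v2)  # → [{'name': 'Acme', 'country': 'US'}]
```

Because each run starts from the source again, a fresh extract can be pushed through the accumulated rules at any time, which is what makes frequent iteration cheap.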
Factors to consider when formulating a data migration strategy
The more carefully your company and executives plan the data migration, the less likely you are to encounter surprise costs, unplanned downtime, and, ultimately, project failure, and the less likely your end-users are to be frustrated or inconvenienced during and after the migration. Establish goals, a budget, and a timeline, and brainstorm with your technical and business teams to anticipate the challenges you may encounter.
There are five principal determinants you should consider when determining how you’ll approach the data migration project:
Type of application
Specialized applications such as complex financial, CRM, ERP, or accounting software require more in-depth planning than applications that are non-essential to the core operations of the business. Any change to your current environment needs careful planning, but some applications are more critical than others. In some cases, data can be moved with software tools specific to the type of data being migrated; these tools are usually provided by the software vendor or by third-party companies.
There are multiple things to consider when migrating data, but one of the most important is the go-live strategy. You can transfer data in stages while keeping the source and target systems running in parallel. Alternatively, you can plan a big-bang approach and migrate the data outside of regular business hours. The second approach involves more risk but is probably the best alternative in most projects.
Amount of data
Even systems with few records require thorough planning. When examining the amount of data, the first factor is volume, but you should also analyze how complex and intertwined the records are. For example, if you need to migrate leasing contracts, your analysis must include all the underlying transactions linked to each deal. There are cases where a team considers the volume of the system’s main entities, such as customers or vendors, but fails to account for the volume of other related entities.
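The leasing example above can be sketched in a few lines: counting only the main entity hides the true migration volume once dependent records are included. The contract IDs and transactions below are hypothetical:

```python
from collections import Counter

# Counting only the main entity (leasing contracts) understates the
# real volume, which includes every transaction linked to each contract.
contracts = ["L-001", "L-002"]
transactions = [
    {"contract": "L-001", "amount": 500},
    {"contract": "L-001", "amount": 750},
    {"contract": "L-002", "amount": 300},
]

per_contract = Counter(t["contract"] for t in transactions)
rows_to_migrate = len(contracts) + len(transactions)

print(per_contract)     # transactions dragged along by each contract
print(rows_to_migrate)  # 2 "main" records become 5 rows overall
```

In a real system the same reasoning extends across many levels of related entities, which is why the fan-out should be measured early rather than assumed.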
One aspect that is often overlooked in most projects, but becomes critical on the go-live date, is the time needed to complete the data transfer. Many factors influence it, such as server memory, optimized SQL queries, and ETL structures. The amount of data being migrated and the speed of the processing tool will determine how long the migration takes. Suppose your go-live strategy allows a 48-hour window (over a weekend) to complete the data migration. In that case, the data processing step cannot consume all of those hours, since your business and technical teams will need time to complete the project’s other tasks.
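The feasibility check behind this point is simple arithmetic: take the throughput measured in a trial run, estimate the load time, and compare it with the share of the cutover window actually reserved for data processing. All figures below are hypothetical:

```python
# Hypothetical figures from a trial migration run.
total_rows = 120_000_000   # rows to migrate
rows_per_second = 2_500    # measured throughput of the ETL pipeline
window_hours = 48          # total cutover window
processing_share = 0.5     # fraction of the window reserved for loading
                           # (the rest: validation, reconciliation, sign-off)

load_hours = total_rows / rows_per_second / 3600
budget_hours = window_hours * processing_share

print(f"Estimated load time: {load_hours:.1f} h (budget: {budget_hours:.1f} h)")
print("Fits the window" if load_hours <= budget_hours
      else "Does NOT fit; optimize or stage the load")
```

Running this estimate against every iteration’s trial load keeps the go-live plan honest: a pipeline that slows down as cleansing rules accumulate is caught months before the cutover weekend.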
Migration of Historical Data
This is probably the most common request in data migration projects. Business users ask that all current and historical data be migrated and available in the new system. While this can be a valid request from a business perspective, you might want to consider migrating only the transactions and records related to active customers. Migrating historical data adds complexity to a project that is already complex. Historical data still needs to be available to business users, but consider other solutions, such as a data warehouse or a data lake.