Who hasn’t experienced first-hand how poor communication drives up costs and reduces the quality of a project at the same time? Nobody wants that, which is why most project leaders today are very aware of the importance of communication within their projects.
To help keep your communication straightforward in a migration project, let’s take a closer look at the mapping specification, a crucial document in such projects.
When a new system is introduced, or new areas are added to an existing one, a data migration is usually needed to bring the data from the old system or other areas into the new one. During such data migrations a migration tool is often used to move the data from the source into the target system. The migration tool transforms the data to be compliant with the new system and imports it.
In such a scenario these questions generally come up: How do we validate the data migration? Do we need to validate the tool itself? Do we need to validate all details of all rules and functionalities that are available in the tool and could, in theory, be used?
The implementation of a standardized DMS like D2 or the Dell EMC Documentum Life Sciences Solution Suite requires migrating documents from the old system to the new one, or transforming the data model within the system, in order to work properly with the new application. Often this legacy data is not fully standardized and structured: it is based either on a less controlled system such as Documentum Webtop, or on a controlled system with slightly different structures such as CSC FirstDoc or Cara.
As both an EMC Business Partner and an OpenText Technology Partner, we are excited about this development. Our product migration-center enables EMC Documentum and OpenText customers to choose the best of both worlds and migrate their content to either platform without risk!
OpenText announced that it has entered into a definitive agreement to acquire Dell EMC’s Enterprise Content Division (ECD), including Documentum [http://goo.gl/tRA5LL].
What does this mean for OpenText and EMC Documentum?
Primarily, customers will have an even stronger partner to accelerate their digital transformation in order to remain competitive and to stay in business. “We are at the beginning of the Digital revolution where extreme connectivity, automation, and computing are converging,” said OpenText CEO and CTO Mark J. Barrenechea. “This acquisition further strengthens OpenText as a leader in Enterprise Information Management, enabling customers to capture their Digital future and transform into information-based business.”
Out of curiosity to explore the new features of D2 4.6, I upgraded one of our D2 4.5 projects a few weeks ago. In this blog post I explain the new features, the upgrade, as well as our experiences with the new version of D2.
So, apart from all the technical details, which new aspects will benefit the end users? The new release introduces a couple of valuable features.
Document management is no longer greenfield terrain. By now many companies have already determined their need for a document management system and understand the benefits it offers them, especially in the Life Sciences. For this reason, nearly every company dealing with mission-critical documents has implemented some sort of solution in recent years.
The question that is now more likely to arise is whether the performance and user-friendliness of the current system are still adequate and capable of meeting growing requirements. It could also be that other providers have since developed a newer, more attractive solution.
1 | MIGRATION VS. DECOMMISSIONING
IBM Domino has been around for decades. Mid-sized to very large companies have built hundreds or even thousands of applications around IBM Domino to support their business processes; many of those applications are business critical, e.g. because they store information that is subject to legal requirements.
But the IT world keeps changing. New technologies arise nearly every day, and companies need to review their IT strategy continuously in order not to fall behind. Regarding IBM’s Domino offering, there are quite a few reasons why companies are reassessing their investment in that technology. Many of them have already decided, or are in the process of deciding, to move away from IBM Domino. They are looking at Enterprise Content Management (ECM) systems like Alfresco, EMC Documentum, OpenText, SharePoint etc. to replace the applications they have built on IBM Domino technology over the last 15 to 20 years. Another option is to archive the information from these applications in an enterprise archive like EMC’s InfoArchive platform. And, no surprise, this is not an easy job: IBM Domino, as well as the applications built on top of it, has unique features that the target platform might not support and that might be difficult to implement, even with huge customizing efforts.
Maintaining a silo with millions or even billions of documents is a huge challenge. At some point the need might arise to make changes to the system configuration, metadata and object model, or even to the physical environment, either in terms of hardware or through software version upgrades.
Usually this does not seem to be a very big deal, since there are plenty of migration tools and products out there specifically designed for such a task. The obvious way is to put the corresponding system in read-only mode, make the necessary changes and/or upgrades, move the documents from A to B and activate the new platform. (In fact this is not as easy as one may think, but that is another story.)
But what happens if there is no option for a large maintenance time-frame? No option for a single system-downtime at all? There may be situations where the system must be up and running for nearly 24 hours a day, seven days a week…
The good news is that there are two approaches to meeting such requirements. The first one takes a lot of time compared to the other, but makes it theoretically possible to migrate without a single second of downtime.
DIFFERENT SOLUTION APPROACHES
When the URL consists of the physical host name
Assign the old physical host name additionally to the new hardware (host) and define a new virtual host in the JSP container to handle the requests targeted at the old host name (Figure 1), or handle all requests to the old host names with a dedicated server (host). This dedicated server redirects the requests to the new web server host (Figure 2).
When the URL consists of a DNS alias (instead of a physical host name)
The DNS alias is switched from the old host to the new host (Figure 3). The DRL / link component inside the appropriate web application must be updated. The new DRL / link component can identify »old« object IDs either by the repository ID portion of the object ID or by querying the Documentum repository for the old object ID.
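To illustrate the repository ID check, here is a minimal sketch. A Documentum r_object_id consists of 16 hexadecimal characters: a 2-digit type tag, a 6-digit docbase (repository) ID and an 8-digit object number. The docbase ID used below is a made-up example, not a value from any real installation.

```python
def parse_object_id(r_object_id: str) -> dict:
    """Split a 16-character Documentum r_object_id into its parts.

    Layout (hex digits): 2 for the type tag, 6 for the docbase
    (repository) ID, 8 for the object number.
    """
    if len(r_object_id) != 16:
        raise ValueError("r_object_id must be 16 hex characters")
    return {
        "type_tag": r_object_id[:2],
        "docbase_id": int(r_object_id[2:8], 16),
        "object_number": r_object_id[8:],
    }

# Assumption: 0x010001 stands in for the docbase ID of the old
# repository whose DRLs must be redirected.
OLD_DOCBASE_ID = 0x010001

def needs_redirect(r_object_id: str) -> bool:
    """True if the DRL points at the old repository."""
    return parse_object_id(r_object_id)["docbase_id"] == OLD_DOCBASE_ID
```

In a real DRL / link component this decision would be made per request, before either serving the document directly or looking up the new location.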
Fig. 3 Redirections handled by a dedicated server
HANDLING OF COMPLEX CASES
When a single cabinet is moved from one repository into another
The redirection logic has to be handled by specific code. If the object ID is not found within the source repository, specific target repositories are queried for the old object ID.
When there are many complex cases and the cases are not homogeneous
A centralized Oracle table could hold all mappings from old object IDs to new object IDs. The redirector code queries this table to obtain the new object ID. Using the repository ID portion of the object ID, the redirector can also identify the repository holding the corresponding object.
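The lookup described above can be sketched as follows. This uses an in-memory SQLite table in place of the Oracle table purely for illustration; the table and column names, as well as the sample IDs, are assumptions, not part of any actual implementation.

```python
import sqlite3

# Stand-in for the centralized Oracle mapping table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE id_mapping ("
    "  old_object_id TEXT PRIMARY KEY,"
    "  new_object_id TEXT NOT NULL,"
    "  target_repository TEXT NOT NULL)"
)
# Hypothetical sample row: one migrated document.
conn.execute(
    "INSERT INTO id_mapping VALUES "
    "('0901000180000123', '0902000280004711', 'repo_new')"
)

def resolve(old_object_id: str):
    """Return (new_object_id, target_repository), or None if unmapped."""
    return conn.execute(
        "SELECT new_object_id, target_repository FROM id_mapping "
        "WHERE old_object_id = ?",
        (old_object_id,),
    ).fetchone()
```

The redirector would call `resolve()` for each incoming old object ID and build the new DRL from the returned repository and object ID, falling back to an error page when no mapping exists.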
fme now incorporates 17 years of project experience in the field of migrating applications, content and data. Our migration experts have developed best-practice methods and various solutions which can be applied to different system architectures in order to fix broken links. If you need further information, please contact us; we welcome any challenge!
For detailed information, see our technical white paper.
EMA is NOT an out-of-the-box product; it is a tool set (framework) that ONLY the Documentum Professional Services team uses, and it is exclusively available through EMC IIG Services. Once the engagement is over, EMA leaves with the team and cannot be used for additional migrations. Partners and customers are not able to use EMA in a project without IIG Consulting, because EMA bypasses the API (DFC).
The main use case for EMA is high-speed cloning of Oracle-based on-premise Documentum installations to MS SQL Server-based off-premise (EMC onDemand) installations. For this approach a simple dump&load is not feasible, and a cloning tool is needed. In addition, EMC addressed some other use cases at EMC World 2013, such as version upgrades (DCM to D2 Life Sciences and Webtop to D2 or xCP) and third-party migrations.
Speed vs. Business Requirements and Methodology
Cloning, i.e. a 1:1 migration of all objects in a repository without reorganization and clean-up, adds no business value; the result is just a new platform and/or version (garbage in, garbage out). With EMA, changes to business requirements cannot easily be applied during the migration (e.g. new metadata, new object types, business logic etc.). The results of the actual migration cannot be discussed with the business department before the content is imported into the target system. If needed, duplicates cannot be detected and managed. Furthermore, it is not possible to apply changes during the project with just a few mouse clicks, as you could with migration-center.