Who hasn’t experienced first-hand how poor communication drives up costs and lowers the quality of a project at the same time? Nobody wants that, and most project leaders today are therefore well aware of how important communication is within their projects.
To help keep your communication straightforward in a migration project, let’s take a closer look at the mapping specification, a crucial document in migration projects.
Last week I attended the tech conference of Amazon Web Services – AWS re:Invent 2017 in Las Vegas. It lasted five days, a period of time that is not always easy to take away from your daily work. Below are the most important takeaways from my perspective, in 7–10 minutes of reading.
* 10-Second Management Spoiler *
Serverless, machine learning, the machine-learning camera DeepLens, Alexa for Business, and Kubernetes as a managed service are the main highlights of this year’s re:Invent. By extending existing, established services such as EC2, S3, Glacier, and DynamoDB and making them more flexible, AWS helps customers cover many requirements directly in the managed service and reduces the need for workaround implementations. It will be fascinating, and at times frightening, to see what the combination of these powerful services will make possible in the future.
This is the fourth post of my series “Insights into the development of migration-center 4”. In the last post I showed you how to configure and run a scanner in order to read all the desired files and their metadata from the source system. In today’s post I will show you how to organize your scan runs into so-called migration sets. All further processing in migration-center, i.e. transformation, validation, and import, is based on a migration set. Defining a migration set is therefore an important step in the whole migration workflow.
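The idea of grouping scan runs into a migration set can be sketched roughly as follows. This is a hypothetical model for illustration only; none of the class or method names come from the actual migration-center API:

```python
# Hypothetical sketch: a migration set collects documents from scan runs.
# All names here are invented for illustration, not real migration-center code.

class MigrationSet:
    def __init__(self, name):
        self.name = name
        self.items = []  # documents selected from one or more scan runs

    def add_scan_run(self, scan_run):
        # Transformation, validation, and import later operate on this set,
        # not on the individual scan runs.
        self.items.extend(scan_run)

# Two scan runs, each a list of (document id, metadata) pairs:
run_a = [("doc-1", {"type": "invoice"}), ("doc-2", {"type": "contract"})]
run_b = [("doc-3", {"type": "invoice"})]

migset = MigrationSet("Invoices 2017")
migset.add_scan_run(run_a)
migset.add_scan_run(run_b)
print(len(migset.items))  # → 3
```

The point of the grouping is that every later step has one well-defined unit of work to operate on, regardless of how many scan runs contributed documents.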
When you press the Organize icon in the main navigation bar, the client shows you the list of existing migration sets. You can use the icons in the command bar above the list to create, edit, copy or delete a migration set.
IBM Notes has been around for many years – yet many companies still use it. Some rely heavily on it, as it supports many different processes: it can be used as an email tool for communication, as a database with or without documents, and even as a feature-rich application for (critical) business processes and workflows.
Notes is flexible and supports all kinds of different and custom scenarios.
However, there are several reasons to replace it with other platforms, especially when it comes to document management. I will not discuss them in detail because you probably know them already, which is why you are reading this right now.
I want to talk about the solution for moving to other platforms, and I want to dispel doubts about losing information during the migration.
The typical migration scenario with migration-center is to read – or scan, as we call it – documents and their metadata from the source system (1), apply the necessary transformations to those documents so that they fit the target system (2), and finally write – or import – them into the target system (3).
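The three steps above can be sketched as a simple pipeline. This is a minimal illustration with made-up function names and metadata, not the actual product code:

```python
# Minimal sketch of the scan -> transform -> import workflow.
# All function names and fields are illustrative, not real migration-center calls.

def scan(source):
    """(1) Read documents and their metadata from the source system."""
    return [{"name": d, "path": f"/src/{d}"} for d in source]

def transform(docs):
    """(2) Adjust metadata so it fits the target system's model."""
    for doc in docs:
        doc["target_path"] = doc["path"].replace("/src/", "/target/")
    return docs

def import_into_target(docs, target):
    """(3) Write the transformed documents into the target system."""
    target.extend(docs)

source_files = ["report.pdf", "notes.txt"]
target_repo = []
import_into_target(transform(scan(source_files)), target_repo)
print(target_repo[0]["target_path"])  # → /target/report.pdf
```

The essential property of this workflow is that transformation happens between reading and writing, so the source system is never modified and the target only ever receives documents that already fit its model.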
Migration is an ongoing IT topic, and there are many good reasons for that: switching platforms or architectures (e.g. when changing an operating system or database), virtualization, and moving to the cloud are some of them. In addition, the integration or merger of systems and applications can make content migration relevant, as cost efficiency is a big deal in these cases.
Are you currently facing a migration project? Especially a Documentum migration task? Have your colleagues given you serious advice to handle Documentum object IDs very carefully? A common example: published links on the intranet or in e-mails that reference important documents through object IDs. Have you been asked to keep these documents’ object IDs and all their links working?
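To see why preserved object IDs matter, consider a rough sketch of checking published links after a migration. This is purely illustrative: the URL pattern and helper function are assumptions, and the 16-character hex IDs only loosely resemble real Documentum r_object_id values:

```python
# Illustrative check: which published links would break if object IDs change?
# The link format and ID pattern are simplified assumptions for this sketch.
import re

published_links = [
    "http://intranet.example.com/view?id=0900000180001a2b",
    "http://intranet.example.com/view?id=0900000180009f10",
]

# Object IDs that survived the migration unchanged (assumed input):
migrated_ids = {"0900000180001a2b"}

def broken_links(links, surviving_ids):
    """Return every link whose referenced object ID no longer exists."""
    broken = []
    for link in links:
        match = re.search(r"id=([0-9a-f]{16})", link)
        if match and match.group(1) not in surviving_ids:
            broken.append(link)
    return broken

print(broken_links(published_links, migrated_ids))
# → ['http://intranet.example.com/view?id=0900000180009f10']
```

If the migration does not preserve object IDs, every published reference of this kind has to be found and rewritten – which is exactly the effort that careful ID handling avoids.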
In this blog post I want to share some thoughts, experiences, and solutions regarding Documentum object ID concerns in migrations.
Today we are back with a new issue of our blog series on the development of our product migration-center 4. In this post I would like to focus on the “Analyze” part and again show you some screenshots of our new migration-center client.
As I have described in the first article of this series, the new user interface emphasizes our step-by-step migration approach of Analyze, Organize, Transform and Validate, and finally Import:
EMA is NOT an out-of-the-box product; it is a tool set (framework) used ONLY by the Documentum Professional Services Team. This tool is exclusively available through EMC IIG Services. Once the engagement is over, EMA leaves with the team and cannot be used for additional migrations. Partners and customers are not able to use EMA in a project without IIG Consulting, because EMA bypasses the API (DFC).
The main use case for EMA is the high-speed cloning of Oracle-based on-premises Documentum installations to MS SQL Server-based off-premises (EMC onDemand) installations. For this scenario a simple dump & load is not feasible, and a cloning tool is needed. In addition, EMC addressed other use cases at EMC World 2013, such as version upgrades (DCM to D2 Life Sciences and Webtop to D2 or xCP) and third-party migrations.
Speed vs. Business Requirements and Methodology
Cloning, i.e. a 1:1 migration of all objects in a repository without reorganization and clean-up, has no additional business value; the result is just a new platform and/or version (garbage in, garbage out). With EMA, changes to business requirements cannot easily be applied during the migration (e.g. new metadata, new object types, business logic, etc.). The results of the actual migration cannot be discussed with the business department before the content is imported into the target system. Duplicates cannot be detected and managed if needed. And furthermore, it is not possible to apply changes during the project with just a few mouse clicks, as you can with migration-center.
I am currently looking back at the client-server architectures I have dealt with in my more than 25 years as a software developer. In my mind’s eye I see something like a pendulum swinging back and forth between client and server.
During my studies in the 1980s, mainframes/hosts with 3270 terminals were the state of the art. The pendulum thus rested on the server side.
When PCs became ever more powerful in the 1990s thanks to the Intel 486 and spread widely due to falling prices, the pendulum swung to the client side. Applications were typically built as fat clients.
But then came the zoo of Windows client platforms in use around the turn of the millennium: 95, 98, ME, 2000, NT with various service packs. Client applications were very fragile… keyword: DLL hell. The pendulum swung back to the server, and web applications in the browser were the way to go… until the many page reloads triggered the search for something new again.
How do I make my DMS legally compliant?
Laws and regulations require documents (documentation) to be retained according to specific rules. Managing directors and board members are personally and jointly liable for violations. Added to this is the company’s own interest in storing certain business documents and corporate knowledge in an organized way and in targeted compliance with legal requirements. Legally compliant documentation management means effectively controlling the secure retention and the controlled destruction of documents and documentation over their entire life cycle. In this way, the technology enables the controlled handling of corporate information from the collaboration phase through the document management phase and the records management phase to the archive phase. Documents that are no longer needed can thus be destroyed in a controlled manner.
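The retention-and-destruction part of this life cycle can be illustrated with a small calculation. The document classes and retention periods below are invented for the sketch; real values depend on the applicable laws and regulations:

```python
# Illustrative retention check: may a document be destroyed today?
# The retention periods per document class are assumptions for this example,
# not legal advice.
from datetime import date

RETENTION_YEARS = {"invoice": 10, "business_letter": 6}  # assumed values

def destruction_allowed(doc_class, created, today):
    """A document may only be destroyed once its retention period has expired."""
    years = RETENTION_YEARS[doc_class]
    expiry = created.replace(year=created.year + years)
    return today >= expiry

print(destruction_allowed("invoice", date(2005, 3, 1), date(2017, 1, 1)))  # → True
print(destruction_allowed("invoice", date(2010, 3, 1), date(2017, 1, 1)))  # → False
```

A DMS that manages such rules per document class can both block premature destruction and flag documents whose retention period has expired, which is exactly the controlled handling described above.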