Sometimes it is not the leading-edge technologies that cause you headaches, but solid requirements like synchronizing your document objects' attributes with SAP.
In this blog post I will explain the differences between and purposes of OpenText Documentum Archive Services for SAP and OpenText Documentum Content Services for SAP, as well as the challenge of synchronizing only modified SAP data into OpenText Documentum.
OpenText Documentum Archive Services for SAP
The main purpose of the OpenText Documentum Archive Services for SAP (ASSAP) is to accept content (e.g. the printable bill) delivered by SAP. For this, ASSAP exposes itself as an ArchiveLink server. Via the ArchiveLink protocol, SAP is not only able to archive content but also to retrieve that content for display purposes; such content can be, for example, billing documents. So SAP is the active part, and OpenText Documentum is the passive part. ASSAP creates the link information from SAP archive maintenance data.
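ArchiveLink is essentially an HTTP-based interface: SAP issues requests carrying parameters such as the content repository ID (contRep), the document ID (docId), and the protocol version (pVersion). As a rough illustration of what an ArchiveLink-style "get" call from SAP to the archive server looks like, here is a minimal sketch; the endpoint URL and repository ID are hypothetical, and real deployments add further parameters such as signed URLs:

```python
# Hypothetical ArchiveLink endpoint exposed by the archive server (ASSAP).
ARCHIVE_URL = "http://archive.example.com:8080/archive"

def build_get_url(cont_rep: str, doc_id: str, p_version: str = "0046") -> str:
    """Build an ArchiveLink-style 'get' URL as SAP would issue it.

    cont_rep  -- two-character content repository ID configured in SAP
    doc_id    -- unique document ID assigned when the content was archived
    p_version -- ArchiveLink protocol version
    """
    # 'get' is the ArchiveLink command to retrieve the archived document.
    return f"{ARCHIVE_URL}?get&contRep={cont_rep}&docId={doc_id}&pVersion={p_version}"

print(build_get_url("A1", "0A1B2C3D4E5F6071"))
```

The same URL scheme, with a different command keyword, is used when SAP archives new content, which is why the archive server can stay passive: it only ever answers requests.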
The continuously growing number of contracts and their precise handling is a constant challenge to many organizations. Therefore, fme developed an OpenText Documentum D2 based Contract Management Framework. With this framework, clients can efficiently manage their contracts and ensure that they are accurately recorded and audited to meet compliance guidelines. But what’s behind all this? Let’s take a closer look!
OpenText Documentum D2 – a solid backbone
OpenText Documentum D2 provides a configurable and adaptable foundation with which the contract management solution can easily be tailored to customer needs.
Main functions of the fme D2 Contract Management Framework
The solution contains all basic settings for the setup of contract management documents and processes: a set of document types with attributes, lifecycles and workflows, permission control and search and reporting functionality.
Additionally, it contains a specific clause library functionality for composing contracts from already reviewed and internally approved text blocks, which are organized as part documents and serve as contract template parts. This reduces the risk of inconsistency and ensures organizational compliance.
The typical migration scenario with migration-center is to read – or scan, as we call it – documents and their metadata from the source system (1), do necessary transformations on those documents in order to fit the target system (2), and finally write – or import – them into the target system (3).
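The three steps above can be sketched as a simple pipeline. This is not migration-center's actual API, just an illustration of the scan/transform/import pattern; the attribute-renaming rule and the in-memory "systems" are hypothetical stand-ins:

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """A document with its metadata, as read from the source system."""
    content_path: str
    metadata: dict = field(default_factory=dict)

def scan(source: list[dict]) -> list[Document]:
    """Step 1: read documents and their metadata from the source system."""
    return [Document(item["path"], dict(item["meta"])) for item in source]

def transform(docs: list[Document]) -> list[Document]:
    """Step 2: transform metadata to fit the target system's data model."""
    for doc in docs:
        # Hypothetical rule: rename a source attribute to its target name.
        if "doc_title" in doc.metadata:
            doc.metadata["title"] = doc.metadata.pop("doc_title")
    return docs

def import_docs(docs: list[Document], target: list) -> None:
    """Step 3: write the transformed documents into the target system."""
    target.extend(docs)

source = [{"path": "/src/a.pdf", "meta": {"doc_title": "Invoice 42"}}]
target: list = []
import_docs(transform(scan(source)), target)
print(target[0].metadata["title"])  # -> Invoice 42
```

Keeping the three steps separate is what makes the pattern robust: transformations can be validated and re-run against the scanned data without touching either system.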
Migration is an ongoing IT topic, and there are many good reasons for that: switching platforms or architectures, e.g. when changing an operating system or database, virtualization, and moving to the cloud are just some of them. In addition, the integration or merger of systems and applications can make content migration relevant, as cost efficiency is a big deal in these cases.
Are you currently facing a migration project? Perhaps a Documentum migration in particular? Have your colleagues seriously advised you to handle Documentum object IDs very carefully? A common example: published links in the intranet or in e-mails that reference important documents through their object IDs. Have you been asked to keep these documents' object IDs, and thereby all their links, working?
In this blog post I want to share some thoughts, experiences and solutions on migration regarding Documentum object ID concerns with you.
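To see why object IDs are so delicate in migrations, it helps to recall their structure: a Documentum r_object_id is 16 hex characters, with a type tag, the repository (docbase) ID, and an object number baked in. Since the target repository normally has a different docbase ID and assigns object numbers itself, an ID cannot simply be carried over. A small sketch of that decomposition (the sample ID is made up):

```python
def parse_object_id(object_id: str) -> dict:
    """Split a Documentum r_object_id into its parts.

    A Documentum object ID is 16 hex characters:
      - 2 chars: type tag (e.g. '09' for dm_document)
      - 6 chars: docbase (repository) ID
      - 8 chars: object number, unique within that repository
    """
    if len(object_id) != 16 or any(c not in "0123456789abcdef"
                                   for c in object_id.lower()):
        raise ValueError(f"not a valid object ID: {object_id!r}")
    return {
        "type_tag": object_id[:2],
        "docbase_id": object_id[2:8],
        "object_number": object_id[8:],
    }

# Hypothetical dm_document ('09' tag) in a repository with docbase ID 00007b.
parts = parse_object_id("0900007b80001234")
print(parts["docbase_id"])  # -> 00007b
```

Any published link built on such an ID therefore encodes the source repository itself, which is exactly why those links break after a naive migration.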
Despite the ongoing discussion about the prospects of new web technologies, progressive clients, or even the need for an entirely new user experience, many enterprises are still quite contentedly using the "good old Documentum Webtop". In general, Webtop applications integrate smoothly with other systems that have emerged over time, such as Jive, Jira, and CRM systems.
Nevertheless, let's be honest: there have been some flaws, and one of them has always been the UCF content transfer mechanism, which can be hard to maintain, especially in complex network settings, and which causes an annoying dependency on the client's Java Runtime Environment. With modern browser vendors reducing plugin support and Oracle's decision to deprecate the Java browser plugin in Java 9, a new content transfer mechanism was long overdue.
(German version below)
For many years now we have seen a trend away from physical hardware and towards virtualization. The reasons are diverse: costs need to be reduced, activities should be automated, and downtime decreased. For a long time, virtualization was synonymous with replicating complete servers including the operating system. But since the launch of Docker in 2013, so-called "container-based virtualization" has been finding its way into companies' IT infrastructures.
This trend does not stop at ECM software either: in November 2016, Dell EMC (in the meantime acquired by OpenText) released Documentum Content Server 7.3, the first version with Docker support.
Imagine, you have a set of user stories for a new enterprise content management (ECM) application. And you have an existing ECM platform up and running or you have selected a new one, but you are uncertain about the platform’s future. You will have to invest money for custom development to make users happy, and you want the solution to be supported for the next decade.
You have two options: stay with your current ECM platform or select a new one. But the decision process takes time – or you give it a try and test a new platform within your project. There are a lot of pros and cons and no general recommendation. It depends on your situation and the requirements for the new application.
The implementation of a standardized DMS like D2 or the Dell EMC Documentum Life Sciences Solution Suite requires migrating documents from the old system to the new one, or transforming the data model within the system, in order to work properly with the new application. Often this legacy data is not fully standardized and structured, but is based either on a less controlled system such as Documentum Webtop or on a controlled system with slightly different structures such as CSC FirstDoc or Cara.
Interesting announcement from Alfresco. They started an aggressive swap-out program against Documentum, pushing the fear that Documentum is going away or going south soon. First of all, Alfresco is a really good ECM platform – fme has been an Alfresco partner for many years. There might be good reasons for one company or another to decide to change their ECM platform within the next weeks or months, but it should have nothing to do with the OpenText acquisition. Fear has always been a bad advisor. Documentum is not going away any time soon – or why do you think OpenText paid 1.6 billion USD? They have to keep the maintenance money flowing for quite some years to make the deal work.
Pretty much every "ECM expert" is giving us guidance and advice these days on the recent OpenText acquisition. fme group has been working in the ECM industry for roughly 18 years. We do consulting around platforms like Documentum (ECD), SharePoint and Alfresco. I will refer to the "Dell EMC ECD" business as Documentum – I never liked the acronyms that were used for the Documentum brand, anyway ;-). We have been partnering with Documentum for 17 years. Our focus industries are Life Sciences and Industrial Manufacturing. Across all industries we are well known for our migration expertise: we move content from pretty much every major ECM system to pretty much every other major ECM system, including OpenText and Documentum (see our products director's thoughts on this). fme group has seen quite a bit of the ECM universe. Here are my 2 cents on the recent OpenText acquisition – written after a quick two-day, two-night assessment, so have mercy on me 😉