Our Portfolio:

Mendix Low-Code Development

Explore the world of Low-Code application design

System Integration

Create sustainable infrastructure with our expert services

Artificial Intelligence

Use machine learning and neural networks in your projects with our help

Custom Development

Need a script or program that does exactly what you want? Our programmers will prepare just the right solution for you

Sentiment Analysis

Enrich your decision-making process with information extracted from sentiment analysis of your customers and key market players

Legacy Data Export

Our experts will extract valuable legacy data from systems such as Lotus Notes / Domino


Legacy Data Export – Extract Valuable Data from Lotus Notes / Domino

Migrating old data is essential for the digital transformation of any enterprise. There are various reasons to keep old data, such as a 10-year retention period required by legal regulations or the GDPR. Historical data remain valuable for forecasting or as input for machine learning, and they may contain information your present business still needs.

These old data are often stored in legacy systems such as Lotus Notes / IBM Domino, or exist as unstructured data in log files and various other formats. We offer to export, cleanse, and map the old data to new structures. We export data from Document Libraries, TeamRooms, and DocStores (Document Stores), and extract old Mailboxes.

Another common source of data is SQL databases behind older open-source systems such as WordPress, forums, or knowledge management systems similar to Wikipedia.

The target can be structured text in folders, a new SQL database, MongoDB, or even a REST-based API.
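As a minimal sketch of the simplest target, structured text in folders, the snippet below writes hypothetical extracted records as JSON files, one per document. The record fields and folder name are illustrative assumptions, not the actual export schema.

```python
import json
from pathlib import Path

# Hypothetical extracted records; in a real export these would come
# from the Lotus Notes / Domino source database.
records = [
    {"id": "DOC-001", "title": "Quarterly report", "body": "..."},
    {"id": "DOC-002", "title": "Meeting minutes", "body": "..."},
]

def export_to_folders(records, target_dir):
    """Write each record as a JSON file inside the target folder."""
    target = Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)
    for rec in records:
        (target / f"{rec['id']}.json").write_text(
            json.dumps(rec, ensure_ascii=False, indent=2)
        )

export_to_folders(records, "export_out")
```

The same loop could instead insert rows into a SQL or MongoDB target, or POST each record to a REST endpoint; only the sink changes.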

The regular steps to perform a proper migration are outlined below:

Data auditing

During the initial data audit, we try to understand the format of the data, how large they are, and how complex the data structures are. The level of confidentiality that must be maintained is also important. At the end of this phase we secure a representative data sample for development and testing; this can be non-critical data from the source system or synthetic data in the proper structure.

Workflow specification

In this phase we describe the steps of the migration procedure.

The migration procedure covers the following:

  • Access / Authorization / Security
  • Obtaining – Connecting to a Working Copy
  • Connecting to a Target System
  • Parsing
  • Data transformation
  • Reporting – Export Log with checksums

Testing

Before exporting a large system, a live smoke test is performed to make sure that everything goes as expected during the actual export of a larger database. Smoke tests can be performed with test data in the actual customer environment.

Workflow execution

When everything goes according to plan, the actual data export follows. For this, it is necessary to allocate a system that is frozen, so that no further changes are made to the source. A copy of the original is used for security and performance reasons.

The following steps are performed during the execution:

  • Setup of migration Environment
  • Smoke Test
  • Execution of the Data Extraction Procedure
  • Test of Extracted Data (Completeness and proper format)
  • Execution of the Data Transformation Procedure
  • Test of Clean Transformed Data (Completeness and proper format)
  • Execution of Data Import into the target system
  • IT Quality Tests of Imported data
  • Securing data from each phase on a backup medium
  • Cleanup
[Image: Export from Lotus Notes – IBM Domino 9 data export]
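The completeness tests in the steps above can be sketched as a checksum comparison between phases. The ids and payloads here are hypothetical; in practice they come from the export log of each phase.

```python
import hashlib

# Hypothetical per-phase payloads keyed by document id; in a real run
# these come from the export logs of consecutive phases.
source = {"DOC-001": b"alpha", "DOC-002": b"beta"}
target = {"DOC-001": b"alpha", "DOC-002": b"beta"}

def verify(source, target):
    """Compare document ids and SHA-256 checksums between two phases."""
    missing = sorted(set(source) - set(target))
    extra = sorted(set(target) - set(source))
    mismatched = sorted(
        doc_id for doc_id in set(source) & set(target)
        if hashlib.sha256(source[doc_id]).hexdigest()
           != hashlib.sha256(target[doc_id]).hexdigest()
    )
    return {"missing": missing, "extra": extra, "mismatched": mismatched}

report = verify(source, target)
```

An empty report means the phase is complete and unaltered; any non-empty list pinpoints exactly which documents need re-extraction.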

Handover and Support

After a successful data export, we perform an official handover to the customer. After large and complex data exports, we can provide hands-on service and help with further data cleansing and adjustments in smaller follow-up sessions. Our goals are precision, high quality of the extracted data, predictable project assessment, and overall customer satisfaction.