Business Issue: The client faced financial and operational risks when
an enterprise-wide maintenance planning system was superseded by later
releases and was about to be desupported by the vendor. The system ran on an
Oracle database that was also out of active support.
The system had been heavily customized over a long period of time to meet the
complex operational requirements of a refinery. As a result, there was no
vendor-supported upgrade path to the current version.
Additionally, several satellite applications had emerged over time, such as
timesheets, customised work-order lodgment, materials receival and issuance,
which both read from and wrote directly to the database. The satellite
applications were implemented in a wide range of technologies from the
relatively modern .NET 2.0 platform to legacy ASP, Visual Basic 6.0 and VBA.
A large number of reports also existed, implemented as complex
applications in Excel (via ADO) and MS Access.
All of the reports and satellite applications had to be upgraded to match the
new database schema and system APIs.
Broad-based system integration services: The project went through a
formal discovery process to uncover the reports and satellite applications. The
satellite applications had to be altered so that all writes to the database took
place via web service calls through MIF, the Maximo Integration Framework.
Embedded queries in reports and applications were altered to match the new
database schema.
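The rerouting of satellite-application writes through web service calls can be sketched as below. The endpoint URL, object name and payload shape are illustrative assumptions for this sketch, not the actual MIF object-structure schema:

```python
import json
import urllib.request

# Hypothetical integration endpoint -- real MIF object structure
# services use vendor-defined names and schemas.
MIF_ENDPOINT = "http://maximo.example.com/meaweb/os/MXWO"

def build_workorder_payload(wonum, description, site):
    """Assemble the fields a satellite app previously INSERTed directly."""
    return {"wonum": wonum, "description": description, "siteid": site}

def post_workorder(record, endpoint=MIF_ENDPOINT):
    """Send the record to the integration service instead of the database,
    so the vendor's business rules and validations are applied."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

The key design point is that the satellite applications keep their own UIs and logic; only the final write path changes from a direct SQL insert to a service call.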
A separate project was instigated to address the migration of data from the
old system to the new one. This involved:
- Oracle scripts to extract and transform data
- Loading data via the MIF API and via direct database insert
- Automated data-checks using record counts and field-level checksums
- Manual data-checks involving screen comparisons of old and new systems
- Discovery and documentation of business rules
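The automated data-checks above can be sketched as follows, combining record counts with per-row, field-level checksums. Keying each row by its first column is an assumption for illustration; the real checks would key on each table's primary key:

```python
import hashlib

def row_checksum(row):
    """Field-level checksum: hash the normalised field values of one record."""
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def compare_tables(old_rows, new_rows):
    """Return old/new record counts, keys missing from the new system,
    and keys whose field values differ between the two systems."""
    old = {r[0]: row_checksum(r) for r in old_rows}
    new = {r[0]: row_checksum(r) for r in new_rows}
    missing = set(old) - set(new)
    changed = {k for k in old.keys() & new.keys() if old[k] != new[k]}
    return len(old), len(new), missing, changed
```

Count mismatches flag dropped or duplicated records quickly; the checksums then localise which records were transformed incorrectly, narrowing the scope of the manual screen comparisons.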
To minimize downtime at cutover, a strategy was adopted of pre-loading the
production system from a static snapshot. Software was built for extracting only
those records changed since the snapshot, which reduced the required load time
from 3 days down to a few hours.
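The delta-extraction step can be sketched as a filter on a last-updated timestamp. The column name and snapshot time below are illustrative assumptions; the actual audit column varies by table:

```python
from datetime import datetime

# Hypothetical timestamp of the static snapshot used to pre-load production.
SNAPSHOT_TAKEN = datetime(2024, 1, 1)

def changed_since(rows, snapshot_time):
    """Keep only records modified after the static snapshot was taken,
    assuming each record carries a last-updated timestamp column
    (here called 'changedate' for illustration)."""
    return [r for r in rows if r["changedate"] > snapshot_time]
```

Only this small changed-record set needs to be loaded during the cutover window; the bulk of the data is already in place from the snapshot pre-load.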
Tips And Tricks: Some of the project issues encountered and solved
would be common to many upgrade and data migration projects. Issues included:
- Vendor API performance and stability - the project had to throttle the data
throughput to keep the dataload API within its capacity limits
- Platform performance and stability - the data-loading process consumed
considerable system resources (open cursors, virtual machine memory)
- Database performance - disk performance had to be carefully analysed and
tablespaces reorganized to properly balance database I/O.
- Data dictionary / data mapping quality and change management processes - a
dedicated data dictionary owner would be a good idea for a large project. Apart
from maintenance of field mappings, there is a more subtle class of change that
may occur when record or field names don't change but the semantics of the data
do.
- Tracking and governance of changing business and data-validation rules
- Data cleansing - as business rules became tighter in the new version, many
records from the old system now failed data-integrity constraints.
- Vendor API suitability for legacy data - the API enforced business rules
that were appropriate for daily operational purposes, but not for loading of
legacy data (adding properties to decommissioned locations or inactive
companies, and so on).
- Application UIs impacted by changes in reference data - e.g. due to the
increased number of choices in the new system, elements would scroll off the
page or become inaccessible.
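The throttling mentioned in the first point can be sketched as a minimal rate limiter around the dataload API calls; the rate value is hypothetical and would be tuned against the API's observed capacity:

```python
import time

class Throttle:
    """Minimal rate limiter: cap calls to the vendor dataload API at
    max_per_second so the load stays within the API's capacity limits."""

    def __init__(self, max_per_second):
        self.min_interval = 1.0 / max_per_second
        self._last = 0.0

    def wait(self):
        """Block until at least min_interval has elapsed since the last call."""
        now = time.monotonic()
        delay = self.min_interval - (now - self._last)
        if delay > 0:
            time.sleep(delay)
        self._last = time.monotonic()

# Usage sketch: call throttle.wait() before each dataload API request.
# throttle = Throttle(max_per_second=5)
# for record in records:
#     throttle.wait()
#     post_to_dataload_api(record)
```

A fixed-interval limiter like this is the simplest option; a production load would typically also back off on API errors and monitor resource consumption (open cursors, virtual machine memory) on the platform side.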
Benefits: On successful completion of the upgrade with minimal
business impact, the client now has a well-supported and more secure platform.
The data in the system is also more consistent, and database access is more
controlled, since all writes now pass through the vendor's integration
framework.