Jul 11, 2021

    9 Recommendations For A Successful SAP & IBM Maximo EAM Interface


    Integration is often a key component for getting the most out of your critical IT applications. Modern Enterprise Asset Management systems such as IBM Maximo are designed for integration which can expand their capabilities and allow the flexibility that the industry demands.

    Being able to share common data and make use of it across multiple applications can increase its usefulness and help plan and execute work while maintaining common references throughout. Here are BPD’s 9 Recommendations For A Successful SAP & IBM Maximo EAM Interface, followed by common challenges and the risks and mitigations involved.

     

    Approaching Integration with Maximo

    With Maximo, the Maximo Integration Framework (MIF) provides a powerful and flexible integration tool, allowing any data held or used by Maximo to be interfaced using a variety of methods, from flat files to database tables and REST API calls.  While this flexibility provides a number of integration options, it is still important to plan your integration with third-party applications thoroughly.  Below are BPD Zenith's 9 Recommendations For A Successful SAP & IBM Maximo EAM Interface, highlighting some of the key items to consider when developing interfaces.  This is followed by common challenges, as well as risks and mitigations that often arise.  Note that this list is based on our experience and is by no means exhaustive; however, it does contain the items we come across most regularly and how best to resolve them.
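As an illustration of the REST option, the sketch below composes an OSLC-style query URL of the kind the MIF REST API accepts (shown here against the MXWO work order object structure). The host name is a placeholder and object structure names may differ in your environment.

```python
# Minimal sketch: building an OSLC query URL for the Maximo Integration
# Framework REST API. The host is a placeholder; MXWO is the standard
# work order object structure, but names vary per environment.
from urllib.parse import urlencode

def build_oslc_query(host: str, object_structure: str,
                     select: str, where: str) -> str:
    """Compose a Maximo OSLC GET URL; lean=1 requests compact JSON."""
    params = urlencode({
        "oslc.select": select,   # fields to return
        "oslc.where": where,     # filter clause
        "lean": "1",
    })
    return f"{host}/maximo/oslc/os/{object_structure.lower()}?{params}"

url = build_oslc_query(
    "https://maximo.example.com", "MXWO",
    select="wonum,description,status",
    where='status="APPR"',
)
print(url)
```

In practice the URL would be issued with an HTTP client and the appropriate Maximo authentication headers; only the query construction is shown here.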

     

    BPD’s Recommendations for Integration

    1.    Keep it simple

    Often referred to as KISS (Keep it Simple, Stupid), this principle is key when designing interfaces, especially when developing interfaces between complex systems.  It is very easy to start defining a highly complex interface process with data expecting to flow constantly between systems.  Bear in mind that complex interfaces typically require more effort to support on an ongoing basis and mean future developments around the interface require more thought.

    2.    Ideally, define a master and slave system

    While not always possible, defining a ‘Master of Data’ for each data set enables clear segregation of data and processes.  Maintaining data in a single system, with other systems receiving a copy of this data (or a subset of the data if not all is required) is usually easier to maintain than trying to maintain data in two or more different systems.

    In addition, it is worth choosing the master based on where the data will be used most.  For example, company/vendor data is used most in procurement, so holding the Company Master of Data in the same system as your Procurement functions (Purchase Requisitions, Purchase Orders etc.) is a good starting point to review.
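The master/secondary pattern above can be sketched as a simple one-way copy, where the secondary system only receives the subset of fields it needs. All records and field names here are invented for illustration.

```python
# Illustrative one-way sync: the master's vendor list overwrites the
# copy held by the secondary system, carrying only the fields the
# secondary needs (no financial data). Data is invented.
master_vendors = [
    {"id": "V001", "name": "Acme Pumps", "bank_account": "12-34-56"},
    {"id": "V002", "name": "Valve Co", "bank_account": "98-76-54"},
]

FIELDS_FOR_SECONDARY = ("id", "name")  # agreed subset for the copy

def to_secondary(records, fields):
    """Project each master record down to the agreed field subset."""
    return [{k: r[k] for k in fields} for r in records]

secondary_vendors = to_secondary(master_vendors, FIELDS_FOR_SECONDARY)
print(secondary_vendors)
```

Because data only ever flows one way, any edit made directly in the secondary system is overwritten on the next run, which is exactly the segregation the master-of-data approach aims for.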

    3.    Focus on integrating what you need, not what you can

    While there can be a tendency to consider interfacing all data shared between systems, it is always worth taking a step back and considering the benefits of developing an interface for each specific data object or element.  The benefits are often linked to how frequently the data is changed or updated.  For example, both ERP and EAM/CMMS solutions may reference data such as currency codes, units of measure etc.  However, how often is a new currency released, let alone actually used?  The same applies to units of measure.  Rather than spend time and effort developing and supporting an interface that carries little traffic, it may be easier to load the data once to align the systems initially, then define a semi-automated or manual process to keep them maintained.
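The semi-automated process mentioned above can be as simple as a periodic reconciliation: compare the two systems' reference lists and flag any differences for manual review, rather than building a full interface. The currency lists below are invented examples.

```python
# Semi-automated reconciliation sketch for rarely-changing reference
# data (currency codes here): compare both systems' lists and report
# differences for a person to resolve. Example data is invented.
erp_currencies = {"GBP", "EUR", "USD"}
eam_currencies = {"GBP", "EUR", "USD", "AUD"}

only_in_erp = erp_currencies - eam_currencies
only_in_eam = eam_currencies - erp_currencies

if only_in_erp or only_in_eam:
    print("Mismatch found - review manually:",
          sorted(only_in_erp), sorted(only_in_eam))
else:
    print("Reference data aligned")
```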

    4.    How often does each interface need to run to provide the necessary business process support?

    Once you’ve reviewed what data needs to be interfaced, the next stage is to think about how often data needs to be interfaced, and at what time of day.  Real-time or frequent interfaces provide data instantly (or within a few minutes); however, if all of your interfaces run in real time and carry a high number of records and processes, you can start to increase the load on your environment.  Often (as with Maximo), interfaces are processed separately from the user interface; however, there are common elements such as the database which, if overworked, affect all users and processes.

    Consider which interfaces can run less frequently; data required only for visibility, or that changes rarely, is often a good candidate.  However, depending on the data, it is also worth having a way to override the schedule and run the interface sooner if needed.  We have previously seen vendor master data interfaced once daily in the evening, due to how infrequently the data changes.  However, when a new vendor record is urgently required on the same day it is raised, a daily interface quickly becomes a problem.  As these cases are usually infrequent, a manual override is usually a better option than increasing the standard interface frequency.

    When interfacing data with less frequency, it is also worth considering exactly when the interface runs.  Often these are running at night when the systems are used less by end-users.  This means that more system resources are available and fewer users are affected by any common elements being over-worked.
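The schedule-plus-override pattern described above can be sketched as a single entry point that both the nightly scheduler and a support user can call; the function body is a stand-in for the real transfer logic.

```python
# Sketch of a nightly interface run with a manual override. The
# run logic is a placeholder; only the trigger pattern is shown.
import datetime

def run_vendor_interface(trigger: str = "scheduled") -> str:
    """Run the vendor interface; 'trigger' records why it ran."""
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    # ... real transfer logic would go here ...
    return f"vendor interface ran at {stamp} ({trigger})"

# Nightly scheduled call (e.g. via cron or a Maximo cron task):
print(run_vendor_interface())

# Urgent on-demand call for a same-day vendor record:
print(run_vendor_interface(trigger="manual override"))
```

Keeping one code path for both triggers means the urgent run behaves identically to the scheduled one, just earlier.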

    5.    Keep data in both systems aligned and up to date

    When looking at the individual fields that are being interfaced, it is key to ensure that the data type and size of each field are aligned.  Having the master data description field hold 150 characters and the secondary hold 120 may work most of the time and potentially get through basic testing; however, it will fail once live.  For this reason, it is always worth checking the field type, the validation performed on the data, and the length of each field during design and build.

    Once live, it is also worth keeping an up-to-date document in place which highlights which fields are interfaced where, and what the agreed field size, type and validation is.  This provides a reference point when changes are made in future, ensuring data continues to flow correctly between systems.  It is very easy to make a change to one system without the interfaced systems being considered.  While some changes can be made without knock-on effects, frequently such changes cause sporadic issues (as some validation passes while other validation fails) or even a complete failure of the interface.
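If the agreed field specifications are held in a machine-readable form, the alignment check itself can be automated, catching exactly the 150-versus-120-character mismatch described above. The field specs here are invented examples.

```python
# Sketch: check that interfaced fields agree on type and length in
# both systems, per the interface design document. Specs are invented.
master_spec = {"DESCRIPTION": ("str", 150), "VENDOR_ID": ("str", 12)}
secondary_spec = {"DESCRIPTION": ("str", 120), "VENDOR_ID": ("str", 12)}

# Any field whose (type, length) differs between systems is a risk:
mismatches = [
    field
    for field, spec in master_spec.items()
    if secondary_spec.get(field) != spec
]
print(mismatches)  # DESCRIPTION: 150 characters will not fit in 120
```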

    6.    Ensure you carry out significant testing

    Thorough testing of any change is critical to identifying and resolving issues prior to moving the change into a Production environment.  This is equally true for interfacing.  When designing test cases, it is often a good idea to spend additional time on boundary testing (data on the limit of what is allowed), negative testing (aiming to break the interface using available functionality) and testing each type or route a record can take.  While this takes additional time, resolving any issues found before the interface is deployed to Production is significantly easier; not just in terms of effort, but also user experience and attitude towards the system.
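A minimal sketch of the boundary and negative testing described above: exercise a field at, on, and just beyond its agreed limit. The 120-character limit and validation stand-in are assumptions for the example.

```python
# Boundary-test sketch: exercise the description field at, below and
# beyond its agreed limit before go-live. The limit and the accepts()
# stand-in for the receiving system's validation are assumptions.
LIMIT = 120

def accepts(description: str) -> bool:
    """Stand-in for the receiving system's field validation."""
    return len(description) <= LIMIT

assert accepts("x" * 119)      # just under the limit
assert accepts("x" * 120)      # exactly on the boundary
assert not accepts("x" * 121)  # negative test: must be rejected
print("boundary tests passed")
```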

    7.    Ensure integrations are included in any future system development work

    As mentioned in Point 5 above, it is always worth keeping up to date and accurate documentation on your interface points as a reference point, which is used when designing future developments and changes.  Ideally, it is worth keeping this documentation clear, concise, known to all related developers and easy to review and reference.  This will help ensure that the documentation is reviewed and utilised.

    It is also worth ensuring that updating this documentation is added to the close-out steps of the work process.  While good documentation is important to keep systems and processes aligned after changes are made, out-of-date or inaccurate documentation can cause significant additional effort while developing, testing and fault finding.  Not to mention how frustrating it is for a developer!  Having clear, concise documentation, and maintaining it as part of the work process, helps ensure the documentation stays up to date and aligned with the actual interfaces at all times.

    8.    Collaborate for success

    When interfaces are being built, there are usually multiple parties involved: third-party and client technical specialists for each interfacing system, third-party and client consultants/Business Analysts, Business Subject Matter Experts for the specific data objects, Project Managers, stakeholders etc.  As with any development or project work, maintaining an open, honest atmosphere at all times is key to completing the work successfully.  Interface work can sometimes go differently to what was initially planned: items don’t work as expected, expectations aren’t as aligned as thought… the list goes on.  What is key is an open atmosphere where issues can be raised, discussed and acted upon appropriately.  By working together as a single team, issues are resolved quickly and the chance of overall success is increased.  Due to the number of different parties involved, this is even more critical with interface work.

    9.    Perform Root Cause Analysis on post-go-live issues

    Once the interface is live, there is often a tendency to focus on resolving the individual record, rather than the underlying cause.  This is especially true just after the interface has gone into Production and there are a high number of issues being raised.  While resolving the individual issue ensures that the specific record in question is processed and work can continue, it does not stop the issue from occurring again and again.

    Instead, it is key for analysis to be performed on the issues raised.  Reviewing not only the object where the issue is occurring but also why the record failed.  By understanding and resolving the underlying issue, you can help ensure that the same issue does not occur again in the future.
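One practical way to start this analysis is to group failed interface records by error message, so the most frequent underlying cause is tackled first rather than each record individually. The failure records below are invented examples.

```python
# Root-cause sketch: group failed interface records by error message
# so the team fixes the underlying cause, not each record one by one.
# The failure list is invented for illustration.
from collections import Counter

failed = [
    {"record": "PO-101", "error": "description exceeds 120 chars"},
    {"record": "PO-102", "error": "unknown unit of measure"},
    {"record": "PO-103", "error": "description exceeds 120 chars"},
]

by_cause = Counter(f["error"] for f in failed)
for cause, count in by_cause.most_common():
    print(f"{count} x {cause}")
```

Sorting by frequency makes it obvious which single fix (here, the description length) clears the most records at once.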

    When resolving underlying issues for one data object/attribute, it can also be worth checking if the same issue is applicable to other similar data objects.  This can help resolve unresolved issues faster and make the most of the time spent analysing the initial underlying issue.

    It is also worth considering documenting the root causes and resolutions in a knowledge base.  This can help speed up resolution time and enable learnings to be brought forward onto future interface work.

     

    Common Challenges with Interfaces


    With these 9 Recommendations For A Successful SAP & IBM Maximo EAM Interface in mind, there can be further challenges when developing interfaces, varying depending on a number of factors.  Below are the common challenges we have come across in our experience of interfaces at BPD, together with comments and suggestions on how to overcome them.

    1.    Deciding which system will be the master for specific data

    There is often a fairly lengthy initial discussion around which system should be the master for each data set.  Ultimately there is no single right or wrong answer for many of the data objects, with multiple systems more than capable of acting as the master data holder.  Due to this, we often recommend reviewing the following points:

    • In which system will the data be used most? – Having the data master in the system that will use the data most, or by the most people, is often a good approach, as it ensures the data is up to date for the main set of users
    • Which system requires the data to be most up to date and accurate? For example, inventory is often required in both the finance system and the maintenance system; however, if the data is a day out of date, it will affect maintenance users far more than finance.
    • What current business processes are already in place, and is there a big benefit to changing them? Aligning interfaces to existing business processes reduces overall change for the business and its users.  If there is a significant improvement, or the project is based on changing these processes, then change is required and is part of the project; if not, it is worth questioning why the change is being made and what the Return on Investment (ROI) is in making the change.

    2.    Selecting the transportation method

    Once required interfaces and objects are defined, together with master and secondary data sets, the next challenge is often around what transportation method will be used to interface the data.  Most enterprise solutions support a range of transportation methods including flat files, XML, database tables, REST/APIs, Web Services etc.  As many of the options could be used successfully, it is often challenging to decide which will be used.  While decisions can vary, we would often consider the following points:

    • What is currently available? – Unless very out of date and needing updating, aligning new interfaces with existing interfaces can help simplify ongoing support and ensure all interfaces follow similar protocols
    • Existing transportation middleware? – Many clients use middleware to manage complex interfaces between multiple systems. When adding new interfaces, it is always key to consider what is already in place.  Developing new interfaces that do not use this middleware makes the new interfaces more difficult to support and less reusable, often resulting in increased development and support effort. Ensure that your existing middleware has a sufficient support lifespan and upgrade path.
    • Is existing experience available? – It is always worth considering the existing skills in the support and development teams when deciding on which transportation method to use. There is little point in implementing an interface based on REST calls if no one in the support team understands them!
    • Security considerations? – While security requirements should always be taken into consideration when deciding on the transportation method, if systems are in different clouds or a combination of on-premise and cloud, security is even more critical. Solutions and protocols such as HTTPS, MPLS can be used to help keep data secure, however, it is always worth keeping all data as secure as possible to provide multiple layers of security.  While data may also be seen as low importance or low risk, it is also key to consider what happens if it is tampered with or changed in flight?  Or if anything can be attached to the message to cause problems with the interfacing systems.
    • Size/Amount of data being interfaced together – When deciding on the interface protocol, it is also often worth reviewing how many records and the size of the data being sent across at a time. Some transportation methods handle large amounts of data better than others; therefore, it is worth taking into consideration
    • Connectivity issues? – It is also worth considering the quality of the connectivity between the system locations, some transportation methods are better suited to low latency/low-quality connections, therefore it is worth taking the quality of the connection into consideration.

    3.    Disjointed data requirements

    While multiple systems can handle the same types of data objects (Purchase Requisitions, Purchase Orders, Inventory etc.), the data required to process each record can vary based on system and business processes.  Due to this, it is always worth carrying out a gap analysis on the data required by each system, rather than assuming the required fields for one system will be the same as for another.  It is also worth taking into account any validation and data translation requirements when interfacing data between systems.  While this can usually be handled with interface configuration, complex conversions often require heavy customisation, which then requires ongoing support.  Your integration will transfer the data between systems, but there may be other steps required before that data can be used within the target system; consider automating these steps wherever possible to make the integration more seamless.
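The translation requirements uncovered by a gap analysis often boil down to two maps: field-name renames and code-value conversions. The sketch below shows both for a vendor record; the SAP-style field names (LIFNR, NAME1) are used illustratively and all mappings are invented.

```python
# Translation sketch: map a source record's field names and code
# values to the target system's conventions, as identified by the gap
# analysis. SAP-style names (LIFNR, NAME1) are used illustratively;
# all mappings are invented for the example.
FIELD_MAP = {"LIFNR": "vendor_id", "NAME1": "name"}   # source -> target
STATUS_MAP = {"01": "ACTIVE", "02": "HOLD"}           # code -> value

def translate(record: dict) -> dict:
    """Rename mapped fields and convert the status code."""
    out = {FIELD_MAP[k]: v for k, v in record.items() if k in FIELD_MAP}
    out["status"] = STATUS_MAP.get(record.get("STATUS"), "UNKNOWN")
    return out

src = {"LIFNR": "0000100001", "NAME1": "Acme Pumps", "STATUS": "01"}
print(translate(src))
```

Keeping the maps as plain data (rather than code) makes them easy to record in the interface document discussed in Point 5.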

     

    Risks and Mitigations


    While risks for an interface vary from project to project, below is a table of the common risks associated with developing an interface, together with methods of mitigating each risk:

    Risk: Defined interface is overly complex
    Mitigation: Review what is actually required in order to satisfy business requirements and focus on core requirements, not optional elements.

    Risk: Multiple options for Master of Data
    Mitigation: Focus on where the data will be used most and where it is most required to be up to date.

    Risk: Existing interfaces use out-of-date/unsupported methods
    Mitigation: Consider using a new transportation method, with a plan to migrate the existing interfaces at a later date.

    Risk: Interface development costs are higher than expected
    Mitigation: Consider reusing existing interfaces or simplifying requirements.

    Risk: A high number of issues after go-live
    Mitigation: Test thoroughly prior to go-live using a range of testing types (negative, load, all-path etc.). Also focus on underlying issues rather than ‘firefighting’ individual records.

    Risk: The interface method could soon be out of date or unsupported
    Mitigation: Ideally, keep the interface data structure separate from the transportation protocol, so the transportation method can be changed while still using the existing interface structure. This allows the transport to be updated without a full interface redevelopment.

    Risk: Changes made by multiple teams for different systems cause the interfaces to fall out of alignment
    Mitigation: Maintain a clear interface document which notes all interfaces and rules included. Ensure this document is used and understood by all teams and, ideally, held in a central repository so the correct version is always used.

    Risk: Slave system data becoming misaligned
    Mitigation: Unless your interface is bidirectional, you may need to restrict access to data in the ‘slave’ system to prevent data misalignment.

    Risk: Change of master system
    Mitigation: If the integration is due to a change in the ‘master’ system, do not forget to disable or remove previous automations that may continue to function, such as approvals of PRs to POs.

     

    If BPD’s 9 Recommendations For A Successful SAP & IBM Maximo EAM Interface aren’t quite enough and you’d like to know more about recent integrations or anything else regarding Maximo, we’d love to hear from you. You can get in touch using any of the links below, and you can subscribe at the bottom of this blog for all the latest news and updates!

     

    Sign up to our free newsletter to explore emerging technologies, industry events and Maximo best practice.