Pulse is an advanced, versatile Data Management Platform that can be used for the following use cases: SAP Data Migration, SAP Integration, Master Data Management (MDM), landscape BAU Data Quality Management and Data Governance.

It can connect to a wide range of enterprise systems for source and target use; however, it has particularly sophisticated, deep integration with SAP as a target system.

It has been designed to fully automate the entire Data Migration process for large SAP implementations, utilising a Build Once and Reuse approach. Through extensive automation, Pulse has proven to reduce the number of developer resources required to support deliveries by more than 50%, reducing costs and improving the effectiveness of the Data Workstream.

The key features below show how Pulse can help your organisation:

  • Automated creation of thousands of SAP Data Validation Rules without human effort
  • Automated application of these rules against legacy data (zero effort required)
  • Automation of the entire end-to-end data load process across hundreds of objects, enabling full-scope simulated migrations every night without human intervention
  • Full end-to-end migration processes can run hundreds of times before cutover, providing confidence in the cutover outcome
  • Performance to manage millions, even billions, of records automatically, with parallelism control and load balancing integral to the process
  • Real-time visibility of accurate Data Workstream metrics. No more spreadsheets
  • Elimination of Excel spreadsheet versions, which add control risk to the Data Migration process
  • Multi-Tenant Architecture - supports complex overlapping release/wave deliveries
  • Delivered with a suite of pre-created Data Load routines for SAP Standard objects
  • No SAP BAPI changes required for delivery
  • Smaller Data Workstreams required to load the same number of objects
  • Business and IT error states are transparent, refreshed in real time, owned, proactively managed and governed by SLAs and KPI benchmarks

Pulse has a suite of user-facing interactive dashboards, enabling programme and data managers to view overall status, and functional business teams to resolve issues assigned to them.

Business and analyst teams have full control of the data movement throughout the process, and are able to stop, start, filter and add data without any need for code changes.


Super Fast Data Load Creation...

A user interface is available for teams to map data sets to SAP entities. The SAP entities, together with their data model and data definitions, are downloaded automatically from SAP. Data validation rules are then generated automatically from this information - no development or coding is required.

Mapping source fields to the SAP entities is a quick task; once the mappings are saved, Pulse automatically manages the Validation, Transformation, Load and Reconciliation process based on them.

Users only need to manage the Extract process, which then plugs into the Pulse APIs.
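
As an illustration, an extract could be handed to Pulse along the following lines. This is a purely hypothetical sketch: the endpoint, host and payload shape shown here are assumptions for illustration, not documented Pulse interfaces.

```python
# Hypothetical sketch only: the Pulse API endpoint, host and payload shape
# below are assumptions, not documented Pulse interfaces.
import requests

PULSE_URL = "https://pulse.example.com/api/v1"  # placeholder host

# A legacy extract, already produced by your own extraction job.
extract = [
    {"LEGACY_ID": "V-1001", "NAME": "Acme Ltd", "COUNTRY": "GB"},
    {"LEGACY_ID": "V-1002", "NAME": "Globex",   "COUNTRY": "DE"},
]

# Hand the extract to Pulse; Validation, Transformation, Load and
# Reconciliation are then driven by the saved mappings.
resp = requests.post(f"{PULSE_URL}/extracts/vendor_master", json=extract, timeout=30)
resp.raise_for_status()
print("Extract accepted:", resp.json())
```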

Pulse is designed to cater for the vast majority of Data Migration scenarios; in exceptional situations some very complex developments may be required, and these can be easily integrated into the Pulse process.


Mapping Variants - Accelerate Data Delivery

Data Migration Objects often exist as subtypes of a common parent object, for example a Material Master can be split into the following subtypes:

  • Production Materials
    • Fluid
    • Non-Physical
    • Sub-contracted
  • Non-Production Materials
  • Aftermarket Materials

These are all material masters, just different types which will, on the whole, have a very similar set of attributes and conform to a similar data model. However, there will generally be subtle differences in how certain fields or structures need to be managed, which means each subtype cannot be treated in exactly the same way.

Let's assume there is 80% commonality across the subtypes and 20% uniqueness. Pulse has a process called Mapping Variants, which allows the analyst to create 80% of the mappings just once, as common object-level mapping rules; the analyst can then create Variant mappings/rules used only for the unique scenarios of each subtype. An example might be 'Material Viscosity is only mandatory for Fluids and not for any other type of material'.
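
The sketch below illustrates the idea of merging common rules with subtype overrides. All field names and structures are hypothetical; Pulse's internal representation is not shown here.

```python
# Hypothetical sketch of Mapping Variants: common object-level rules are
# defined once, and each subtype only overrides what differs.

# Common rules shared by every Material Master subtype (~80%).
common_rules = {
    "MATNR": {"mandatory": True, "max_length": 18},
    "MAKTX": {"mandatory": True, "max_length": 40},
    "VISCOSITY": {"mandatory": False},
}

# Variant rules: only the ~20% that differs per subtype lives here.
variant_rules = {
    "Fluid": {"VISCOSITY": {"mandatory": True}},  # viscosity mandatory for Fluids only
    "Non-Physical": {"MAKTX": {"max_length": 20}},
}

def effective_rules(subtype: str) -> dict:
    """Merge the common rules with the subtype's variant overrides."""
    merged = {f: dict(r) for f, r in common_rules.items()}
    for f, override in variant_rules.get(subtype, {}).items():
        merged.setdefault(f, {}).update(override)
    return merged

print(effective_rules("Fluid")["VISCOSITY"])  # {'mandatory': True}
```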

Benefits – the number of data mappings that have to be created and, most importantly, maintained is kept to a minimum, eliminating duplication and enabling faster delivery.


Thousands of SAP Data Validation Rules Out Of The Box

SAP comes pre-configured with a standard set of data attribute definitions, which are then enhanced or customised as part of your project delivery. Examples are:

  • Field is mandatory
  • Field is numeric (Integer)
  • Field is constrained to a set of picklist values
  • Field has a fixed length of 20 characters
  • Field has a check digit
  • Field must conform to a defined pattern
  • ....and so on

Depending on the size of your programme, there will be hundreds if not thousands of field- and object-level data validation rules that your legacy data needs to be tested against to ensure a successful Data Load.

Pulse has a solution here: using its deep integration with SAP systems, it is able to extract the SAP data attribute definitions and automatically generate data validation rules for all the SAP objects you're using (see the sketch below).
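
As a simplified illustration, rule generation from field metadata might look like the following. The metadata shape and field examples are assumptions; Pulse's actual extraction and rule model are not shown.

```python
# Sketch of deriving validation rules from SAP field metadata. The metadata
# shape is simplified and hypothetical, for illustration only.

# Example field definitions as they might be read from the SAP data dictionary.
fields = [
    {"name": "MATNR", "datatype": "CHAR", "length": 18, "mandatory": True},
    {"name": "NTGEW", "datatype": "QUAN", "length": 13, "mandatory": False},
    {"name": "MTART", "datatype": "CHAR", "length": 4, "check_table": "T134"},
]

def generate_rules(field: dict) -> list:
    """Derive simple validation rules from one field definition."""
    rules = []
    if field.get("mandatory"):
        rules.append(f"{field['name']} must not be empty")
    rules.append(f"{field['name']} must not exceed {field['length']} characters")
    if field["datatype"] == "QUAN":
        rules.append(f"{field['name']} must be numeric")
    if "check_table" in field:
        rules.append(f"{field['name']} must exist in check table {field['check_table']}")
    return rules

for f in fields:
    for rule in generate_rules(f):
        print(rule)
```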


Flexibility

Pulse can integrate directly with the majority of non-SAP systems on your landscape, and also with the full suite of SAP systems: ECC, APO, CRM, EWM, SRM, MES, S/4HANA and more.

Pulse is available as an On Premise solution or as a Cloud Service.


Reduced Coding - Faster Data Deliveries

Pulse provides an abstraction layer that sits between the raw data migration code and the user, accessible through a dashboard. Pulse has taken the majority of the standardised data migration tasks (Validation, Transformation, Load, Reconciliation), automated them, and made them configurable through a user interface.

This allows business users to create complex data load processes, then test and deploy them faster than ever before, without the need for IT involvement.

Pulse is delivered with a suite of cross-functional, cross-SAP-system, out-of-the-box data load processes to support large manufacturing plant deployments, which means that within hours of deployment your teams can be analysing and loading data to SAP.


Multi-Tenancy - Managing Data Migration for Overlapping Releases, Waves...

Global SAP programmes are not normally one-time data load activities. They normally involve loading the same underlying objects (Customers, Vendors etc.) for different markets over successive Releases or Waves; this is referred to as Multi-Tenancy. Ultimately, a data migration solution is required to support Multi-Tenancy arrangements, covering the following tasks for overlapping programme activities across Dev, QA, Test and other environments:

  • Data Analysis/Profiling
  • Data Field & Value Mapping
  • Data Error State Recording at Wave, Release & Global Level
  • Data Error State Capture & Reporting
  • Data loading & Reconciliation

Pulse allows enterprises to manage multiple parallel programmes, and Releases and Waves within programmes. This feature is extremely useful when data validation rules differ between the Global definitions and those at Wave/Release level - for example, geographically different plants, or Division consolidations from separate companies.


SuperRFC - Elevated Control of SAP Loading

Pulse has deep integration with SAP, meaning that not only can it natively connect to any standard BAPI, it can also perform the following activities without any further development:

  • Dynamically cater for customer Z-customisations to standard BAPIs without the need for additional ABAP development
  • Provide commit control to the user, allowing the user to simulate a load without committing any data (illustrated in the sketch after this list)
  • Automatically reconcile the loaded data against what was submitted, ensuring full integrity throughout the process.
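
The underlying SAP mechanism, explicit BAPI transaction commit and rollback, can be sketched with the open-source pyrfc library. This is not Pulse's implementation; the connection details and the choice of BAPI are assumptions for illustration.

```python
# Minimal sketch of BAPI-level commit control using the open-source pyrfc
# library. Not Pulse's implementation; connection details are placeholders.
from pyrfc import Connection

conn = Connection(ashost="sap-host", sysnr="00", client="100",
                  user="RFC_USER", passwd="secret")  # hypothetical credentials

def load_material(headdata: dict, commit: bool) -> bool:
    """Call a standard BAPI; commit only when requested, otherwise roll back."""
    result = conn.call("BAPI_MATERIAL_SAVEDATA", HEADDATA=headdata)
    ok = result["RETURN"]["TYPE"] not in ("E", "A")   # E/A = error/abort
    if commit and ok:
        conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")
    else:
        conn.call("BAPI_TRANSACTION_ROLLBACK")        # simulate: nothing persists
    return ok

# Simulated load: the BAPI runs and reports errors, but nothing is committed.
ok = load_material({"MATERIAL": "TEST-001", "IND_SECTOR": "M",
                    "MATL_TYPE": "FERT", "BASIC_VIEW": "X"}, commit=False)
```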

Test Load Mode - Client Contamination Control

Successful production data migrations rely on repeatedly performing full end-to-end data loads in test environments, over and over, with the aim of observing incremental improvements iteratively, testing the process against delivered CRs, defect fixes and data cleansing/enrichment activities.

One of the challenges with repeated loads is contamination of the target client. Once data has been loaded, it can’t normally be re-loaded without changes to the process. For example, if Vendor "1234-Microsoft" was previously loaded incorrectly, it cannot be re-created because it already exists.

There are various workarounds for this scenario involving changing source identifying values (e.g. 1234a-Microsoft) so the records can be treated as creates. The cons of this approach are negative impacts on source-to-target reconciliation, and on any dependent transactional object loads that need to link to the new master data records.

A cleaner approach is to perform a client refresh before each load iteration, but this comes with Basis effort, time delays and, most likely, cost impacts.

Pulse has a clean approach that minimises, if not solves, the underlying challenge. Any data load performed using Pulse can be run in 'Test Mode'. This means:

  • A full data migration of 100+ data objects can be 'loaded' to a target SAP client
  • All 'load success' results will be recorded within the Pulse Status Dashboards
  • All 'load failure' results, along with individual error states, will be written to the Error Register
  • A full, comprehensive data reconciliation report will be auto-generated
  • Not a single record will be committed to the SAP client. The client will remain clean

Simulate full data loads without committing any data to your SAP client(s) using Pulse.
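
A self-contained sketch of a test-mode cycle follows. All names and the toy validation are hypothetical; the point is that every record is processed with commit disabled, outcomes are recorded, and nothing persists.

```python
# Self-contained, hypothetical sketch of a test-mode cycle: every record is
# "loaded" with commit disabled, outcomes are recorded, nothing persists.
from dataclasses import dataclass, field

@dataclass
class Outcome:
    key: str
    errors: list = field(default_factory=list)

def simulate_load(record: dict) -> Outcome:
    """Stand-in for a commit-controlled BAPI call run with commit=False."""
    errs = [] if record.get("NTGEW", 0) > 0 else ["Material invalid weight"]
    return Outcome(record["MATNR"], errs)

records = [{"MATNR": "M-001", "NTGEW": 12.5}, {"MATNR": "M-002", "NTGEW": 0}]
status_log, error_register = [], []
for rec in records:
    out = simulate_load(rec)
    (error_register if out.errors else status_log).append(out)

print(len(status_log), "load successes;", len(error_register), "failures recorded")
# The target client remains clean: no record was ever committed.
```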


KPIs & SLAs - Keeping Teams on Task

All error states defined within Pulse have an SLA defined for their resolution. For example, a "Material invalid weight" error might have a 10-day SLA assigned for it to be fixed. When Pulse first identifies this error state, it starts the SLA clock and assigns a Green RAG status to the resolution team. As the clock ticks by without resolution, the error state moves to Amber and then, ultimately, to Red once the SLA has been breached.
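
The RAG logic described above can be sketched as follows. The 70% Amber threshold is an assumption; the text does not specify when the status turns Amber.

```python
# Illustrative sketch of the RAG logic above; the Amber cut-off is assumed.
from datetime import date

def rag_status(first_seen: date, sla_days: int, today: date,
               amber_fraction: float = 0.7) -> str:
    """Green until 70% of the SLA has elapsed, Amber until breach, then Red."""
    elapsed = (today - first_seen).days
    if elapsed > sla_days:
        return "Red"     # SLA breached
    if elapsed >= sla_days * amber_fraction:
        return "Amber"   # approaching breach
    return "Green"

first_seen = date(2024, 3, 1)                         # error first identified
print(rag_status(first_seen, 10, date(2024, 3, 5)))   # Green
print(rag_status(first_seen, 10, date(2024, 3, 9)))   # Amber
print(rag_status(first_seen, 10, date(2024, 3, 12)))  # Red
```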

Pulse is designed around a proven Data Governance Model, managing millions of error states across thousands of data objects in complex IT landscapes.

Pulse has been proven to:

  • Establish a benchmark Data Quality Standard in environments where no standard exists
  • Establish Data Governance KPIs where none previously existed
  • Improve data processes and data quality standards
  • Improve the success of business transactional processes
  • Reduce the costs of Data Migration Workstreams
  • Improve the outcomes of Data Migration Workstreams


End to End Data Migration Failure Decomposition

A typical manufacturing plant data migration will involve around 150 data objects. The full process involves the following steps for each object:

  • Extraction
  • Transformation
  • Validation
  • Enrichment & Cleansing
  • Load
  • Reconciliation
  • Status Reporting

This equates to 1,050 steps (150 × 7) for each data migration cycle! Most programmes aspire to 4 test cycles, which equals 4,200 tasks!
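
A back-of-envelope sketch of that arithmetic, and of the cycle Pulse automates, might look like this (object names are placeholders):

```python
# Back-of-envelope sketch of the cycle arithmetic above (names are placeholders).
STEPS = ["Extract", "Transform", "Validate", "Enrich/Cleanse",
         "Load", "Reconcile", "Report"]

objects = [f"OBJ_{i:03d}" for i in range(150)]   # ~150 objects per plant

tasks_per_cycle = len(objects) * len(STEPS)
print(tasks_per_cycle)       # 1050 automated steps per migration cycle
print(tasks_per_cycle * 4)   # 4200 across four test cycles
```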

Data Migration workstreams fail, and bring large programmes down with them, by not recognising this complexity early enough in the delivery lifecycle, or by aspiring to manage the entire process manually, i.e. via separate extracts and the like. This approach invariably results in failure due to the complexity of file version control and the alignment of datasets - manual processes are simply too difficult to manage for enterprise data migrations.

Pulse provides a fully automated process covering the 7 steps above for all objects. Not only is Pulse designed to natively perform the end-to-end migration process for the entire 100+ data objects automatically, it is designed to perform this overnight without assistance, allowing the business and IT teams to review and act upon the results the following day:

  • Object Load Status Dashboards updated daily
  • Error States assigned to teams daily
  • SLAs tracked daily

Pulse provides a collaborative working environment, enforcing Data Governance best-practice standards for business and IT teams and enabling a drumbeat for the evolution of data quality and fitness to support successful data load events.

End-to-end data load events are automatically decomposed within minutes, providing instant insight into root causes and accelerating programmes to resolve issues faster.

The illustration below shows a starting position of 12k records to load, and only 5.3k loaded... why?

Pulse provides an instant breakdown of the end-to-end status, enabling teams to act immediately. Traditionally, these results take days or weeks to produce, delaying programmes further.
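
By way of illustration, a breakdown of the 12k/5.3k example might look like the following. The per-stage counts are invented purely to show the shape of the decomposition; only the 12,000 and 5,300 totals come from the text above.

```python
# Hypothetical decomposition of the 12k-in / 5.3k-loaded example: the
# per-stage failure counts are invented; only the totals come from the text.
funnel = {
    "Extracted":             12_000,
    "Failed validation":      3_100,
    "Failed transformation":    900,
    "Rejected by SAP load":   2_700,
    "Loaded & reconciled":    5_300,
}
total = funnel["Extracted"]
for stage, count in funnel.items():
    print(f"{stage:<24}{count:>7}  ({count / total:.0%})")
```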