Archive for the ‘FREE EXPERT ADVICE’ Category

Long SAP CRM Customer Record Load Times

Thursday, October 1st, 2009

Do you have experience of loading large customer data volumes into SAP CRM? We are experiencing extremely long load times for the initial data load and the subsequent delta loads. How many customer records per hour could / should we be expecting? In which areas should we look to further fine-tune our processes? We are currently looking at our infrastructure requirements to help speed things up; however, it would be useful to have a guide as to what else we should be looking at.

ERP loads can sometimes be too slow to process the entire set in a single big-bang event. Unpalatable though it may sound, the answer is often to split the customer base along some logical line. The key to this technique is to ensure you get entire customers fully on legacy or fully on the target. The last thing you want is a split customer record.

This does mean that you have to analyse the business carefully for problem customers. I was involved in a telco migration where there was complexity around people changing their price plans or moving house. The solution is to build some means of spotting these states and to push the records concerned back to a later load. Eventually, when you have to “bite the bullet” on any remaining records, there may be few enough to manage manually with the assistance of customer service staff. It is rarely written in stone, even as a matter of migration-team pride, that every record in a migration must transfer via automated procedures.
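As a minimal sketch of the split-and-defer idea, the wave planning can be expressed as a simple partitioning pass over whole customer records. The field names here (`segment`, `mid_price_plan_change`, `mid_house_move`) are invented for illustration; the real deferral criteria come out of the business analysis described above.

```python
def plan_waves(customers):
    """Assign each whole customer record to a migration wave, deferring
    customers caught mid-change so no customer is split across systems."""
    waves = {"wave_1": [], "wave_2": [], "deferred": []}
    for c in customers:
        if c.get("mid_price_plan_change") or c.get("mid_house_move"):
            waves["deferred"].append(c)   # problem states pushed to a later load
        elif c.get("segment") == "residential":
            waves["wave_1"].append(c)     # simpler customers migrate first
        else:
            waves["wave_2"].append(c)     # complex/business customers later
    return waves

customers = [
    {"id": 1, "segment": "residential"},
    {"id": 2, "segment": "business"},
    {"id": 3, "segment": "residential", "mid_house_move": True},
]
waves = plan_waves(customers)
print({name: [c["id"] for c in group] for name, group in waves.items()})
```

The point of the sketch is simply that the unit of partitioning is the customer, never the individual record, and that any customer in an awkward in-flight state drops cleanly into a later wave.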

Finally, there is another solution called “trickle across”, using products such as Celona (http://www.celona.com). Here you set up a synchronisation between the two systems that allows the customer to exist on both, eventually turning legacy off at an appropriate moment.

I also have an associate who managed to avoid all of these solutions by renting a massive hardware architecture from Hewlett Packard that compressed the load times by brute force! However, the client had both deep pockets and significant need, which is far from always the case, so this solution is not always open. (Genuine commercial enquiries are welcome if anyone is interested in this unusual approach.)

So no, it’s not that rare an occurrence. And no, you can’t generally solve it by doing something right that you previously missed. It really can take that long per customer.

I hope this helps.

Don’t hesitate to call if you are in UK or near Europe and would like direct consultancy assistance!


How Should I Structure an ERP Migration Team?

Saturday, April 4th, 2009

Firstly, if the team is of any size you will need both a manager and a deputy (or deputies). Major data migration management requires one executive manager setting the team’s direction and talking strategy, plans and requirements with the rest of the implementation programme, and at least one second team lead dealing with the issues impacting the team day to day, progress against plan and so on, while also acting as an analyst themselves.

Then look at the ERP-facing, or ‘To-Be’ facing, analysts. Plot out the target data objects that are likely to be required – material master, vendor master, open orders etc. – and separate them into groups that face off with the process teams. Process teams are often arranged along lines suggested by the ERP vendor; for SAP this would be one for Finance and Controlling (FI and CO), one for Sales and Distribution (SD) and so on. Pairing the To-Be migration analysts with the process teams who need the data allows them to bond as a cohesive design and upload unit. These To-Be facing analysts should be hired early and initially donated to the process teams as ‘extra hands’. When you retrieve them into migration proper they will already know all the people in the business and process teams, what’s going on with the design and how to get design updates.

What do these analysts do? Together with the process team they come up with the target side of the mapping sheet and understand what it means in detail. They then obtain legacy data to match, from the ‘As-Is’ facing analysts (below) or by data creation, ensuring there is a good bond between the legacy data, the created data and the ERP requirement. They also write and execute the upload scripts, so typically they need to know how to do that (typically LSMW and ABAP for SAP).

Your legacy ‘As-Is’ analysts need to start early enough to gain a thorough understanding of the data in your systems. Typically they will be a mix of people who already worked with the old systems and specialists who have been involved in SAP migration before, bringing life-cycle experience and special techniques such as data profiling. Some ERP systems can be fussier than legacy – SAP on addresses, for example – so old data that used to be OK may now not be. You need to know what’s really in the old system, not what people and designs say is there. When planning the legacy side of the team, allocate analysts to groups of connected existing systems, not to the target as with the ‘To-Be’ analysts. The job of your own experts is to thoroughly understand the reality of legacy, write or specify the extract routines and talk to the To-Be analysts. The external experts will usually move towards arranging the necessary data cleanse in conjunction with the programme change managers.

Finally, you need a small group of transformation analysts in the middle. Typically they execute rather than understand, and are directed by the As-Is and To-Be analysts as to which extracts go to which targets. Expertise depends on method: you may need Datastage, Informatica, SQL Server or MS Access skills depending on the choice of transformation tool.

The above is for large migrations. For smaller ones, scale it down and have people double up on a few compatible roles, but maintain the ethos and pattern.

Well, that’s the method. If you would like me or one of my associates to help you plan and implement it, my blog is at http://www.sapdatamigration.co.uk , our website at http://www.vivamex.com and the telephone number 0794 109 5082

First posted on www.datamigrationpro ‘Ask an Expert’ 

(c) John Platten and DataMigrationPro

How do I migrate Siebel CRM to SAP CRM?

Saturday, September 27th, 2008

Hi John, I want to know how to map Siebel tables/fields to SAP CRM. For example, there is a table in Siebel called S_ADDR_ORG. How do I map this table to SAP CRM? I know Siebel well but do not have great experience of large migrations to other platforms. Please guide me.

The answer is to locate the right person on the SAP technical team and the right business stakeholder and arrange for them to be formally provided to your project as an expert resource.

While you may get a certain distance delving into technical matters on the target yourself, and it is sometimes worth acquainting yourself with the basic principles, you will still need experts in the SAP target system to arrange uploads or downloads and to provide the related mapping spreadsheets for specific entities.

There are two main reasons for this:

1) SAP has its own load methods that tend to exercise data through a process that mirrors a user entering data via a screen, including the side effects and ramifications of that new data being present, e.g. causing further records to appear in other, related tables.

2) Knowing a table’s technical details will not provide the answer alone as SAP has multiple alternate workflows and levels of detail that a company can opt in or out of when it is first set up for use. I am sure this is the same in Siebel. So you will also need the input of key business users whose processes are engaged around each field and/or the input of the SAP Business Analysts if you are migrating onto a system that is in the process of being constructed.

So in summary you cannot “inject” data into SAP yourself, and will need an approach such as this:

  • Don’t look for a direct technical answer
  • Build a federated team
  • If you are the technical analyst on the project get your manager or his/her manager to contract this assistance for you.

If you try to ‘solve’ this scenario as a technical problem instead of treating it as a management issue you will probably fail.

Hope this helps


Why Follow Data Quality (DQ) Best Practice? Why use Data Quality tools?

Thursday, August 14th, 2008

I am often asked by line-of-business managers and programme managers why the use of data quality tools, now increasingly considered best practice, is important – particularly when they are asked to fund investment in these platforms!

The heart of this problem is that many otherwise excellent data migration leads tend to talk in technical terms when a simple business case is what is required. It is not necessary to explain the need in data quality terms at all, as the benefits can be explained purely in terms of risk management and planning:

Traditional data migration can be visualised in terms of a project plan in which we on-board the team, progress with setup, start moving small items of data around, begin gradually to integrate these into larger sets and then finally move to a fuller picture encompassing the whole starting data set and the target. However, when we look at this plan we notice that any integration problems would only be realised a significant way into the project, when a lot of money has been spent and promises have been made on the size of the team, the budget and even the feasibility.

It’s tempting when budgeting to look at the data movement as an exercise unto itself, but if the data does not arrive, the system stays empty and an entire business change programme may founder. Workflows will have been created, hardware ordered, training undertaken, managers moved and backfilled, and other strategic business objectives put aside – all for nought. Worse still, if the data is flawed and the business trusts to luck instead of calling a “no-go”, they may find themselves stranded on an unworkable platform, unable to operate processes effectively and in danger of damaging customer and investor confidence directly.

So that:

Data migration is a critical dependency of wider change and implementation programmes, and one that has a tendency to generate long, thin plans with a high likelihood of trouble towards their end-points.

As managers, what are we going to do about that? I don’t mean the Data Migration Lead; I mean you, the key sponsor with his/her career on the line.

With any other such problem you would seek to bring the problem area forwards in the timeline, work on it in parallel, investigate the extent of the risk and from there form an opinion sooner rather than later, so that countermeasures can be deployed and mitigation put in place – even to the point of cancellation if the endeavour appears too risky for the organisation to proceed. This is project-management bread and butter.

This is what DQ tools allow your Migration Lead: the means of performing that transformation of project time and risk, snapping the problem elements away from the troublesome end section and bringing them forwards in the timeline, closer to programme initiation and before significant damage can occur.

Bringing The Risk Forwards in Time

The technical detail of how this works is far less important than the impact in risk, planning and certainty, allowing the project to forge ahead with confidence rather than bravado.  


SAP Legacy Preparation

Tuesday, July 29th, 2008

As featured on Data Migration Pro “Ask an Expert”

“I’m going on my 1st migration project: we are migrating data from a legacy system to SAP. What are some of the fundamentals that I need in preparing the data in the legacy system?”

From the nature of your question I am going to assume that you are one of the legacy experts co-opted onto the project and take it from there.

Firstly, it is important to realise what SAP is and how it works. At its heart SAP is not only a computer system but a set of business processes: a theory about how stock in warehouses should best be managed, how invoices should be written, how stock should best be purchased, how late accounts should be chased and so on. This redefines people’s jobs in the post-go-live organisation around the “new ways of working”, and with this they will also receive a computer system and screens which embody this – SAP.

So one thing you should be prepared for is the project environment itself. A major feature from the outset will be the Process Team: business analysts who will select the processes and design the screens. One of their major aims will be to interconnect the way the business works in a manner that may not have been possible before – so that, for instance, agreeing a sale with a customer immediately creates a raw-materials request and a production-plan requirement, and those people can see the sale go through and check its details themselves even though they are not in the sales department. This creates a lot of new joined-up thinking across the company, increases its reaction times and hence decreases stock that needs to be held on a “just in case” basis. Returning to the data (in case you thought I had forgotten it!), joining up the business logic in new ways will require the data to be joined up in new ways.

So, in summary, the key things you need to remember are:

– People will not work the same way as they did before
– The supporting data will not exactly look the same as it did in legacy
– Data that was on different systems before may have to join perfectly for the SAP load

This last point can create great strain on the data preparation as while Referential Integrity is often good within systems it is not always good between systems.
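That cross-system strain can be made visible early with a very simple check: take the keys referenced in one system and see which have no matching master record in the other. This is only an illustrative sketch (the field names and systems are invented), not a real migration tool, but it shows the shape of the test.

```python
def orphan_keys(child_rows, key_field, master_keys):
    """Return foreign-key values in child_rows that have no master record."""
    return sorted({row[key_field] for row in child_rows} - set(master_keys))

# e.g. orders on a telephone-orders system referencing customers
# mastered behind the web site
orders = [{"order": "A1", "cust": "C001"},
          {"order": "A2", "cust": "C999"}]
web_customers = ["C001", "C002"]

print(orphan_keys(orders, "cust", web_customers))  # → ['C999']
```

Any non-empty result is exactly the between-systems integrity gap that will block a joined-up SAP load, so it is worth finding while there is still time to cleanse.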

The first two points mean that you cannot assume that data is required the same way in SAP as it is in legacy or even needs to work in the same processes as it did in legacy. In fact it would be easy to over prepare by making too many assumptions about how SAP works before this is revealed within the project lifecycle.

I would suggest that you:

1) Identify all the entities in the organisation, draw a big map of which are on which systems and which are the master systems in each case. Note particularly if you have two islands of information for the same object. E.g. Customer information behind the web site and also customer information on a telephone orders system. These may well have to be consolidated and de-duplicated for SAP into a single list.

2) When you are completely sure you have an enterprise wide view of what data is where start profiling the main entities using a data profiling tool. By creating profiles and storing them you are creating a resource you can return to when questions are asked later by the process team or the SAP upload team.

3) Locate all the manuals that say how the data works in legacy and what they support. Look especially for documentation of tweaks if the legacy system itself was a standardised package that has been altered or customised.

4) Return to your data map and try to put names of people in the organisation to each entity. Where do these things come from? Who really owns them or knows the meaning of the data well?

5) Create some simple “whole of entity” extracts. Prepare them in Access or Excel ready for the process team or the SAP upload team to view.
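Where no commercial profiling tool is to hand, the profiling in step 2 can at least be started with a minimal sketch like the one below: for each column of a whole-of-entity extract, count populated versus blank values and distinct values. This is an illustrative stand-in, not a substitute for a full profiling product.

```python
def profile(rows):
    """Per-column profile of a list of record dicts:
    counts of blank and populated values, and distinct populated values."""
    report = {}
    for row in rows:
        for col, val in row.items():
            stats = report.setdefault(
                col, {"blank": 0, "filled": 0, "distinct": set()})
            if val is None or str(val).strip() == "":
                stats["blank"] += 1
            else:
                stats["filled"] += 1
                stats["distinct"].add(val)
    # summarise distinct values as a count for the report
    return {col: {"blank": s["blank"], "filled": s["filled"],
                  "distinct": len(s["distinct"])}
            for col, s in report.items()}

customers = [
    {"name": "Acme Ltd", "postcode": "AB1 2CD"},
    {"name": "Bolt plc", "postcode": ""},
    {"name": "Acme Ltd", "postcode": "AB1 2CD"},
]
print(profile(customers))
```

Even this crude report answers the questions the process team and the SAP upload team will ask later: how full is the field, and how many real values does it hold?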

By the end of this process you should be able to demonstrate where things like Customer, Supplier, Chart of Accounts and Material Master lie on legacy, how large they are in terms of rows and what lies within them.

Only when you have these information sources to hand would I consider actually cleansing anything. And if you do clean anything go for things that you are really very sure will be a feature of the SAP system. It is a mistake to try and clean the whole of legacy because some of it will be relevant to old ways of working and not to the new – meaning it will get left behind!

In closing – I have said overall:

– Expect SAP to be different
– Prepare mostly by gathering information about data, not by adjusting data
– Don’t make too many advance assumptions about the target

If you achieve even part of the above you will find the incoming SAP experts incredibly pleased and surprised with the base they have to work from.


Migrating JDE Data Objects to SAP

Saturday, July 19th, 2008

As featured on Data Migration Pro “Ask an Expert”

We will soon be migrating from JDE to SAP. How should we proceed? The area where we perceive complexity is Media Objects. Can you offer some advice?

A Media Object is a file or text attached to a JD Edwards record. Media objects can be text, images, files or shortcuts. (Ref)

So, as the questioner already knows, it is a very “mixed bag” of data. I would agree with him that in technical terms this is likely to be a large sticking point.

Firstly, he should look at making the problem just go away. I would suggest talking to the Programme Manager and the process team and highlighting the attachments as a considerable challenge on top of that of configuring and running the SAP system. The Programme Manager will hopefully engage with the issue, look at what the attachments are in business terms and consider why history is required from a new process-design perspective. In four out of five projects luck will be with the questioner, and the Programme Manager will take the risk equation in hand and simply declare the legacy attachments out of bounds for migration, backed up by a moth-balled JDE implementation if they do ever need to be accessed. The JDE record itself is then migrated, if even that level of history is required on SAP, and potentially a single attachment is created wherever one or more previous items existed, with a standard text such as, “This historical record formerly had attachments that may still be viewed on the frozen JDE instance”.

The main thing is to take the question outside the technical team and treat it as a project risk rather than something he has to hold onto and solve personally. My own experience is that SAP migrations are risky, costly and driven by very specific aims such as better control of finances. The delicate cost, benefit and risk balance of the migration often dictates that items such as attachments must be left behind along with very large volumes of history, because they are not directly associated with the sought business benefit or future operations, but are just an archive. For example, it is common practice to migrate open purchase orders only, and leave those already filled behind. This is often allied with deliberate business tactics to drive open business down such as pre-ordering and filling the warehouse.
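The open-orders-only tactic mentioned above amounts to a simple status filter at extract time. The status codes here are hypothetical; real JDE and SAP status fields differ, and the agreed cut-off belongs in the migration design, not in the code.

```python
# Hypothetical status codes for illustration only.
OPEN_STATUSES = {"OPEN", "PART_RECEIVED"}

def select_for_migration(purchase_orders):
    """Keep only open POs; filled/closed history stays behind on legacy."""
    return [po for po in purchase_orders if po["status"] in OPEN_STATUSES]

pos = [{"po": "4500001", "status": "OPEN"},
       {"po": "4500002", "status": "CLOSED"},
       {"po": "4500003", "status": "PART_RECEIVED"}]

print([po["po"] for po in select_for_migration(pos)])
```

The value of stating the rule this explicitly is that the business can see, and sign off, exactly which history is being left behind.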

But let us say that the questioner is unfortunate in this case, or manages to minimise the number of such records significantly as above, but is still challenged to move some of the attachment data.

The key questions in terms of SAP migration would appear to be:

1) How do we get the attachments out?
2) How and where do we put them back into SAP?

I feel what he should be looking for is some kind of JDE reporting facility, perhaps bringing alongside something like Oracle BI Publisher (which integrates well, as Oracle owns JDE). He can then use the notion of “publishing” the Media Object attachment to a relatively neutral format such as PDF, rather than trying to migrate or convert it at the data level. The reload task is then one of re-attachment, not of worrying about BLOB (binary large object) fields, their internal document formats and the like.

If he really does need to put his hand “in amongst the cogs”, which I strongly suggest avoiding, a third-party tool may be of use in unpicking the JDE data structures. There appear to be several here at EverestSoft. (Note that this is not a recommendation; I have not used any of these tools myself.)

But really – why would he do that? The Objects are attached “documents” to start with, so publish and reload makes complete sense, if indeed he has to migrate this data at all.


Customer co-operation in migration

Sunday, July 6th, 2008

We are working on a large project and, as it has progressed, I have become increasingly worried about the progress made. The legacy systems seem to be less well maintained and more complex in their interconnections than the client had said, and things do not seem to move very far each week. We are contracted to do the in-flight cleanse, transformation and upload, but not the extraction. What do I do about the lack of co-operation from the customer side?

One simple answer to this complex problem is that your client appears underpowered in terms of large-scale migration experience on their side of the contractual responsibility, but they cannot ask you for that help without further muddying responsibilities. They have agreed to share the problem but have no real way in which to deliver on that promise. What they need right now is their own migration guru, so that the joint responsibility in the contract is backed by joint knowledge and actual capability to act. Because this has not been happening so far, you have felt compelled to reach across the contractual divide to their side in an attempt to de-risk. In fact this just blurs the contract further and could ultimately place you in danger of looking responsible for the client’s shortcomings precisely because you tried to be helpful on things that were really for them to deal with. I think it will probably be better for you to keep a firm divide and invite them to “Bolster the Home Team”, as described on our home page, for both your benefits.

If responsibilities for action do become unclear and the project subsequently fails, the client is likely to cause you upset with bad publicity, on top of the personal professional disappointment of project failure. Equally, they are very unlikely to make a legal case for compensation stick, because of the ambiguity, or to receive a delivered system either. This really is an “everyone loses” scenario that both sides would do well to avoid.

If you have been moved to write as an implementation partner the client is probably even more concerned at this point. A brief conversation with an independent data migration consultant, such as myself, may be enough to convince them to hire the help they need and minimise both their risks and yours at the same time.