When you look at the data flow from the DataSource to the MultiProvider, you will notice that the path is made up of two fundamental aspects:
- Where data lives;
- The paths the data flows through.
The relationship between these fundamental aspects can be visualised in a number of different ways:
- Network map;
- Venn diagram.
The network map is the most useful as it looks a bit like a street directory where you can see a 2D layout of both the fundamental aspects and their relationships.
Please keep in mind that there is a clear distinction between the definition of a Transformation versus its run-time declaration to process data between InfoProviders.
The network map not only caters to beginners and experts but also different styles of conversation.
Are you new to an environment with no idea what your predecessors have built? A custom data flow diagram that features the skeleton from DataSource to MultiProvider lets you absorb the system a lot quicker than just logging in and digging around.
This is why, in the absence of any official documentation, a seasoned BW Developer will head straight for the MultiProvider list and start bringing up the Network Map diagrams for the data flow below. The frustration with this approach is that you get the technical perspective without the developer, support and business insights that usually get added to a hand-drawn diagram of the same DataModel.
The diagram also allows a few other fundamentals to be included, which makes the absorption process a lot smoother and far more efficient.
The vertical axis of the diagram usually gets split into logical layers according to what each object is fundamentally ‘supposed to be doing’.
The connector lines make it easy to identify the type of data flow and the load mode:
- Full load;
- Delta load;
- Support repair load;
- ABAP SELECT statement.
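The layer-and-connector conventions above are really just a directed graph: nodes carry a logical layer and edges carry a load mode. A minimal sketch in Python, using made-up object names (`ZSD_O01`, `ZSD_C01`) and a simplified set of load modes; this is an illustration of the diagram's structure, not an SAP BW API:

```python
from dataclasses import dataclass, field

# Simplified load modes matching the connector-line styles above;
# the names here are assumptions for illustration only.
LOAD_MODES = {"full", "delta", "repair_full", "abap_select"}

@dataclass
class Node:
    name: str   # technical id, e.g. "ZSD_C01" (hypothetical)
    layer: str  # logical layer on the vertical axis, e.g. "staging"

@dataclass
class Flow:
    source: str
    target: str
    mode: str   # one of LOAD_MODES

@dataclass
class DataFlowDiagram:
    nodes: dict = field(default_factory=dict)
    flows: list = field(default_factory=list)

    def add_node(self, name, layer):
        self.nodes[name] = Node(name, layer)

    def connect(self, source, target, mode):
        assert mode in LOAD_MODES, f"unknown load mode: {mode}"
        self.flows.append(Flow(source, target, mode))

    def flows_by_mode(self, mode):
        # Lets a reader "skim" the diagram for one connector style.
        return [(f.source, f.target) for f in self.flows if f.mode == mode]

diagram = DataFlowDiagram()
diagram.add_node("ZBW_0DOCTYPE_ATTR", layer="source")
diagram.add_node("ZSD_O01", layer="staging")
diagram.add_node("ZSD_C01", layer="reporting")
diagram.connect("ZBW_0DOCTYPE_ATTR", "ZSD_O01", mode="delta")
diagram.connect("ZSD_O01", "ZSD_C01", mode="full")

print(diagram.flows_by_mode("delta"))  # [('ZBW_0DOCTYPE_ATTR', 'ZSD_O01')]
```

Filtering by load mode is the programmatic equivalent of skimming the diagram for one connector style, which is exactly what the icon-and-text combination supports.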
The combination of icons and text caters for those who are skimming the diagram quickly as opposed to those who are studying it in detail. Being able to spot a DataSource versus an Application Process Designer (APD) pathway can save heaps of time (which is money) when planning out the activities you need to do for an enhancement.
When comment boxes are added they usually contain fundamental hints about the component without drowning you in too much detail. I appreciate a Visio diagram that goes to that little bit of extra effort to highlight:
- The type of DataStore: standard, write-optimised or transactional;
- Transformations using summation aggregation on the key figures;
- Transformations with read-ahead record filtering;
- Transformations with complex data transposing;
- Transformations that filter out records;
- Absolutely necessary DTP selection restrictions;
- Memory consumption traps and solutions;
- Virtual cubes that connect to external systems;
- Cubes that must be fully compressed;
- A cube's zero-elimination preference;
- Links to detailed documentation.
For external DataSources, which are usually located across the bottom of the diagram, it is extremely helpful when the diagram identifies the extractor technical id, the source data location and a link to any Excel worksheets that contain the field mappings from the source table through into BW. For example: the ZBW_0DOCTYPE_ATTR extractor from table T161.
Numbered circles make it easy to show loading-sequence dependencies between DataProviders feeding a single DataTarget. This is handy for BW support personnel when circular loads are involved. The numbered circles and additional comment boxes can make the sequence clear; especially at 3 a.m. when the coffee just doesn’t have enough kick and you just heard the mouse whisper that you should be in bed asleep. [Really!]
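Those numbered circles are effectively a topological order over the load dependencies. A hedged sketch using Python's standard `graphlib` (3.9+), with invented provider names, showing how the circle numbers could be derived from the dependency sets:

```python
from graphlib import TopologicalSorter

# Hypothetical provider names; each entry maps a DataTarget to the
# providers that must finish loading before it starts.
deps = {
    "CUBE_SALES": {"ODS_ORDERS", "ODS_DELIVERIES"},
    "ODS_DELIVERIES": {"ODS_ORDERS"},
}

# A valid load order; this is what the numbered circles on the diagram express.
order = list(TopologicalSorter(deps).static_order())
numbered_circles = {name: i + 1 for i, name in enumerate(order)}

print(order)             # ['ODS_ORDERS', 'ODS_DELIVERIES', 'CUBE_SALES']
print(numbered_circles)  # {'ODS_ORDERS': 1, 'ODS_DELIVERIES': 2, 'CUBE_SALES': 3}
```

When a genuine circular load sneaks in, `static_order` raises `graphlib.CycleError`; on the diagram, that is precisely the situation the extra comment boxes need to explain to the 3 a.m. support person.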
Is the diagram blowing out beyond 10 pages wide? Perhaps you should consider breaking it up into multiple diagrams/files. You will know this needs to happen when you find yourself spending more time scrolling left and right than focusing on the questions you have.
This introduces a need to cleanly indicate a “not here but over there” style of connector. It allows the current diagram to limit itself to a portion of the DataModel and tell its tale without trying to be everything to everyone. There is an internationally recognised standard connector for this but I’ve found people respond better to something like a lightning bolt because it has no practical purpose in a BW system; not even to represent statistical data related to electrical distribution sub-stations.
“the true purpose of a custom data flow diagram is
efficiency by being clear and simple; not the details”
With the many different ways to read a diagram of the BW DataModel, there is no official ‘best practice’ on how to logically group the separation of the DataModel into different diagrams/files. The best you can do is observe how practical the diagram is while you use it and experiment with the inconvenient pieces.
The top-down approach could use the MultiProviders and everything below them. However, this leads to a lot of redundant diagram maintenance by the time you get down to the DataSources, so it is not practical.
The top-down approach could instead group by the type of data from the reporting user's point of view: Finance, Sales, Inventory, Procurement, Marketing, Vendor, System Analytics, etc. At a high level this approach does work well for a lot of people, as each diagram gets named from a business point of view. However, this approach by itself still produces diagrams that are too big.
A further grouping by business function seems to break the DataModel into pieces small enough to fill a single diagram comfortably. The most common approach is to roughly align it with the SAP Application Components. For example: FI-GL, CO-IO, MM-IN, HR-PA, SD-IN, etc.
Adding a legend to the custom data flow diagram helps you check whether it has become too complex by trying to be all things to too many people. Using an icon and a single sentence to describe each feature's purpose, you build up a quick-reference summary of what the diagram is trying to achieve. When there is no ABAP connector line and no execution-sequence circles, the diagram cannot be used for process chain analysis, UAT preparation or late-night InfoProvider repair activities.
What custom data model diagram features do I find useful?