We regularly arrive at the peak of an excellent DataModel implementation that covers all the major topics of interest. Yet less than 12 months later we are back at the drawing board, integrating a new perspective into our DataModel design decisions.
One major reason for the constant evolution of implementing a DataModel is that the topics of Processing and Storage won’t sit still. As the independent worlds of hardware and software grow in their own right, they are constantly impacting each other. This in turn shifts the focus and techniques used to implement SAP BW Best Practice DataModeling.
The line between ‘Processing vs Storage’ is reasonably clear; each has its own corner of the playground and our effort is mainly focused on getting them to play nice. The SAP Application Server (for processing) and the SAP Database Server (for storage) are proof of the line’s existence.
“the topic of Processing vs Storage is
a big game of … Tag, You’re It“
Dancing with a foot on each side of the line is a core skill for all Basis, Developer, Support and Analyst personnel. With each activity, we are always mindful of and improving the techniques used to get these two to play nice and achieve the Functional requirement (aka Business Process).
A lot of the dancing between processing and storage has already been choreographed. The experts are constantly evaluating and refining the movement of data in, through and out of the BW DataWarehouse. These improvements are added to the BW system by upgrading. Some improvements are applied automatically during the upgrade while others must be done as a post upgrade enhancement.
Who really noticed that the Request Id (key field) of a DataStore changed from being the real request number (human readable) to a surrogate ID? It uses the same proven technique we already know from Characteristic SIDs.
What matters is that the upgrade implemented this dance move for us automatically, and we benefited from a reduced demand on CPU cycles and storage space, which results in faster DataStore loading.
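To make the surrogate-ID idea concrete, here is a minimal Python sketch of the technique (by analogy with Characteristic SIDs): the wide, human-readable request number is stored once in a lookup table, and each data row carries only a compact integer. The class and key values below are invented for illustration, not SAP code.

```python
class SidTable:
    """Maps a wide natural key (e.g. a 30-character request number)
    to a compact integer surrogate ID, assigning IDs on first sight."""

    def __init__(self):
        self._sid_by_key = {}   # natural key -> surrogate ID
        self._next_sid = 1

    def sid_for(self, natural_key: str) -> int:
        if natural_key not in self._sid_by_key:
            self._sid_by_key[natural_key] = self._next_sid
            self._next_sid += 1
        return self._sid_by_key[natural_key]


# Fact rows store a small integer instead of repeating the long
# request number on every record (the saving compounds over billions
# of rows, as the TCO discussion below suggests).
sids = SidTable()
rows = [
    ("REQU_4H7Q2Z0X8K3J5N1P9M6T2W8R4", 100.0),
    ("REQU_4H7Q2Z0X8K3J5N1P9M6T2W8R4", 250.0),
    ("REQU_9B2C4D6E8F1G3H5J7K9L2M4N6", 75.0),
]
compact = [(sids.sid_for(req), amount) for req, amount in rows]
print(compact)  # [(1, 100.0), (1, 250.0), (2, 75.0)]
```

The join back to the readable request number happens only when a human needs it, which is exactly why the change went unnoticed by most of us.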
When you take a step back and consider that your BW system has processed billions of records through a DataStore, even a 0.1% improvement makes a big difference in the Total Cost of Ownership (TCO). I’ve seen one case where the DataStore Request SID feature provided more than a 10% improvement.
BW v7 introduced the new DataSource, Transformation and DTP to replace the older DataSource, Transfer Rule and Update Rule. How many enterprises put the time aside to intentionally go and migrate all those ETL flows to the newer technology as a post-upgrade activity? [Not many]
A BW v7.4 Transformation has a feature to derive an InfoObject value by doing a DataStore lookup. Looking up data outside the current ETL flow used to be confined to ABAP in the start, end or field routines of a Transformation. Hence, you needed an ABAP coder to get this working with good performance.
If an enterprise invests the time to replace their ABAP based DataStore lookups with the Transformation DataStore lookup:
- The DataModel contains less ABAP;
- Support personnel need less ABAP experience;
- The lookup is officially supported by SAP AG;
- Backend Z tables are forced to migrate into DataStores;
- Mapping data becomes visible in the AWB.
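The logic being replaced is simple to picture. Below is a hedged Python sketch of what such a lookup routine does: enrich each incoming record with an attribute read from another DataStore, keyed on a field of the record. The DataStore contents, field names and the `derive_matl_group` helper are all invented for illustration; the declarative Transformation feature achieves the same effect without hand-written code.

```python
# Invented lookup DataStore: material -> attribute record.
material_ds = {
    "MAT-001": {"matl_group": "PUMPS"},
    "MAT-002": {"matl_group": "VALVES"},
}

def derive_matl_group(source_package):
    """Enrich each incoming record with matl_group from the lookup
    DataStore; unknown materials receive an initial (empty) value,
    mirroring typical routine behaviour when no row is found."""
    result = []
    for rec in source_package:
        lookup = material_ds.get(rec["material"], {})
        enriched = dict(rec, matl_group=lookup.get("matl_group", ""))
        result.append(enriched)
    return result

package = [
    {"material": "MAT-001", "qty": 5},
    {"material": "MAT-999", "qty": 2},  # no match in the lookup DataStore
]
print(derive_matl_group(package))
```

Because the declarative version expresses the same mapping as metadata rather than code, the lookup relationship becomes visible to the toolset, which is the point of the last two bullets above.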
There are plenty of other examples where a new feature is made available but requires post-upgrade effort to implement. Unfortunately, most of these manual-effort enhancements never get implemented in the existing Data Warehouse (by choice, due to prioritisation).
What new features are a must have for my manual enhancement list?