Do you use key figures that are based upon the day granularity of the timeline? No, probably not.
A date key figure is not a currency, quantity, integer or floating-point value. It is a date, stored as an eight-character value in the format YYYYMMDD, and it does not count, aggregate or summarise like its numeric counterparts:
- It honours the calendar timeline, allowing only true date values;
- Because it sits on the timeline, it is inherently sortable;
- That allows the use of logical operators in key figure calculations;
- It does basic arithmetic like addition and subtraction.
This is achieved through the native DATS data type, which is available in any ABAP program and in key figures for reporting. The date key figure is generally introduced into the data flow in the transformation layer. Its objective is to enhance reporting capability alongside the existing time InfoObjects.
Seriously consider adding redundant versions of the date InfoObjects as date key figures, and experience the benefits of simplified query design and reporting. It expands the capability to report on merged data sets (cubes combined through a MultiProvider) that need logical operators to meet the business requirements.
For example: ‘delivery date’ – ‘goods receipt date’ = variance in days (calculated as pure algebra for KPI reports).
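As a minimal ABAP sketch of that calculation (the variable names are illustrative, not from any standard content), the variance is plain date arithmetic, because date fields sit on the calendar timeline:

```abap
REPORT zdate_variance_demo.

* Illustrative only: names are hypothetical. TYPE d is the ABAP
* representation of the DDIC DATS type (YYYYMMDD).
DATA: lv_delivery_date TYPE d VALUE '20240315',
      lv_receipt_date  TYPE d VALUE '20240308',
      lv_variance_days TYPE i.

* Subtracting two date fields yields the difference in days.
lv_variance_days = lv_delivery_date - lv_receipt_date.   "= 7

* Logical operators work because dates sort along the timeline.
IF lv_delivery_date GE lv_receipt_date.
  WRITE: / 'Variance in days:', lv_variance_days.
ENDIF.
```

The same subtraction is what a calculated key figure performs at query run time once both dates are available as key figures.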
Keep in mind that a date key figure fundamentally cannot be aggregated/summarised, as this would change the value reported. Do yourself a favour and force the correct exception aggregation relationship into the key figure definition. This will ensure the OLAP engine and future developers/system administrators understand the nature of the date key figure.
There is no single correct definition to use. You will have to investigate the date’s relationship with the current record in that specific data set.
“the exception aggregation definition must consider its use in both storage and reporting”
From time to time the data modeler, under time pressure to deliver an outcome, underestimates how the key figure will be used in reporting. This puts the data integrity of the query result table at risk: some combinations of InfoObjects will return incorrect key figure values, and the overall result total will be wrong.
This is often masked by subsequent calculations, and it can take quite some time before anyone notices a number that is close but not right. By then the damage is already done, and the business user community starts its journey of doubt about the accuracy of the BW data warehouse.
This lack of confidence can be avoided if the data modeler always remembers to assess the exception aggregation definition of all key figures before putting them into a cube.
This increased risk arises if you simply leave the date key figure definition at the default of maximum aggregation and maximum exception aggregation.
“calculations work when the exception aggregation full unique key is defined properly”
This will require the use of compounded characteristics that reflect the business definition of the full unique key. If the date is a delivery date that belongs to a confirmation item, then the confirmation item characteristic must be compounded to the confirmation document characteristic.
When the exception aggregation is calculated, it needs the full unique key of the compounded characteristic.
This will force the OLAP engine’s run-time compression of the data set to honour the data integrity of the date key figure values. Obviously, dates do not summarise. Please define the date key figure to ensure the lowest risk to reporting data integrity (the sketch after this list illustrates the risk):
- Aggregation: Maximum;
- Exception aggregation: No aggregation (X, if more than one value unequal to 0 occurs);
- Reference characteristic: to be determined by you, based on the relationship with the record this date key figure belongs to.
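To make the risk concrete, here is a hedged ABAP illustration (the document, item and date values are invented) of what happens when records are collapsed without the compounded item in the drill-down:

```abap
TYPES: BEGIN OF ty_confirmation,
         conf_doc   TYPE c LENGTH 10,   " confirmation document
         conf_item  TYPE n LENGTH 4,    " compounded item
         deliv_date TYPE d,             " date key figure
       END OF ty_confirmation.

DATA: lt_conf TYPE STANDARD TABLE OF ty_confirmation,
      ls_conf TYPE ty_confirmation,
      lv_max  TYPE d.

* Two items of the same document carry different delivery dates.
ls_conf-conf_doc   = '4500000001'.
ls_conf-conf_item  = '0010'.
ls_conf-deliv_date = '20240301'.
APPEND ls_conf TO lt_conf.
ls_conf-conf_item  = '0020'.
ls_conf-deliv_date = '20240315'.
APPEND ls_conf TO lt_conf.

* Aggregating to document level with MAX silently replaces
* 20240301 with 20240315 - a value that is close but not right.
LOOP AT lt_conf INTO ls_conf.
  IF ls_conf-deliv_date > lv_max.
    lv_max = ls_conf-deliv_date.
  ENDIF.
ENDLOOP.
WRITE: / 'Document-level MAX:', lv_max.

* With exception aggregation "No aggregation" referenced to the
* compounded item, the OLAP engine flags this drill-down instead
* of reporting the misleading maximum.
```

The reference characteristic setting is what tells the OLAP engine to perform that check at the full unique key rather than at whatever level the user happens to drill to.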
There is also a whole separate topic that needs to be understood regarding the way BW 7.0 does not handle technical reversals of maximum-aggregated key figures cleanly. This in turn forces the ETL request loading the cube to always be a full load if you ever want to compress the records properly. That is a larger discussion for another time.
Hopefully you use a Layered Scalable Architecture (LSA) data model with a clear separation of the raw, transformed and reportable data. This will allow you to refactor the data model in small steps as a secondary task alongside your other work.
The goal is to add the date key figures into the transformation layer and then the reporting layer. The transformation layer will contain the redundant copies of the date in both the time dimension characteristics and the date key figures.
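A hedged sketch of how that redundant copy might be created in a BW 7.x transformation end routine follows; the field names KF_DELIVDATE and DELIV_DATE are hypothetical, while the RESULT_PACKAGE table and the generated type _ty_s_TG_1 come from the standard routine frame:

```abap
* End routine of the transformation into the transformation layer.
METHOD end_routine.
  FIELD-SYMBOLS <result_fields> TYPE _ty_s_tg_1.

  LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
*   Redundant copy: the time characteristic also feeds the date
*   key figure, so queries can use it in pure-algebra calculations.
    <result_fields>-kf_delivdate = <result_fields>-deliv_date.
  ENDLOOP.
ENDMETHOD.
```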
This redundancy opens up the choices in the reporting layer, as you only need to include the InfoObjects used to achieve the business requirements. The important thing to remember when building or refactoring the data model is that the transformation layer has everything available, so that as new reporting requirements come along, the only effort required by the developer is to build the ETL out of the transformation layer and further downstream.
The idea is to have the transformation layer fully loaded, fully regression tested and available with date key figures ready to go. Once the date key figures have been added to a cube, focus on making life easier for the power users in their queries. You will get the most out of these date key figures if you focus on these types of data sets first:
- Records that are fundamentally an open interval of time, missing the second point in time that would complete the interval definition. For example: the Personnel Actions record for a hire is an open interval until the person leaves the enterprise;
- Operational data that identifies an action occurred within the business process. For example: Document reversal, delivery date or payment date;
- Different time granularities that still need to be reported together. For example: nominating weekly data to a date to be used in fiscal period reporting (see the sketch after this list).
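For the weekly case, a hedged sketch of the nomination: the standard function module WEEK_GET_FIRST_DAY returns the first day (Monday) of a calendar week, and the variable names here are illustrative:

```abap
DATA: lv_calweek        TYPE scal-week VALUE '202412',  " 0CALWEEK value
      lv_nominated_date TYPE d.

* Nominate the week to its first day, so the weekly record can be
* stored as a date key figure and derived into fiscal periods.
CALL FUNCTION 'WEEK_GET_FIRST_DAY'
  EXPORTING
    week         = lv_calweek
  IMPORTING
    date         = lv_nominated_date
  EXCEPTIONS
    week_invalid = 1
    OTHERS       = 2.

IF sy-subrc = 0.
  WRITE: / 'Nominated date:', lv_nominated_date.
ENDIF.
```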
As a power user you continue to evolve as the Subject Matter Expert (SME) for the area of queries you work with. This does not mean that you are an over-achiever who spends their weekend pondering the complexities of data set integration; do you? Really?
This leaves plenty of room for the power user and the system administrator to work together and build recyclable solutions that bridge this gap.
The gap lies between the complexity of data modeling merged data sets and the successful implementation of a slice’n’dice query that will always return 100% accurate data, no matter what.
The better implementations leverage global calculated key figures and global restricted key figures built on key figures that have the correct exception aggregation definition and reference characteristic.
What other advantages do I know of that encourage the use of date key figures?