You may have noticed that aligning and comparing the overlapping perspectives of different timelines is essential for KPI and management reporting. What is often overlooked, and misunderstood, is the alignment of the time zone in reporting based upon the posted transaction data records.
The distinct perspectives of time are normally defined without any specific mention of a time zone, and in most cases it is also missing from the InfoObject definition itself. The time zone is instead implied through the relationship to the posted transaction data record in which the time value exists. While this does keep the time dimensions as simple as possible, it introduces the risk of reporting misaligned totals.
For example: Wellington in New Zealand versus Perth in Australia, and their alignment to midnight.
An enterprise that trades only 10-16 hours per day and uses different timeline variants against its posted transaction data will rarely need to deal with time zones in its time dimensions. The Calendar time dimension is implied to be aligned to the posted transaction data for the local region that created it. There will be no misunderstanding of ‘midnight on Wednesday night’. This works even if you round the time dimension up to the date granularity and never store the time on the record (as a time stamp characteristic).
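To make the Wellington versus Perth example concrete, the sketch below (using Python's standard zoneinfo module and an arbitrary illustrative winter date) shows that the two local midnights are hours apart when placed on a single UTC timeline:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Midnight on the same calendar date in each city (illustrative date in June,
# when New Zealand is on NZST, UTC+12, and Perth is on AWST, UTC+8).
wellington_midnight = datetime(2023, 6, 7, 0, 0, tzinfo=ZoneInfo("Pacific/Auckland"))
perth_midnight = datetime(2023, 6, 7, 0, 0, tzinfo=ZoneInfo("Australia/Perth"))

# As instants, the two 'midnights' are four hours apart.
gap_hours = (perth_midnight - wellington_midnight).total_seconds() / 3600
print(gap_hours)  # 4.0
```

The same calendar date therefore covers two different 24-hour windows of real time, which is exactly the misalignment that day-level reporting quietly accepts.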
Most enterprise reporting requirements work with a minimum time granularity of day. They also ignore the time zone alignment discrepancy between the different posted transaction data DataSets (Wellington versus Perth). They are fundamentally interested in day-to-day comparative analysis and trending.
Reporting a perfect time zone alignment of 08:00:00 here versus 13:00:00 there is normally not required for the bulk of the reports. Remember that this is a decision made during the requirements gathering stage for any report.
“time comparison and trending for reporting
is usually more important than alignment”
The today-to-date reporting requirements that deliver near-real-time analytics are also usually localised to the time zone they belong to. These reports are for local business personnel who already know which time zone they are in and have no need to compare their data to other time zones. This is usually why the posted transaction data is time stamped (date and time) but rarely includes the additional meta-data of the time zone.
Even when an enterprise has a war room that displays the today-to-date near-real-time information of the individual local countries, the missing meta-data (aka the time zone) is not an issue because the data feed is known to belong to a specific time zone.
Most of the aggregated global reporting requirements will also behave just like the local reporting because their primary focus is ‘like for like’ reporting. The time zone itself is not the focus of the global comparison of a day’s worth of data; the comparison is better understood when the behaviour of each time zone’s local residents is aligned. The data aligns to breakfast, lunch and dinner, which is much better for trending reports.
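A minimal sketch of why ‘like for like’ grouping works on the local calendar day rather than a shared timeline (the time zones and the timestamp are illustrative):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# One global instant, viewed through each region's local calendar day.
posted_utc = datetime(2023, 6, 6, 15, 0, tzinfo=timezone.utc)

wellington_day = posted_utc.astimezone(ZoneInfo("Pacific/Auckland")).date()
perth_day = posted_utc.astimezone(ZoneInfo("Australia/Perth")).date()

print(wellington_day, perth_day)  # 2023-06-07 2023-06-06
```

The same instant falls on different local days, so grouping by the local day is what keeps each region's breakfast-to-dinner pattern aligned in a trend report.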
The exception to this is a global enterprise that is only loosely aligned to human behaviour and instead treats the entire world as a single instance of ‘now’: enterprises that do financial trading, global package deliveries, information streaming and so on. These enterprises will pay close attention to the usually missing meta-data of time.
Their transaction data records will have the additional time meta-data in the raw data, enabling both the global ‘now’ and the local reporting perspectives of the data to be consumed. It is this dual reporting requirement, consuming a single transaction data record through both perspectives, that leads to the DataModeling practice of storing all of the original raw data in UTC+0 format. This simplifies the processing, consumption and comprehension of the time zones involved in the transaction data records.
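The store-in-UTC+0 practice can be sketched as follows; the time zone and timestamp are hypothetical, and the point is that both reporting perspectives are derived from the single stored value:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical posted transaction: local timestamp plus its time zone meta-data
# (here, New Zealand daylight time, UTC+13, in November).
local_posted = datetime(2023, 11, 3, 9, 30, tzinfo=ZoneInfo("Pacific/Auckland"))

# Store once, in UTC+0 ...
stored_utc = local_posted.astimezone(timezone.utc)

# ... then derive both perspectives from the single stored value.
global_now_view = stored_utc                                      # global 'now'
local_view = stored_utc.astimezone(ZoneInfo("Pacific/Auckland"))  # local perspective

print(stored_utc.isoformat())  # 2023-11-02T20:30:00+00:00
```

Note that the round trip is lossless only because the time zone meta-data was captured on the raw record in the first place.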
Now throw the topic of daylight savings on top of the local time zone. For the posted transaction data to capture strict alignment to the time zone, it must also include a daylight savings indicator. This becomes extremely important twice a year, when the perspective of local time is adjusted. Without the daylight savings indicator and the time zone, there is no way to accurately convert and compare a distinct point in time across the earth.
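The ambiguity is easy to demonstrate. On the morning clocks go back, the same local wall-clock time occurs twice, and only a daylight savings indicator (Python models it as the `fold` attribute) can tell the two instants apart. The date below is New Zealand's 2023 changeover and is used purely for illustration:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

tz = ZoneInfo("Pacific/Auckland")

# 02:30 on 2 April 2023 occurs twice in New Zealand: once on NZDT (UTC+13)
# before clocks go back, and again on NZST (UTC+12) afterwards.
first = datetime(2023, 4, 2, 2, 30, fold=0, tzinfo=tz)   # before the fallback
second = datetime(2023, 4, 2, 2, 30, fold=1, tzinfo=tz)  # after the fallback

first_offset_hours = first.utcoffset().total_seconds() / 3600
second_offset_hours = second.utcoffset().total_seconds() / 3600
print(first_offset_hours, second_offset_hours)  # 13.0 12.0
```

A timestamp of ‘02:30 on 2 April’ with no time zone and no daylight savings indicator simply cannot be converted to a unique point on the global timeline.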
Keep in mind that the above concepts are about consuming the data in reports. They do not address the time zone accuracy requirements for collecting data through system interfaces.
When you take a step back and look at the practical side of combining and consuming the enterprise’s data for reporting, there is rarely any need to store the distinct perspectives of time with 100% accuracy. In most reporting scenarios the posted transaction data will have a minimum time granularity of day. The rest of the reporting requirements will be at a more aggregated level of time (week, fortnight, period, month, quarter and year).
The next time a reporting requirement is prepared for a business department, confirm that time zone alignment in reporting is still not needed in the underlying DataModel.