This is my presentation from last week’s Rittman-Mead BI Forum in Atlanta. Incidentally, it is the same topic (with similar slides) as the KPI Partners webinar from April and the session I gave at Collaborate in Denver, also in April. The webinar was recorded, so you can hear my commentary and the Q&A session afterwards. If you get the chance, I’ll be doing the same preso at Kscope in New Orleans next month. Enjoy!
OK, so I put a provocative title on this post for a reason. This post explores why some of Kimball’s concepts may be outdated once newer query-generation and database capabilities are taken into account. Specifically, I’m going to discuss the OBI concept of Nested Aggregation, also known as Dimension Based Aggregation, in detail. Using this OBI modeling technique you can relax one of the Kimball dimensional modeling rules a bit. I’ll show how it works in a simplified manner, identify some things to be careful with, and explain why it does not excuse you from doing real modeling work.
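To make the idea concrete before diving into the OBI mechanics, here is a minimal sketch, in plain Python with invented data, of the semantics behind Dimension Based Aggregation for a semi-additive measure such as an inventory or account balance: SUM across most dimensions, but LAST along the time dimension. The store/month figures below are purely illustrative.

```python
# Hypothetical data: (store, month, balance) for the three months of one quarter.
rows = [
    ("A", 1, 100), ("A", 2, 110), ("A", 3, 120),
    ("B", 1, 200), ("B", 2, 190), ("B", 3, 180),
]

# A plain SUM over everything -- the default additive rule -- double-counts
# the balance across months.
naive_total = sum(balance for _, _, balance in rows)  # 900, which is wrong

# Dimension-based aggregation nests two rules:
# 1) LAST along the time dimension, per store...
last_by_store = {}
for store, month, balance in sorted(rows, key=lambda r: r[1]):
    last_by_store[store] = balance  # later months overwrite earlier ones

# 2) ...then SUM across the remaining (store) dimension.
quarter_balance = sum(last_by_store.values())  # 120 + 180 = 300
```

This nesting is exactly what the BI Server does in the generated SQL (typically via an inner query grouped at the time grain wrapped by an outer sum), which is why the modeler can lean on the tool rather than pre-building a period-end snapshot table for every case.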
An interesting topic came up on my latest project recently that I think is quite informative about how OBI works under the hood. It involves how to model degenerate dimension fields in OBI when aggregates are present. Individually, each of these two topics is relatively well known, with established solutions, but combined they can get pretty tricky. This is especially true when you are using the BI Apps, which to a large degree still rely on the old method of manually mapping Time Series (Ago and ToDate) metrics (as opposed to the OBI 10.x+ technique of Time Series functions), a method that requires an explosion of Logical Table Sources (LTSs).
The core of this post concerns what to do when some of your aggregates don’t have all of the degenerate fields that your base fact has. How do you model them properly? In the explanation I will shed a bit more light on how OBI “thinks” so that you too can model things properly the first time.
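As a rough intuition for why the missing degenerate field matters, here is a toy sketch of the source-selection decision. The table names and row counts are hypothetical stand-ins (not real OBI internals): among the Logical Table Sources that can supply every requested column, the BI Server favors the cheapest one, so an aggregate that lacks a degenerate column simply drops out of contention whenever that column is requested.

```python
# Hypothetical sources: name -> (columns it can supply, relative row count).
sources = {
    "W_GL_BALANCE_A": ({"fiscal_period", "account", "amount"}, 1_000),
    "W_GL_BALANCE_F": ({"fiscal_period", "account", "doc_number", "amount"}, 1_000_000),
}

def pick_lts(requested):
    """Pick the smallest source that exposes every requested column."""
    candidates = [
        (rowcount, name)
        for name, (cols, rowcount) in sources.items()
        if requested <= cols  # source must cover all requested columns
    ]
    if not candidates:
        raise ValueError("query is unanswerable from this model")
    return min(candidates)[1]

# Without the degenerate column, the aggregate wins...
print(pick_lts({"fiscal_period", "amount"}))                # W_GL_BALANCE_A
# ...but requesting the degenerate doc_number forces the base fact.
print(pick_lts({"fiscal_period", "doc_number", "amount"}))  # W_GL_BALANCE_F
```

The real BI Server weighs levels of aggregation content rather than a simple row count, but the shape of the decision is the same: every column in the query constrains which LTSs remain eligible.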
In the past I’ve written and presented on OBI performance from the ‘before’ perspective: before you begin development, what should you plan for and how should you implement it? This paper, however, takes the ‘after’ perspective: what to do if you are stuck with a performance problem that you need to solve quickly. It uses the Financial Analytics application (all versions) from Oracle’s BI Applications (aka the BI Apps) to walk through a plan of attack and demonstrate specific changes on Oracle Database 10g. Thus, this paper has two purposes:
- Specifically, to document actual changes for Financial Analytics
- Generally, to walk through a performance tuning effort that you may undertake yourself
Note: You do not need to be working on Financial Analytics, or even the BI Apps, for the concepts in this article to apply. It merely uses Financial Analytics as its example, and where appropriate I will explain the model.
I’m going to do something a bit different with this article: I will tell the story of a recent performance tuning project for a client. A previous integrator had delivered Financial Analytics, and it was up and running in production, but performance was terrible; many queries took over 8 minutes. We were asked to tune the GL Transactions star, but the lessons learned here apply to all modules of Financial Analytics, regardless of version. In fact, implementing them for only one star actually boosted the performance of the other Financial Analytics Subject Areas.