Monthly Archives: September 2012
The fall means lots of things to lots of people. To me it means baseball playoffs, football and Oracle Open World. I’m not presenting this year – I was just too swamped on a long project to put anything together, so I’ll just enjoy some of the sessions. I’m going to be heavy on the performance stuff, whether it’s database features, Exadata or, of course, Exalytics. In any event, I’ll spend some time with my KPI Partners colleagues at the booth in Moscone South, @2315. Hope to see some of you there!
The BI Apps from Oracle give customers a nice head start on getting their BI environment up and running. Financial Analytics, HR Analytics, Supply Chain Analytics, etc. all come prebuilt with much of the code and reports you would otherwise have to build yourself to support your business. But for many customers, they are simply too slow for the user community when running dashboards, reports and ad-hoc queries. In an era where an internet search engine can give you what you want in one second, reports that run for a minute or more are just not acceptable.
In this post I’ll discuss some of the inherent performance limitations in the BI Apps and what you can do about them. Note that the vast majority of customers really don’t have a performance problem with their system, but you can always deliver reporting content faster: if your pages render in 15 seconds, wouldn’t 5 seconds be that much better? The real performance problems tend to appear at larger customers with larger data volumes. It is there that the BI Apps design can be enhanced with performance more firmly in mind.
I’ve written about OBI performance a few times in the past, and I’m sure there will be more to come. As a refresher, here are a few other posts to take a look at:
- Achieving Good Performance with OBIEE
- OBI Performance Preso
- Performance Tuning Financial Analytics
- Stitch Joins
An interesting topic came up recently on my latest project that I think is quite informative about how OBI works under the hood. It involves how to model degenerate dimensional fields in OBI when aggregates are present. Individually, each of these two topics is relatively well known, with established solutions, but when combined they can get pretty tricky. This is especially true when you are using the BI Apps, which to a large degree still rely on the old method of manually mapping Time Series (Ago & ToDate) metrics (as opposed to the OBI 10.x+ technique of Time Series functions), an approach that requires an explosion of Logical Table Sources (LTSs).
The core of this post concerns what to do when some of your aggregates don’t have all of the degenerate fields that your base fact has. What do you need to do to model them properly? Along the way I will shed a bit more light on how OBI “thinks,” so that you too can model things properly the first time.
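To make the contrast above concrete, here is a rough sketch of the two Time Series modeling styles. The table and column names are purely illustrative (not taken from an actual BI Apps RPD); the AGO and TODATE functions themselves are the built-in OBI logical SQL time series functions.

```sql
-- Newer approach: time series functions defined on a single base metric.
-- The BI Server rewrites these against the time dimension hierarchy,
-- so no additional Logical Table Sources are required.
AGO("Core"."Fact - Sales"."Revenue", "Date"."Month", 1)   -- Month Ago Revenue
TODATE("Core"."Fact - Sales"."Revenue", "Date"."Year")    -- Year To Date Revenue

-- Legacy BI Apps approach: each Ago/ToDate variant is instead mapped
-- manually to its own physical alias of the fact table, joined to the
-- date dimension on a shifted key (e.g. a "month ago" join). Every such
-- variant, at every aggregate level, means one more LTS on the logical
-- fact -- which is where the "explosion" of LTSs comes from.
```

The manual approach still works, but every degenerate field and every aggregate you add multiplies the number of LTSs you have to keep consistent, which is exactly the modeling problem the rest of this post digs into.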