Presenting QA at the BIWA Summit this week

I’ll be presenting at the Oracle BIWA Summit on Thursday of this week in Redwood Shores, CA, near Oracle HQ.  I’ll be discussing boring old QA.  The presentation is based on the QA articles I recently wrote (see below).  Not sure how this will go over, as this preso is less technical than a lot of my other content.  However, in my opinion it is very useful stuff to know – everyone needs to go through a QA cycle, right?

Anyway, stop by and bug me if you like.

 

Keep It Simple Stupid

I’ve been spending a lot of time recently working on performance tuning projects.  Sometimes the BI Apps are slow, sometimes it’s custom code, sometimes it’s a mix.  I’ve gotten the chance to see what works in both Oracle and SQL Server.

My conclusion about both of these databases is that they are like a cat or dog that gets fooled when you play hide the ball; they aren’t very smart sometimes.  The only way to truly ensure that database engines, even modern advanced ones, do things the right way is to make it as simple and easy for them to understand as possible.  I guess this is nothing new; the KISS principle comes from Kelly Johnson of the Lockheed Skunk Works, the team that created the SR-71 Blackbird spy plane.  I think he knew a few things about complexity in systems and how they tend to break or become difficult to maintain.  (BTW, more on that topic as it pertains to OBI and the BI Apps at a later date.)

As I’ve been performance tuning many reports and queries recently, I find that a lot of time is spent trying to get the database to do the smart thing.  Too much time, in fact.  Usually the problem comes down to some small piece of non-simple SQL.  In more unusual cases I’ve seen something on one table completely break down the query plan, even something that should be trivial and innocuous.

<Vent>For example, not being able to use an index on a table with 2,000 records should not radically alter the query plan, but in fact it will do that to you.  After you spend hours upon hours with it, after you’ve called up the DBA for help to dig into the extreme nitty-gritty details of the query plan, you make a change and it works for that one query but not any others.  Then you decide to write this article because you’ve spent 10 hours on something so minuscule that in the end it doesn’t even work consistently.  If only the query had been clean to begin with…</Vent>
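
To make it concrete, here is the kind of small, non-simple predicate I’m talking about.  The table and column names below are invented just for illustration; the point is that wrapping an indexed column in a function is enough to make the optimizer ignore the index and rethink the whole plan:

    -- Hypothetical example: W_STATUS_D is a tiny lookup table with an index on STATUS_CODE.
    -- The function around the indexed column makes the predicate non-sargable, so the
    -- index is ignored and the plan for the whole join can change dramatically.
    SELECT f.revenue, s.status_name
    FROM   w_sales_f  f
    JOIN   w_status_d s ON f.status_wid = s.row_wid
    WHERE  UPPER(s.status_code) = 'OPEN';

    -- The simple version of the same predicate lets the optimizer use the index
    -- (or at least pick a clean, cheap join) as intended.
    SELECT f.revenue, s.status_name
    FROM   w_sales_f  f
    JOIN   w_status_d s ON f.status_wid = s.row_wid
    WHERE  s.status_code = 'Open';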

In this brief article I’m just going to lay out a few things to consider to help make your system simpler for the database engine to understand, so it can do a better, faster job of answering a query.

Read the rest of this entry

Tuning Time Series Metrics

In this post I am going to explore some performance issues related to OBI’s time series functions.  Released back in OBI 10g, the ToDate() and Ago() functions brought a significant improvement to the process of easily creating a variety of time series metrics.  In older versions of Siebel Analytics, creating time series was a very manual effort involving a lot of aliases and special joins that could at time become a little confusing to the developer.  They did have a wizard called the Time Series Wizard to assist, but if you are like me you never use wizards J.  The Time series functions however solved that; using them is a piece of cake, requiring only a minor enhancement to the Date dimension.
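
As a quick sketch of what these functions look like in a logical column (the subject area and column names below are invented for illustration), an Ago metric shifts the base measure back along a level of the time hierarchy, and a ToDate metric accumulates it from the start of that level:

    Revenue Month Ago = Ago("Core"."Fact - Sales"."Revenue", "Core"."Dim - Date"."Month", 1)
    Revenue MTD       = ToDate("Core"."Fact - Sales"."Revenue", "Core"."Dim - Date"."Month")

The only setup needed, the minor enhancement to the Date dimension mentioned above, is a proper time hierarchy with chronological keys defined on its levels.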

All is rosy with the world then, correct?  Well, not so fast.  The reality is that these functions do some very strange things behind the scenes in order to work properly.  So strange, in fact, that the database engine typically has some difficulty figuring out what to do.  One thing I’ve learned over the years when it comes to database engine performance – keep it simple if you want it to run fast.

As it turns out, the strange things OBI does for the time series functions cause a decent performance hit compared with the old technique.  This short post will discuss this in more depth. Read the rest of this entry

QA for OBI (Part 2 – The Tests)

In the last post I discussed some general, overall topics to prepare you for a better QA cycle.  In this post I’d like to get into the details a bit more to discuss some of the actual tests you can perform across both ETL and OBI layers.  In particular I’m going to focus on data testing for ETL and OBI.
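
As one simple example of the kind of data test I mean (the table names here are hypothetical), a basic reconciliation query compares a row count and a total from the source or staging side against the same numbers in the warehouse fact; any variance becomes a defect to trace back through the ETL:

    -- Hypothetical reconciliation test: the two rows returned should show
    -- matching counts and totals for the same slice of data.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(line_amount) AS total_amt
    FROM   stg_invoice_lines
    WHERE  invoice_date >= DATE '2013-01-01'
    UNION ALL
    SELECT 'WAREHOUSE', COUNT(*), SUM(invoice_amount)
    FROM   w_invoice_ln_f
    WHERE  invoice_dt >= DATE '2013-01-01';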

Read the rest of this entry

QA for OBI (Part 1 – Planning)

It seems that every client and every project (I think I’m in the 40s now) has a different way of QA’ing their OBI system.  This holds true whether you are doing a BI Apps deployment or simply a new deployment of a custom solution.  This post will set the foundation for the discussion of what needs to be QA’d and how to break the multi-layered solution into more manageable units.

I get the feeling sometimes that my customers expect QA to be something perfectly pre-canned that I can simply show up with, fill in the cells in a spreadsheet, and off we go.  The reality is that each project is by nature different, starting from a different place, executed in a different manner, and with different people involved.  It follows that some of the specifics of the QA cycle will be different as well.

Read the rest of this entry

That Wonderful Time of Year – Oracle Open World

The fall means lots of things to lots of people.  To me it means baseball playoffs, football and Oracle Open World.  I’m not presenting this year – I was just too swamped on a long project to put anything together, so I’ll just enjoy some of the sessions.  I’m going to be heavy on the performance stuff, whether it’s database features, Exadata or, of course, Exalytics.  In any event, I’ll spend some time with my KPI Partners colleagues at the booth in Moscone South, @2315.  Hope to see some of you there!

 

 

Build a Performance Layer to Make the BI Apps Fly

The BI Apps from Oracle present customers with a nice head start to getting their BI environment up and running.  Financial Analytics, HR Analytics, Supply Chain Analytics, etc. all come pre-built with much of the code and many of the reports you would otherwise need to build to support your business.  But for many customers they are just too slow for the user community when running dashboards, reports and ad-hoc queries.  In an era where an internet search engine can give you what you want in one second, reports running for a minute or more are just not acceptable.

In this post I’ll discuss some of the inherent performance limitations in the BI Apps and what you can do about them.  Note that the vast majority of customers really don’t have a performance problem with their system, but you can always deliver reporting content faster.  If you are running at 15 seconds per page, wouldn’t 5 seconds be that much better?  The real performance problems lie with some large customers with larger data volumes.  It is here that the BI Apps design can be enhanced with performance in mind.
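
To give a feel for what I mean by a performance layer (the table and column names below are invented, not actual BI Apps objects), the basic move is to roll a detail-grain fact up to the grain the dashboards actually ask for and then map the summary into the RPD as another Logical Table Source:

    -- Hypothetical performance-layer aggregate: summarize a detail fact to
    -- month / product line so dashboard queries scan far fewer rows.
    CREATE TABLE w_sales_mth_a AS
    SELECT d.month_wid,
           p.prod_line_wid,
           SUM(f.revenue)  AS revenue,
           SUM(f.quantity) AS quantity
    FROM   w_sales_f   f
    JOIN   w_day_d     d ON f.order_date_wid = d.row_wid
    JOIN   w_product_d p ON f.product_wid    = p.row_wid
    GROUP  BY d.month_wid, p.prod_line_wid;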

I’ve written about OBI performance a few times in the past, and I’m sure there will be more to come.  As a refresher, here are a few other posts to take a look at:

Read the rest of this entry

Degenerate Dimensions and Aggregates

An interesting topic came up recently on my latest project that I think is quite informative about how OBI works under the hood.  It involves how to model degenerate dimension fields in OBI when aggregates are present.  Individually, each of these two topics is relatively well known, with established solutions for handling them, but when combined they can get pretty tricky.  This is especially true when you are using the BI Apps, which to a large degree still rely on the old method of manually mapping time series (Ago & ToDate) metrics (as opposed to the OBI 10.x+ technique of time series functions), a method that requires an explosion of Logical Table Sources (LTSs).

The core of this post concerns what to do when some of your aggregates don’t have all of the degenerate fields that your base fact has.  How do you model them properly?  Along the way I will shed a bit more light on how OBI “thinks” so that you too can model things properly the first time.
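
To make the problem concrete before diving in, picture a hypothetical pair of tables (names invented for illustration): the base fact carries a degenerate field such as the order number, while the monthly aggregate, being summarized above the order grain, cannot:

    -- Base fact: ORDER_NUMBER is a degenerate dimension field stored on the fact itself.
    CREATE TABLE w_sales_f (
      row_wid        NUMBER,
      order_date_wid NUMBER,
      product_wid    NUMBER,
      order_number   VARCHAR2(30),  -- degenerate field, exists only at this grain
      revenue        NUMBER
    );

    -- Monthly aggregate: rolled up above the order grain, so ORDER_NUMBER is gone.
    -- The RPD model has to ensure queries asking for Order Number never route to
    -- this source, while queries that don't still get the fast aggregate.
    CREATE TABLE w_sales_mth_a (
      month_wid   NUMBER,
      product_wid NUMBER,
      revenue     NUMBER
    );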

Read the rest of this entry

Big Data for Oracle BI

The term Big Data has existed in some form or another for years but recently has taken on a new and more official meaning.  In today’s world of massive internet applications, digital instruments streaming non-stop data, scientific data collection and fraud detection, Big Data has grown far beyond what even a large company used to consider large – into the hundreds of terabytes or even petabytes.  Furthermore, Big Data has a large unstructured component to it, whether comments on websites, blog data, internet usage, images or documents.  This kind of information typically does not map well to traditional database technologies which rely on a very structured table/column arrangement.

Considering the high volume and great variability of the data, along with the very high uptime and extremely short response times needed, traditional RDBMSs simply won’t work – they cannot scale out to provide one-second response times when a Facebook user posts a picture or visits a friend’s wall while millions of other users are looking at petabytes of data.  Thus, completely different kinds of data access and storage technologies are needed, ones designed to scale far beyond even a very powerful system such as Oracle Exadata.

This article discusses Oracle’s view of Big Data and in particular how it pertains to Data Warehousing and Business Intelligence.  Keep in mind there are many offerings and capabilities pertaining to the acquisition and use of Big Data which are well beyond the scope of Data Warehousing and BI systems; I’m going to focus on just a slice of it here.

Read the rest of this entry

OOW MetaData Presentation Sunday @ 3pm

As part of the Sunday ODTUG OBI day, I’m presenting on Advanced Metadata topics.  This is the same presentation I gave at the Rittman-Mead forum in April of this year.  It’s useful if you’ve been working on OBI for a while and are starting to feel comfortable with modeling the basics.  I’ll cover the following topics:

• Combo Tables
• Make Dims & Facts Reach
• Non-Conformed Dimensions
• Canonical Time
• Selecting a Logical Table Source
• Controlling Join Paths
I cover both OBI 10g and 11g.  Come on out and stop by Moscone West, room 2011 at 3pm!
Update:  Here is the preso.  Enjoy!
Jeff