Dimensional Model Vs Views as Structure for BOBJ Universe
I have a fairly complex data mart: five fact tables at the moment with several more coming, four dimensions shared across all five facts, 26 dimensions in total, and several bridge tables. I followed standard best practice in building the Business Objects universe (one context per fact table).
I've structured the universe so that in WebI users can choose objects from the class containing all the shared dimensions plus any one other class without being prompted. If they want objects from any of the other classes, they need to add a second query and then create detail variables, which they can drag into the report.
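To make that workflow concrete, here's a rough pandas sketch of what the merged result ends up looking like (just an illustration, not what WebI actually runs; the table and column names are invented):

```python
import pandas as pd

# Hypothetical results of two separate WebI queries, one per fact/context,
# each already aggregated to the grain of a shared dimension (region).
sales = pd.DataFrame({"region": ["East", "West"], "sales_amt": [100, 200]})
returns = pd.DataFrame({"region": ["East", "West"], "return_amt": [10, 5]})

# Merging on the shared dimension keeps each measure at its own grain,
# which is roughly what merged dimensions / detail variables give you.
merged = sales.merge(returns, on="region", how="outer")
print(merged)
```

Each measure stays correct because it was aggregated in its own query before the merge.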
The client has an aversion to contexts (they don't believe their users will adopt the data mart if they're prompted about contexts or have to create detail objects) and is pressing for a simplified solution. They want to create views joining every dimension to every fact, and to join the views to each other on the shared dimensions. Setting aside the potential performance issue, which they consider inconsequential given the projected data mart size of around 4 GB, I'm looking for any other potential problems with this approach.
I saw this post which is similar to my situation; I guess my concern is more about accuracy than performance.
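To show why accuracy worries me, here's a toy pandas sketch (invented data and names; the real facts are at a much finer grain) of what a direct fact-to-fact join through a shared dimension can do to the measures:

```python
import pandas as pd

# Hypothetical row-level facts at different grains sharing a region dimension.
sales = pd.DataFrame({"region": ["East", "East", "West"],
                      "sales_amt": [50, 50, 200]})
returns = pd.DataFrame({"region": ["East", "East", "East", "West"],
                        "return_amt": [5, 3, 2, 5]})

# Joining the two facts directly on the shared dimension (which is what a
# single view chaining fact to fact effectively does) multiplies rows...
joined = sales.merge(returns, on="region")

# ...so the aggregated measures are inflated:
print(joined.groupby("region")[["sales_amt", "return_amt"]].sum())
# East sales_amt comes back as 300 instead of 100, return_amt as 20 instead of 10.
```

As far as I can tell, that row multiplication is exactly the chasm/fan-trap behavior that contexts are meant to prevent, and views that join everything together can reproduce it silently.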