operational complex logic requirements
Hi,
In one of our BI projects there are extensive, complex requirements for heavy operational calculations and algorithms as part of the ETL process.
The project is based on a manufacturing system that was developed in-house and has no real reporting functionality of its own.
The ETL tool we are using is Informatica and the reporting tool is QlikView.
My question is: what would be the general recommendation for implementing complex logic and algorithms? My major concern is that by writing such complex SQL scripts and functions, the project will be exposed to failures and problematic future maintenance.
guyka- Posts : 4
Join date : 2010-01-10
Age : 55
Location : Tel Aviv
Re: operational complex logic requirements
I feel your pain. The best advice I can give you is make the risks clear, and document things as best you can.
Re: operational complex logic requirements
You may be tempted to build some custom, SQL-based measure calculation engine. Personally, I would diminish the SQL usage in the ETL implementation to the absolute minimum. You could consider architecting a metadata-driven BI solution. Informatica offers some capabilities for designing flexible and reusable components, such as mapplets or the Visio Mapping Architect.
If you do decide to implement a custom measure calculation engine, then ensure that it becomes a standard, reusable building block for all applications on your data warehouse platform.
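To make the metadata-driven idea concrete, here is a minimal sketch in Python. All measure names, field names, and formulas below are hypothetical examples; the point is that calculation definitions live in data, so new measures can be added without changing the engine itself.

```python
# Minimal sketch of a metadata-driven measure calculation engine.
# All measure/field names and formulas are hypothetical examples.

measure_definitions = {
    # measure name -> formula over row fields; kept as data so new
    # measures can be registered without touching the engine code
    "scrap_rate": lambda row: row["scrap_qty"] / row["produced_qty"],
    "yield_pct": lambda row: 100.0 * row["good_qty"] / row["produced_qty"],
}

def calculate_measures(rows, definitions):
    """Apply every registered measure to each input row."""
    results = []
    for row in rows:
        out = dict(row)  # keep the source fields alongside the measures
        for name, formula in definitions.items():
            out[name] = formula(row)
        results.append(out)
    return results

rows = [{"produced_qty": 200, "good_qty": 190, "scrap_qty": 10}]
print(calculate_measures(rows, measure_definitions))
```

In an Informatica setting the same separation can be achieved with reusable mapplets driven by parameter files, rather than hand-coding each calculation into individual mappings.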
marcin.wizgird- Posts : 4
Join date : 2013-05-07
Re: operational complex logic requirements
I would try to put it in the ETL. Any heavy lifting done by the BI tool just ends up slowing down the reports. Plus you do it every time a report is executed.
BoxesAndLines- Posts : 1212
Join date : 2009-02-03
Location : USA
Re: operational complex logic requirements
I agree with BoxesAndLines - the ETL layer is the best place for this. As well as SQL, you can use the transformation functionality of your ETL tool. It is also well worthwhile to materialize the calculation results as columns in tables. This speeds query performance and helps testing and review.
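The materialization point can be sketched briefly. This uses Python's built-in sqlite3 purely for illustration; the table and column names are made up:

```python
import sqlite3

# Sketch: persist a calculated measure as a real column during ETL,
# so the BI tool only reads it (table/column names are made up).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fact_production (produced_qty REAL, good_qty REAL, yield_pct REAL)"
)
conn.execute(
    "INSERT INTO fact_production (produced_qty, good_qty) VALUES (200, 190)"
)

# Materialize the calculation once, at load time...
conn.execute(
    "UPDATE fact_production SET yield_pct = 100.0 * good_qty / produced_qty"
)

# ...so every report is a plain column read, and the stored value
# can be inspected directly when testing or reviewing results.
print(conn.execute("SELECT yield_pct FROM fact_production").fetchall())
```

Because the result is stored, testers can query the column directly and reconcile it against the source quantities, instead of re-deriving the number inside the BI tool.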
Trying to do this in the BI tool results in a "black box" which hinders testing. In real world scenarios you often find the calculation requirements are impossible or very difficult to achieve in BI tools alone (sorry vendors).
Re: operational complex logic requirements
Having been through a similar scenario, I agree with all the comments thus far.
Keep a few things in mind when designing your ETL solution.
1. The calculations will change over time. Try to minimize the effort required to accommodate the changes.
2. The more complex the calculation, the greater the chance of errors or perceived errors. Your solution needs to be transparent and easy to debug. As MikeH suggests, keeping the intermediate results stored in tables will greatly aid any analysis.
3. Keep well-defined metadata about the calculations. Most consumers of the data will not be intimate with the calculations, so you need to present the calculations to them in a clear format (e.g., you can't send end users a SQL script).
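One way to keep that kind of metadata is a small calculation catalog alongside the ETL, holding a plain-language definition and version for each measure. A minimal sketch, with illustrative field names and content only:

```python
# Sketch of keeping human-readable metadata alongside each calculation,
# so consumers see a plain-language definition rather than a SQL script.
# Field names and content are illustrative only.

calculation_catalog = [
    {
        "name": "yield_pct",
        "description": "Good units as a percentage of total units produced.",
        "formula_text": "100 * good_qty / produced_qty",
        "version": 2,
        "effective_from": "2013-01-01",
    },
]

def describe(name, catalog):
    """Return the plain-language definition of a measure, for end users."""
    for entry in catalog:
        if entry["name"] == name:
            return f'{entry["name"]} (v{entry["version"]}): {entry["description"]}'
    return None

print(describe("yield_pct", calculation_catalog))
```

Versioning each definition also addresses point 1: when a calculation changes, the catalog records when the new definition took effect, which helps explain differences between old and new report runs.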
LAndrews- Posts : 132
Join date : 2010-05-13
Location : British Columbia, Canada
Similar topics
» Logic behind Top Down and Bottom Up approach
» Where to implement SCD type 2 logic?
» Business Logic: DWH vs. Source system
» Hot Swappable Dimension -> best practice for implementation of swapping logic?
» Help with Complex One-to-Many relationships