SSAS: Using fake dimension and scopes for dynamic ranges
In one of my BI projects I needed to find the count of objects in each income range. The usual solution with a range dimension was useless because the range an object belongs to changes over time. These ranges depend on a calculation done over the incomes measure, so I really had no option to use a classic solution. Thanks to the SSAS forums my problem got solved, and here is the solution.
The problem – how to create dynamic ranges?
I have two dimensions in the SSAS cube: one for invoices related to object rents and the other for objects. There is a measure that sums invoice totals, and there are two calculations. The first calculation performs some computations based on object income and some other object attributes. The second calculation uses the first one to define the income range an object belongs to.
What I need is a query that returns how many objects there are in each group.
I cannot use a dimension for the range because on one date an object may belong to one income range and two days later to another. For example, if an object is not rented out for two days it makes no money and its income stays the same as before. If the object is rented out after those two days it makes some income, and this income may move it to another income range.
Solution – a fake dimension and scopes
Thanks to Gerhard Brueckl from pmOne I got everything working fine after some struggling with BI Studio. The original discussion he pointed me to can be found in the SSAS official forums thread Create a banding dimension that groups by a calculated measure.
The solution is pretty simple in nature: we define a fake dimension for our ranges and use scopes to assign values to the object count measure.
The object count measure is primitive: it just counts objects, and that's it. We will use it to find out how many objects belong to one range or another.
We also need a table for the fake ranges, and we have to fill it with the ranges used in the ranges calculation. After creating the table and filling it with ranges, we can add the fake range dimension to our cube.
Let’s see now how to solve the problem step-by-step.
Solving the problem
Suppose you have a ranges calculation defined like this:
CASE
    WHEN [Measures].[ComplexCalc] < 0 THEN 'Below 0'
    WHEN [Measures].[ComplexCalc] >= 0 AND
         [Measures].[ComplexCalc] <= 50 THEN '0 - 50'
    // ... more ranges as needed ...
END

Let's now create a new table in our analysis database and name it FakeIncomeRange. Here is the definition for the table:

CREATE TABLE [FakeIncomeRange]
(
    [range_id] [int] IDENTITY(1,1) NOT NULL,
    [range_name] [nvarchar](50) NOT NULL,
    CONSTRAINT [pk_fake_income_range] PRIMARY KEY CLUSTERED ([range_id] ASC)
)
Don't forget to fill this table with the range labels you are using in the ranges calculation.
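With the two labels from the calculation above, the fill could look like this (a sketch; add a row for every label your calculation returns, spelled exactly as the calculation spells it):

INSERT INTO [FakeIncomeRange] ([range_name]) VALUES ('Below 0')
INSERT INTO [FakeIncomeRange] ([range_name]) VALUES ('0 - 50')

The labels must match the calculation's strings character for character, because the scopes later bind to them by member name.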
To use the ranges from the table, we have to add the table to our data source view and create a new dimension from it. We don't bind this table to the other tables; we leave it unrelated, just as it is. Our dimension has two attributes: ID and Name.
The next thing to create is the calculation that returns the object count. This calculation is also a fake one, because we will override its values for all ranges later.
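The object count measure can be sketched in MDX like this. The [Object] dimension and attribute names here are placeholders, not names from the original cube; adjust them to your own object dimension:

CREATE MEMBER CURRENTCUBE.[Measures].[ObjectCount]
    AS Count(Existing [Object].[Object].[Object].Members),
    VISIBLE = 1

The value itself does not matter much outside the scoped ranges, since every range member gets its value overridden by a scope below.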
Now comes the most crucial part of our solution: defining the scopes. Based on the data used in this post, we have to define a scope for each of our ranges. Here is the example for the first range:
// the [Object] hierarchy below is a placeholder for your object dimension
SCOPE([FakeIncomeRange].[Name].&[Below 0], [Measures].[ObjectCount])
    This = Count(Filter([Object].[Object].[Object].Members,
                        [Measures].[ComplexCalc] < 0))
END SCOPE
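The scope for the '0 - 50' range follows the same pattern; only the member name and the filter condition change (again, the [Object] hierarchy is a placeholder for your object dimension):

SCOPE([FakeIncomeRange].[Name].&[0 - 50], [Measures].[ObjectCount])
    This = Count(Filter([Object].[Object].[Object].Members,
                        [Measures].[ComplexCalc] >= 0 AND
                        [Measures].[ComplexCalc] <= 50))
END SCOPE

Repeat this for every row you inserted into FakeIncomeRange, keeping each scope's condition identical to the matching branch of the ranges calculation.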
To get these scopes defined in the cube, we need a separate MDX script block for each part given here. Take a look at the screenshot to get a better idea of what I mean.
This example is taken from SQL Server Books Online to avoid conflicts with an NDA. :)
From the previous example, the lines (MDX script blocks) are:
- the line starting with SCOPE
- the block for This =
- the line with END SCOPE
And now it is time to deploy and process our cube. Although you may see examples with semicolons at the end of statements, you don't need them here: the Visual Studio BI tools generate a separate command from each script block, so you don't need to worry about it.
Published at DZone with permission of Gunnar Peipman, DZone MVB. See the original article here.