Category Archives: Analysis services 2005

Analysis services -> Clear Cache

Just a short note about analysis services and its cache 🙂

One of the reasons for Analysis Services' fast response times is its cache. But what if you want to test a query without using the cache? Well, just use this XMLA script:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
 <ClearCache>
  <Object>
   <DatabaseID>Your database ID goes here</DatabaseID>
   <CubeID>Your cube ID goes here</CubeID>
  </Object>
 </ClearCache>
</Batch>

Also, if you do partial processing, the cache can in some cases become invalid. Therefore, after a partial processing, you also want to run the above XMLA script.
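If you want to clear the cache in the same round trip as the processing itself, XMLA lets you put several commands in one Batch. Below is a sketch combining a ProcessUpdate of a dimension with a ClearCache; the database, dimension, and cube IDs are placeholders you would replace with your own:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
 <Process>
  <Object>
   <DatabaseID>Your database ID goes here</DatabaseID>
   <DimensionID>Your dimension ID goes here</DimensionID>
  </Object>
  <Type>ProcessUpdate</Type>
 </Process>
 <ClearCache>
  <Object>
   <DatabaseID>Your database ID goes here</DatabaseID>
   <CubeID>Your cube ID goes here</CubeID>
  </Object>
 </ClearCache>
</Batch>

You can run this from an XMLA query window in SQL Server Management Studio, or schedule it as an Analysis Services command step in a SQL Server Agent job.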

Why I don’t think powerpivot will be a success

I've just returned from the SharePoint Conference 2009, and I'm really excited about all the new BI stuff.

One of the things I'm really excited about is PowerPivot. As you might have guessed from the title, I'm going to elaborate on why I don't think PowerPivot will be a success, in the sense that I don't think it will be a fit for massive rollout to big organizations. I hate to do that before I have described what it is and what I think is great about it (a lot of other people are doing that, though), but here it comes:

There's a lot of love for PowerPivot in the MS BI and SharePoint communities around the world.
Just to challenge that a little bit 🙂

Sure, it is nice that you can pull more than 100 million rows from various source systems like SQL Server, cubes, flat files, a data set from a Reporting Services report, a table in an HTML page, an RSS feed, or even another PowerPivot; relate these data to each other; and do advanced analysis with performance matching what you would normally get from a cube running on Analysis Services. Well, it is more than nice. It is amazing in a crazy kind of way, but…

1) More than one version of the truth: If PowerPivot becomes a success, a lot of people will be publishing PowerPivots to different places on the intranet, making new calculations based on data from the relational DWH, cubes, calculations from other PowerPivots, calculations made in different Reporting Services reports, etc. This is obviously going to result in having much more than one version of the truth.

And what about the scenario where a person copies the PowerPivot from one place to another (the security is based on SharePoint, so he might have to do that so another group can see it), then makes a change to one of the versions but forgets the other?

2) Performance 1: The reason PowerPivot can do advanced analytics on your data with such good performance is that it holds all the data in memory (the example from MS was 100+ million rows, and the size of the data was around 100-200 MB). That is nice on your local PC, when you are handling one PowerPivot.

When you want to share it, you have to publish it to the SharePoint server. If PowerPivot becomes a success, many people will do that. Then the server has to hold many PowerPivots in memory, thus infuriating a lot of IT administrators.

3) Performance 2: When you publish your PowerPivot to SharePoint, your data is not going to get updated automatically. Everyone can click Refresh Data on the PowerPivot, resulting in the PowerPivot pulling all 100+ million rows from the different source systems, linking the data together, recalculating all the calculations, etc. Again, I don't think that is the way to make friends with the IT department. I guess there will be some way to set this up as a schedule in SharePoint, but again, if PowerPivot becomes a success, there will be a lot of PowerPivots in SharePoint; the IT department may not know which source systems the PowerPivots are pulling data from, and many of the PowerPivots will pull data from the same source.

4) Usability: From the demos I have seen, I still think PowerPivot is too difficult for business users. I don't want to elaborate on that, since this post is already way too long.

I feel like I could go on about things that worry me regarding PowerPivot, but I will stop now.

All this being said, I still think PowerPivot is a fantastic tool, but I think it is a bad idea to roll it out for ad hoc use to the average business user.

Errors in the OLAP storage engine: The metadata for the statically linked measure group, with the name of XXX, cannot be verified against the source object

I've been working a little with MS Analysis Services lately. Yesterday I came across this error message:

"Errors in the OLAP storage engine: The metadata for the statically linked measure group, with the name of XXX, cannot be verified against the source object."

The error message is self-explanatory. I did indeed have more than one cube, and some of the cubes were linking to another. I had, however, been running with this setup for a long time and never seen this problem before.

After some googling and some trial and error, I found a solution to the problem. It is not pretty, but there was nothing else to do 🙁

  • Delete the Analysis Services database
  • Deploy the cubes without processing
  • Process the cube(s) that are being linked to
  • Process the whole database
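The last two steps can also be scripted. Below is a sketch of an XMLA batch that fully processes the cube being linked to first, and then the whole database; the IDs are placeholders for your own database and cube:

<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
 <Process>
  <Object>
   <DatabaseID>Your database ID goes here</DatabaseID>
   <CubeID>ID of the cube being linked to goes here</CubeID>
  </Object>
  <Type>ProcessFull</Type>
 </Process>
 <Process>
  <Object>
   <DatabaseID>Your database ID goes here</DatabaseID>
  </Object>
  <Type>ProcessFull</Type>
 </Process>
</Batch>

The delete and deploy steps themselves are done from Management Studio and the Deployment Wizard (or BIDS), so only the processing part is captured here.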

As I said before, it's not a pretty solution, and I must admit I'm a little curious about what will happen next time I have to deploy the cube.