Third Party Data Refresh Tools and Negative Interaction with PowerPivot for SharePoint


Recently we have seen multiple cases where custom third-party tools or scripts are used to bypass the standard once-daily scheduled data refresh limit for PowerPivot workbooks. These cases came to our attention because the tools caused performance problems on SharePoint Web Front End (WFE) servers that were not accounted for in the design or implementation of the affected farms. I first want to note that we do not support these tools in any way and strongly recommend against using them at all. If you need to refresh your PowerPivot workbooks more than once a day, you likely need a different BI solution to display your data. Excel Services and PowerPivot for SharePoint are not meant to be a “live feed” of data.

In these cases, we found that the tools and scripts update the workbooks on a server or machine outside the farm and re-upload them to SharePoint. As in most PowerPivot deployments, the library where the documents were stored was a PowerPivot Gallery, since it provides the best user experience visually and functionally. Because the documents were stored in a gallery, the heavy front-end load came from the PowerPivot Gallery snapshots being triggered on the workbooks. Every time a user (or a program or script) updates or uploads a workbook to a gallery, the snapshot process is triggered. The process engages the workbook after upload, literally creates an image for every supported sheet in the workbook, and uploads those images to the metadata of the file.

Now, take this as an example: if I have a PowerPivot Gallery with fifty 30 MB workbooks in it, and I use my script to refresh all of them and then re-upload them to the site, not only did I just slam a single WFE with 1.5 GB of uploads, but I have potentially caused 50 getsnapshot processes to launch and start processing at the same time. Each of those processes reaches out to the site simultaneously to create image files and upload them to the document. You can also see how this can affect more than one WFE at the same time.
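The arithmetic in that example can be sketched quickly (the figures are just the ones used above; substitute your own workbook counts and sizes):

```python
# Back-of-the-envelope load from bulk re-uploading workbooks to a PowerPivot Gallery.
workbook_count = 50          # workbooks refreshed and re-uploaded in one pass
workbook_size_mb = 30        # size of each workbook, in MB

upload_mb = workbook_count * workbook_size_mb  # raw upload traffic hitting the WFE
snapshot_processes = workbook_count            # one gallery snapshot job per upload

print(f"Upload burst: {upload_mb} MB (~{upload_mb / 1024:.1f} GB)")
print(f"Snapshot processes launched at once: {snapshot_processes}")
```

And this counts only the upload side; each snapshot process then generates and uploads image files per sheet, multiplying the traffic further.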

The other difficult part is that a solution like this can be implemented by any user who has access to the farm. Administrators may not even know it exists until it is too late. Most of these update solutions can be run and triggered from a CLIENT machine; they do not require server or code-level access to function. They generally access the site via the same interfaces that other software (like Office, for instance) would use.

Lastly, if the PowerPivot Gallery is configured with versioning, frequent re-uploads can cause serious problems with version limits. You may find these documents quickly becoming unusable, or the excess versions causing corruption in your content database. They also create massive storage pressure on the database side when they get out of hand.

So, some things to think about:

  1. Do not use custom solutions to update your PowerPivot workbooks more than once per day. (It is not supported)
  2. If you do, do not store the workbooks in a PowerPivot Gallery.
  3. Do not configure versioning on the library where the workbooks are stored (or limit the versioning severely).
  4. If you need to refresh BI data more frequently, re-think your BI data strategy.

What can you do if you need data more frequently?

  1. You could “upsize” your data models. Effectively this involves moving a data model created in PowerPivot to Analysis Services. You can then use SQL Server Agent jobs to update the data as frequently as you wish. After that is configured, you could create Excel workbooks that connect to the new model and are set to “Refresh on Open”. This would effectively give you “fresh” data every time you open the workbook. Again, it would not be a live stream, but it would give you a supported method to retrieve data on a more frequent basis.

    Upsizing PowerPivot 2013 Workbooks to SSAS for Knowledge Workers

    • Note: This workaround may not make sense if you have very small workbooks.
  2. If the data source is SQL Server Analysis Services, you could simply use Excel rather than PowerPivot and again configure the workbook to “Refresh on Open”. This solution will likely put a little more load on the application servers as well as the Analysis Services data sources, but it is fully supported and a much more reliable method to retrieve data “on the fly”.
  3. Consider other products that integrate with SharePoint such as SQL Reporting Services or PerformancePoint to display the data.
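If you go the upsizing route (option 1), the recurring refresh is typically a SQL Server Agent job with a “SQL Server Analysis Services Command” step that runs an XMLA Process command against the upsized database. Here is a minimal sketch; the database ID below is a hypothetical example, not a value from this post:

```xml
<!-- Sketch of an XMLA full process of an upsized model; run on whatever
     schedule the Agent job defines. DatabaseID is a placeholder. -->
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Type>ProcessFull</Type>
  <Object>
    <DatabaseID>UpsizedSalesModel</DatabaseID>
  </Object>
</Process>
```

Scheduling the job every hour (or even every few minutes) keeps the model current without touching SharePoint at all; the workbook just reads the freshly processed model on open.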

Just to define what we mean when we say something “is not supported”: we at Microsoft will not provide support to fix custom solutions and/or scenarios that have been deemed “out of bounds” by our product teams. This does not mean that these solutions will not work, but if you run into a problem caused by one of them, we will not be able to assist beyond a best effort to get the farm back into a working state after the customizations have been removed and/or disabled.

If you question whether a solution you want to implement is supported, please contact support and we can take a look and let you know.


Comments (2)

  1. InBoise says:

    Rick – Could you post a link or more information on this process (see below)?

    1. You could “upsize” your data models. Effectively this involves moving a data model created in PowerPivot to Analysis Services.

    1. Hi InBoise,
      Here is a blog by one of our EE’s that has one process: https://blogs.technet.microsoft.com/excel_services__powerpivot_for_sharepoint_support_blog/2013/05/23/upsizing-powerpivot-2013-workbooks-to-ssas-for-knowledge-workers/

      This should work for models created with PowerPivot for Excel 2010/2013/2016.

      Also, technically speaking, you can pull your model out of an Excel file (it is literally stored as an .abf) and restore it to Analysis Services. The above process is much better because of the detailed import it does from the file. We usually only use the “manual” method of loading a model into SSAS as a testing mechanism.

      There may be other ways to accomplish this as well, so take a look around. This just happens to be one that is pretty easy to do and works well.
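      For reference, the “manual” method mentioned above boils down to restoring the extracted .abf with an XMLA Restore command. A minimal sketch, not a recommended production path; the file path and database name below are hypothetical placeholders:

      ```xml
      <!-- Hypothetical sketch: restore a model backup (.abf) extracted from a workbook.
           File and DatabaseName are placeholders, not values from this post. -->
      <Restore xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
        <File>\\backupserver\share\ExtractedModel.abf</File>
        <DatabaseName>RestoredWorkbookModel</DatabaseName>
        <AllowOverwrite>true</AllowOverwrite>
      </Restore>
      ```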
