The days when BI was seen as hugely expensive tools that were only accessible to a small number of people are gone. The use of common tools such as spreadsheets and access to query languages have made it easier for end users to gain access to large datasets and do their own BI analysis.
One of the challenges, however, is that data inside the organisation often needs to be supplemented by external data in order to get the greatest value from it. An example of this is taking the sales data from a company database and comparing it with geographical and census data, not only to see where certain items are sold but also to see whether there is a correlation between the population and the types of goods sold, insight that can then be applied more widely across the company.
There are a number of companies that work in this area and charge a lot of money for doing this sort of analysis. None of their tools are easy to use out of the box, and all require a fair bit of massaging to make them work.
At Microsoft TechEd in New Orleans, Amir Netz demonstrated new functionality inside Excel 2010 that has the potential to significantly change how many companies do BI.
Using a feature called PowerPivot, Netz connected to 11 different data sources, something that might not seem that difficult until you realise that these are not just local data sources but include SQL Azure (part of Microsoft’s Cloud platform) and data sources that use the Open Data Protocol (OData).
OData is a Microsoft initiative, introduced last year as part of the Project Dallas announcement.
During the demonstration, Netz announced that Netflix is making its entire DVD database available via OData. Using the OData interface, he not only pulled the entire Netflix database onto the local laptop but then demonstrated that, once all your data sources have been brought together inside PowerPivot, you can save the result into SharePoint.
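To give a rough sense of what consuming an OData feed involves, the sketch below builds a query URL using OData's standard $select, $filter and $top system options. The service root and entity names here are purely illustrative assumptions, not the actual Netflix catalogue endpoint:

```python
from urllib.parse import quote

def build_odata_query(service_root, entity_set,
                      select=None, filter_expr=None, top=None):
    """Build an OData query URL from the standard system query options.

    select      -- list of property names for $select
    filter_expr -- OData filter expression string for $filter
    top         -- maximum number of entries to return ($top)
    """
    parts = []
    if select:
        parts.append("$select=" + ",".join(select))
    if filter_expr:
        # Percent-encode spaces and other reserved characters in the expression
        parts.append("$filter=" + quote(filter_expr))
    if top is not None:
        parts.append("$top=" + str(top))
    url = service_root.rstrip("/") + "/" + entity_set
    return url + ("?" + "&".join(parts) if parts else "")

# Hypothetical feed and entity set, for illustration only.
url = build_odata_query(
    "https://example.com/odata/Catalog.svc",
    "Titles",
    select=["Name", "ReleaseYear"],
    filter_expr="ReleaseYear ge 2000",
    top=10,
)
print(url)
```

The resulting URL can be fetched with any HTTP client; the point of OData is precisely that the same query conventions work against any compliant feed, whether it lives on a local server or in SQL Azure.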
Once you save into SharePoint, the data is uploaded into SQL Analysis Services and becomes its own database. Users can then query that database with any reporting or analysis tools they want, and all of this happens relatively seamlessly.
To get the geographical data, Netz used the Bing SDK that has just been released.
In order to view the data, Microsoft will be releasing a new Silverlight control – Pivot Viewer – which adds a lot of extra rich graphical ways to display and present data.
There is no doubt that this is a significant step forward for power users who create BI databases and need to work with complex datasets from multiple locations. Using the OData interface to bring in external data is also very powerful, but there are some significant concerns here.
Once the user has created the BI set and saved it out through SharePoint into its own database, who owns the data and how is its integrity preserved? This is something Microsoft has not yet addressed.
There are also some significant issues from a commercial perspective over data access and data usage. For example, if I make my data available via OData and you then use it for analysis, how do I stop you from storing it locally and reusing it?
This is a real problem for any commercial organisation looking to provide limited-use licences for its data; without the ability to track and limit use, the commercialisation of large datasets could end up being heavily restricted.
Microsoft might be making it easier to pull data sources together, but it also needs to provide the tools that allow commercial data owners to better control how their data is used.