Monday, December 31, 2012

PowerPivot-SQL Server Data Import...

I would like to take this opportunity to wish all of my blog readers/visitors a very happy and prosperous new year 2013. In the coming year I am sure there will be many paradigm shifts, new methodologies, and wider adoption of big data analytics. I hope it is a good, prosperous year for all of us. In this blog post I would like to write about PowerPivot, more specifically how to import data from SQL Server. The PowerPivot download is available at the following website:
http://www.microsoft.com/en-us/bi/powerpivot.aspx. The link also has a lot of demonstration videos on the use of PowerPivot.
In order to install PowerPivot, one needs the Excel 2010 add-in; if one has Excel 2010 as part of Office 2010, that should be sufficient as well. Once PowerPivot is installed and Excel 2010 is opened, there is a PowerPivot option at the end of the ribbon. When the PowerPivot option is chosen (this needs to be done in order to work with PowerPivot), one can see the screen captured below.

[Screenshot: the PowerPivot tab in the Excel 2010 ribbon]

Once the above screen appears, the user needs to click on the PowerPivot Window icon, which brings up the next screen. In that screen, under the From Database option, choose From SQL Server. This allows one to connect to a SQL Server database and choose the tables that need to be used for PowerPivot reports.

[Screenshot: the Table Import Wizard connecting to SQL Server]

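For those who prefer to script against the same database outside of Excel, here is a minimal Python sketch using pyodbc that lists the tables the Table Import Wizard would offer for import. The server and database names (localhost, AdventureWorks) are placeholders, not values from this post:

import pyodbc

# Placeholder connection details -- the PowerPivot "From SQL Server"
# wizard collects these same values (server, database, authentication).
conn = pyodbc.connect(
    "DRIVER={SQL Server};"
    "SERVER=localhost;"          # hypothetical server name
    "DATABASE=AdventureWorks;"   # hypothetical database name
    "Trusted_Connection=yes;"    # Windows authentication, the wizard default
)

cursor = conn.cursor()
# List the user tables that the Table Import Wizard would display.
cursor.execute(
    "SELECT TABLE_SCHEMA, TABLE_NAME "
    "FROM INFORMATION_SCHEMA.TABLES "
    "WHERE TABLE_TYPE = 'BASE TABLE' "
    "ORDER BY TABLE_SCHEMA, TABLE_NAME"
)
for schema, table in cursor.fetchall():
    print(schema + "." + table)

conn.close()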
Customer stories:
http://www.microsoft.com/en-us/bi/CustomerStories.aspx

Wednesday, December 12, 2012

In Memory Analytics...

In the past couple of years there has been a lot of growth in the different types of analytics being performed at various organizations. Companies are under tremendous pressure to be agile and bring their products and services to market quickly, and this has created a lot of exciting requirements for the BI analytics domain to fulfill. In the SQL Server space, Microsoft first introduced in-memory analytics through PowerPivot. Now with SQL Server 2012 we have a new type of analytics called the Tabular Model, which uses in-memory analytic capabilities; this is different from the traditional SSAS multidimensional model. These days end users/business users have been provided with tools to harness the power of in-memory analytics.

The in-memory technologies available in SQL Server 2012 are xVelocity Analytics, xVelocity Column Store, and PowerPivot. Quoting the Microsoft website (www.microsoft.com): "xVelocity is a family of memory-optimized technologies spanning Analytical, Transactional, Streaming and Caching data. Designed for industry standard hardware, xVelocity is a built-in capability in SQL Server 2012, SQL Server Parallel Data Warehouse, Office Excel 2013 and Windows Azure." Please read the blog post at the link provided below about a very fast in-memory analytics project from Microsoft, codenamed Hekaton, which is still in private customer preview.
http://blogs.technet.com/b/dataplatforminsider/archive/2012/12/11/how-fast-is-project-codenamed-hekaton-it-s-wicked-fast.aspx
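To make the column store idea concrete, below is a rough Python/pyodbc sketch (all server, database, table, and column names are placeholders, not from this post) that creates a SQL Server 2012 nonclustered columnstore index. Note that in SQL Server 2012 a table becomes read-only while such an index exists on it:

import pyodbc

# Placeholder server and database names; assumes SQL Server 2012 or later.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;"
    "DATABASE=SalesDW;Trusted_Connection=yes;",
    autocommit=True,
)
cursor = conn.cursor()

# xVelocity stores each indexed column separately and heavily compressed,
# which is what accelerates scan-heavy analytic queries over fact tables.
# FactSales and its columns are hypothetical names for illustration.
cursor.execute("""
    CREATE NONCLUSTERED COLUMNSTORE INDEX csi_FactSales
    ON dbo.FactSales (OrderDateKey, ProductKey, SalesAmount, OrderQuantity)
""")

conn.close()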




Tuesday, December 4, 2012

Data Abstraction-Data Integration

In my earlier blog post I discussed data integration and how it can provide a unified view of disparate data sources. In today's world of data warehousing/data analytics it is becoming increasingly common to have disparate data sources, and there are lots of ETL projects aimed at consolidating the data into data marts and data warehouses. In today's blog post I would like to introduce a product from Composite Software which helps with data integration and provides data abstraction. It helps the business perform data virtualization. This provides the business with the following benefits:

• Simplify information access – Bridge business and IT terminology and technology so both can succeed.
• Common business view of the data – Gain agility, efficiency, and reuse across applications via an enterprise information model or “Canonical” model.
• More accurate data – Consistently apply data quality and validation rules across all data sources.
• More secure data – Consistently apply data security rules across all data sources and consumers via a unified security framework.
• End-to-end control – Use a data virtualization platform to consistently manage data access and delivery across multiple sources and consumers.
• Business and IT change insulation – Insulate consuming applications from changes in the source and vice versa. Business users and application developers work with a more stable view of the data. IT can make ongoing changes to and relocations of physical data sources without impacting information users.

Please use the following link to learn more about the software:
http://www.compositesw.com/data-virtualization/data-abstraction/

Developers can work with Composite Studio to develop views that are sourced from different data sources. In a nutshell, Composite Software is a good tool for performing data integration and data abstraction. A conceptual sketch of such a virtual view follows below.
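The sketch below is only a conceptual illustration in Python/pandas of what a virtual view does, not Composite's actual tooling (views in Composite are built in Composite Studio). It joins a hypothetical SQL Server table with a flat file so that consumers see a single logical result; all names are placeholders:

import pandas as pd
import pyodbc

# Source 1: a relational table (placeholder server/database/table names).
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;"
    "DATABASE=CRM;Trusted_Connection=yes;"
)
customers = pd.read_sql(
    "SELECT CustomerID, CustomerName, Region FROM dbo.Customers", conn
)

# Source 2: a flat file maintained outside the database.
orders = pd.read_csv("orders.csv")  # hypothetical columns: CustomerID, OrderTotal

# The "virtual view": one unified result spanning two physical sources.
# A data virtualization platform produces this on demand, without first
# copying everything into a data mart or warehouse via ETL.
customer_orders = customers.merge(orders, on="CustomerID", how="inner")
print(customer_orders.head())

conn.close()

In a real deployment the virtualization layer also applies the security and data quality rules mentioned in the list above consistently across every source, rather than in each consuming application.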