Thursday, December 30, 2010

Welcoming 2011

The year 2010 has been interesting in a lot of ways; there were successes and failures. It was a year in which things began to recover after the recession hit us, and I sincerely hope the recovery momentum builds in 2011. A lot is anticipated in the new year: will there be a paradigm shift in technology? It will be interesting to see how events unfold. Some articles predict that smartphones will outsell PCs (or have they already?), there is a lot of talk about cloud computing, and new releases of SQL Server are expected. Wishing everyone a happy and prosperous New Year 2011.

Tuesday, November 23, 2010

Cloud based Tools-Microsoft...

One of the time-consuming activities for a DBA in the SQL Server world is working with problems related to poor performance caused by improper configuration of a SQL Server. Usually DBAs have scripts/jobs/PowerShell scripts to detect performance-related issues and fix them. Once a problem is detected, one researches different KB articles or uses experience to fix the issue. As part of the SQL Server 2011 release wave, Microsoft also released a cloud service, code-named Atlanta, which can be used to monitor different SQL Servers. Here is a link explaining the basic premise of the Atlanta tool; a beta version is available for download.
http://www.microsoft.com/systemcenter/en/us/atlanta.aspx
Microsoft is embracing the cloud in a big way and making a lot of investments in this area. The cloud is going to be an interesting area to watch in the years to come.

Monday, November 15, 2010

SSIS New Features...

With the PASS Summit winding down, there is a lot of activity in the SQL Server blogosphere about the next SQL Server release, SQL Server 2011 (code name: Denali). One of the areas of improvement I have been following is the new features in SSIS. There have been a lot of wish-list requests; one of the videos/demos which explains these very well is available at the following link (the content is presented by Steve Swartz):
http://www.msteched.com/2010/Europe/BIN303.
The features are broken down into the following areas: User Interface/Designer Experience, Project Management (features which enhance the multiple-package experience) and Administration.
Among the designer improvements are Undo and Redo and a new, improved mapping editor. The editor allows one to export and import mappings; this is huge since it now allows developers to store mappings outside the SSIS packages. There is also a Zoom In/Zoom Out feature which comes in handy while developing complex packages. As part of project management, a feature called Parameters is being added to SSIS packages; communication between child and parent packages should improve with this feature.
As part of the administration piece, a new SSIS server is being built which can be installed on the SQL Server instance. This SSIS server is being built with the DBA in mind, giving the ability to monitor packages in a more productive manner. On the administration side, the deployment process has become more streamlined and allows one to set parameters and variables by pointing to different environments. There is also going to be an SSMS-based SSIS package management feature along with dependency analysis.

Saturday, November 13, 2010

SQL Server Denali...

Microsoft has released the CTP1 version of the next version of SQL Server, code-named Denali. It is available for download. Quoting Softpedia.com: the first Community Technology Preview of SQL Server codenamed "Denali" went live on the Microsoft Download Center earlier this week, as Ted Kummert revealed details of the next release of SQL Server at the opening keynote of PASS Summit 2010.
Quoting the http://www.theregister.co.uk/2010/11/09/denali_sql_server_cloud/ article: this version of SQL Server Denali introduces Juneau, what Microsoft is calling a "new-generation" Visual Studio development tool for SQL Server that will bridge the gap between building and deploying apps on-site and in the cloud. There has been a lot of momentum in the cloud space in the last few months. There are three major players at this point in the cloud initiative: Microsoft, Google and Amazon. Microsoft released its cloud version of the SQL Server database, called SQL Azure, close to a year ago and has been refining it ever since. SQL Azure details: http://msdn.microsoft.com/en-us/windowsazure/sqlazure/default.aspx

Friday, November 5, 2010

Open DTS packages in SQL Server 2005...

I am working on a project wherein I had to analyse DTS packages and figure out the tables being exported/imported. The supported environments I have are only SQL Server 2005 and SQL Server 2008, so I needed a way to open up DTS packages in SQL Server 2005. Microsoft has a tool, SQLServer2005_DTS, available for download on the MSDN site. Once this file is downloaded and installed, one has the ability to open DTS packages from Management Studio. Look at the snapshot below:


Once the DTS package is opened, it opens in the DTS designer of SQL Server 2000. From there, the DTS package can be exported to a Visual Basic file, and the .vb file can be opened in a standard text editor to analyse the contents.

Tuesday, October 26, 2010

Services,Devices and Cloud...

There have been a lot of developments in the cloud computing space, with new offerings coming out and the concept being increasingly debated in and outside of the enterprise. The announcement of Ray Ozzie's departure from Microsoft, and the departure memo he circulated within Microsoft, is a must-read. He talks about his vision, what Microsoft's vision needs to be, and the reasons for it. The memo is being widely read and debated; here is the link:
http://ozzie.net/docs/dawn-of-a-new-day/
I found the memo very interesting; it gave me points to ponder as a developer/consultant in the Microsoft BI space.

Thursday, October 21, 2010

Learning SQL...

My friend and SQL Server expert Pinal Dave (SqlAuthority.com) has blogged about and reviewed a book called Beginning SQL Joes 2 Pros: The SQL Hands-On Guide for Beginners (SQL Exam Prep Series 70-433 Volume 1). In today's age, one should be open to learning new ways of doing things and be adaptive; reading books can help in keeping up and in finding innovative ways of doing things.
One can learn more about Joes2Pros by going to the web site here...
http://www.joes2pros.com/index.php

Thursday, October 14, 2010

SQL Server Mgmt Studio...

During my recent visit to the SQL Saturday sessions at Raleigh, NC, I noticed a feature used by one of the presenters. While using SQL Server Management Studio, I typically open up query files, which are arranged in tabs. Once the number of open files grows, it becomes difficult to keep track of them when you need to go back to a certain query file. During the presenter's demo I noticed he was using the solution explorer/project setup available in SSMS. Normally I have used this setup in BIDS or Visual Studio; I rarely used it in Management Studio. I decided to give it a spin and like the way it works. When one chooses Solution Explorer from the View menu in SSMS, the solution explorer window pops up on the right-hand side. One can then add projects, choosing from SQL Server Scripts, Analysis Server Scripts and SQL Server Compact Edition Scripts (these are the options in SQL Server 2005). I chose SQL Server Scripts; once the project was created, it created three subfolders called Connections, Queries and Miscellaneous. One can then add existing .sql files to the Queries folder by choosing Add Existing Item under the project. This way all the query files are lined up under the Queries folder. I found this setup to be an easy way to open files when going back and forth between scripts, and a good way to organize scripts within SQL Server Management Studio.

Sunday, September 19, 2010

SQL Saturday - Raleigh...

I had the opportunity to attend the SQL Saturday event held at Raleigh on Saturday, September 18th. It was a great learning experience listening to the speakers (all of them Microsoft MVPs) deliver interesting lectures on varied areas of SQL Server. My first session was with Rafael Sallas (MVP) on why we should care about dimensional modelling; the lecture was an introduction to dimensional modelling, star schemas and Kimball methodologies. The second session I attended was by Tim Chapman (MVP) on dynamic SQL. The speaker did a great job with demos which made the session very interactive; he showed how dynamic SQL performs and how to minimise SQL injection attacks while using it. One of the examples was a web app doing dynamic searches; the dynamic SQL he showed was really neat and picked up very good execution plans. The next session was on SQL Server 2008 SSRS new features by Mike Davis (Pragmatic Works), with a lot of cool demos on maps, data bars and sparklines, report parts and shared datasets. During lunch there was a presentation by Red Gate on a new product of theirs called SQL Source Control, a very nice product which works with TFS and SVN source control systems. The first session after lunch was on identifying performance issues and fixing them, a very interactive session with a lot of questions from developers. Following this I attended Andy Leonard's (MVP) session on SSIS design patterns, a great lecture on how one should go about developing an SSIS framework for the enterprise, with neat demos on performing incremental loads in SSIS packages and discussion of variable scope and auditing. The final session was on advanced T-SQL by Kevin Bole (MVP): solving problems using SQL with some very innovative approaches.

Wednesday, September 15, 2010

MDX...

One of the key aspects of working with SSAS is understanding MDX and its basic concepts. To work with MDX comfortably, one should have familiarity with dimensional modeling and SQL Server Analysis Services. SQL Server 2008 Books Online has content which gives insight into concepts essential for MDX: sets, tuples, members, levels and so on need to be understood before getting into MDX. There is a book I saw on the net which I find useful: Microsoft SQL Server 2008 MDX Step by Step, by Bryan C. Smith and C. Ryan Clay, both of Hitachi Consulting (Microsoft Press, 2009; ISBN: 9780735626188; 400 pages). The prerequisites and the content are clearly defined, and the book comes with a companion CD. Using a book and Books Online, one should practice as much as possible to get a handle on MDX.
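To make a couple of these concepts concrete, here is a minimal MDX sketch (it assumes the Adventure Works sample cube commonly used for practice): the braces construct sets, and each year member on ROWS is effectively a one-member tuple.

-- A minimal MDX query: a set of one measure on columns, a set of members on rows
SELECT
    { [Measures].[Internet Sales Amount] } ON COLUMNS,
    { [Date].[Calendar Year].Members } ON ROWS
FROM [Adventure Works]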

Monday, September 6, 2010

Data Visualization...

I am working on a project using SSAS to build out cubes. One of the requirements for the project is to generate dynamic reports. One of the tools I have come across is Tableau; here is the link to the software web site: http://www.tableausoftware.com/. There is a free trial download available on the site. The tool is very rich when it comes to data visualization, and the kinds of reports that can be generated are very rich in features. Please read the following link for details: http://www.tableausoftware.com/about

Tuesday, August 31, 2010

Shared Datasets...

There is a new feature in SQL Server 2008 R2 called shared datasets. In prior versions of Reporting Services one would create datasets to get the desired results/output, and in cases where the same dataset had to be used in different reports, one had to re-create it. This situation comes up when you have the same parameters in different reports. In SQL Server 2008 R2, one can create a shared dataset or convert a regular dataset to a shared one. The prerequisite for creating a shared dataset is that it has to be based on a shared data source. Once a shared data source has been created, one can create a shared dataset or convert a normal dataset to a shared dataset. A shared dataset can be deployed like a shared data source, and shared datasets give us improved maintainability. For a more detailed overview, click on the following link:
Prologika Forums:
http://prologika.com/CS/blogs/blog/archive/2009/11/16/shared-datasets.aspx

Friday, August 27, 2010

SQL Server 2008 R2 Trial Version...

Recently I downloaded the evaluation version of SQL Server 2008 R2, the 32-bit version. I installed all the components: Integration Services, Reporting Services and Analysis Services. The installation went off smoothly without any issues. I was interested in checking out the new features in Reporting Services/Report Builder 3.0; there are new data visualization features which have been added. Follow this link to download the evaluation version and look for more SQL Server 2008 resources.
http://www.microsoft.com/sqlserver/2008/en/us/R2Downloads.aspx

Monday, August 16, 2010

Policy Management...

One of the features available in SQL Server 2008 is the ability to create and evaluate policies on different database components. The policy management feature allows an administrator to effectively maintain and enforce database standards on a server. For example, policies could check the compatibility level of all the databases on a SQL Server 2008 database server, or check database backups (how recent the backups are). There are lots of components on which policies can be created, and the policies can be evaluated on demand or run on a schedule. In addition to running standard scripts for database administration, policy management gives administrators an additional tool kit. The policy management option appears under Management in SQL Server Management Studio. Here is a snippet of policy management.
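Policies can also be inventoried with T-SQL; a quick sketch, assuming the SQL Server 2008 policy catalog views in msdb:

-- List each policy, whether it is enabled, and the condition it evaluates
SELECT p.name AS policy_name,
       p.is_enabled,
       c.name AS condition_name
FROM msdb.dbo.syspolicy_policies AS p
INNER JOIN msdb.dbo.syspolicy_conditions AS c
    ON p.condition_id = c.condition_id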

Friday, August 13, 2010

SSRS - Drill Down...

Usually while developing reports, I have the requirement to develop a summary and a detail report. Based on the requirements, I create detail reports which are opened by the JavaScript window.open function. My users like this because the detail report appears in a separate window; if they have dual monitors they can lay the reports side by side. Recently I ran into an issue with this approach: the length of the URL string exceeded 260 characters. Parameters are passed to the detail report from the summary report, and the URL string is built by concatenating the parameter values. The root cause was that the parameter values and labels were string values, which caused the HTTP query string to exceed the 260-character limit. In order to resolve this I had two options:
1. Look at the tables feeding the parameter values and make sure there is an ID/description column. This enables passing the IDs across to the detail report for the parameter values (see the sketch after this list).
2. Use the native SSRS drill-down to sub-report option. The drawback of this approach is that the detail report lays over the top of the summary report; the user has to click the Back arrow in the browser to return to the summary report.
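For option 1, the Jump to URL expression stays well under the limit when only IDs are passed. A rough sketch (the server name, report path and parameter names are hypothetical):

="javascript:void(window.open('http://myserver/ReportServer?/Sales/DetailReport&RegionID=" & CStr(Fields!RegionID.Value) & "&ProductID=" & CStr(Fields!ProductID.Value) & "'))"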

Wednesday, August 11, 2010

Loops Vs Set Based Approach...

Recently I had a requirement where, given a range of dates, I had to compute counts for each date within the range. Initially I approached the problem using a WHILE loop construct, running the SELECT statement for each date in the range. The solution worked but the performance was bad; the user had the choice of giving the date range, so performance would take a hit if the range was very wide. After researching for a while I came across a query in Oracle which used the CONNECT BY and LEVEL clauses to handle the problem I had. That was a set-based approach with no looping involved and apparently performed very well on Oracle. I decided to use this query as a starting point and created a recursive CTE populated with the dates in the range. I used the following query to create the CTE:

WITH t_dates (rptday) AS
(
    -- Anchor member: the first day of the range
    SELECT CONVERT(VARCHAR(10), @ldt_failstartdate, 121)
    UNION ALL
    -- Recursive member: add one day at a time until the end date is reached
    SELECT CONVERT(VARCHAR(10), DATEADD(d, 1, rptday), 121)
    FROM t_dates
    WHERE rptday < CONVERT(VARCHAR(10), @ldt_failenddate, 121)
)
Once this was done, I immediately followed it up with the main SELECT statement which calculated the counts. The performance of this setup was much better than the looping construct. This was an instance where I was able to replace a looping construct with a set-based solution to achieve better results.
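Putting the pieces together, here is a minimal, self-contained sketch of the pattern (the variable names match the CTE above; the fact table and its columns are hypothetical). Note that a recursive CTE is capped at 100 recursion levels by default, so wide date ranges need the MAXRECURSION hint:

DECLARE @ldt_failstartdate DATETIME, @ldt_failenddate DATETIME
SET @ldt_failstartdate = '2010-07-01'
SET @ldt_failenddate = '2010-07-31'
;WITH t_dates (rptday) AS
(
    SELECT CONVERT(VARCHAR(10), @ldt_failstartdate, 121)
    UNION ALL
    SELECT CONVERT(VARCHAR(10), DATEADD(d, 1, rptday), 121)
    FROM t_dates
    WHERE rptday < CONVERT(VARCHAR(10), @ldt_failenddate, 121)
)
SELECT d.rptday, COUNT(f.failure_id) AS daily_count
FROM t_dates AS d
LEFT JOIN dbo.Failures AS f -- hypothetical fact table
    ON CONVERT(VARCHAR(10), f.fail_date, 121) = d.rptday
GROUP BY d.rptday
OPTION (MAXRECURSION 0) -- lift the default cap of 100 recursion levels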

Tuesday, August 3, 2010

SSRS-Background Colors-Part Two

Alternating background colors in a report is a request which comes up very often as part of report development requirements. Alternate row colors can get very tricky as one moves into matrix-type reports, and also when one starts to have multiple groups. Recently I had a problem trying to alternate colors in a report with multiple groups. My report had two adjacent groups, and one of those groups had a child group; I had to alternate the background colors of the rows across the groups in the report. I started with the group-variable solution I had posted earlier but ran into issues because of the multiple groups. Lisa Slater, an SSRS expert known for her very detailed way of presenting innovative solutions, helped me out. Here is a link to her blog which has the definition of my problem and the solution.
http://www.spacefold.com/lisa/post/2010/08/02/YAPS-on-Why-Groups-dont-make-everything-childs-play-in-RDLs.aspx
Thank you Lisa for taking the time to analyse the problem and present a solution.

Wednesday, July 28, 2010

SSRS-Column Text

Recently I had to develop a report where the column headers had to have text running in the vertical direction. There is an enhancement in SQL Server 2008 R2 called rotating text 270 degrees; currently working on SQL Server 2008, that option is not there. In order to make the column header text vertical, I had to set a property on the column textbox called WritingMode to Vertical. Once this was done I got the column header to show up vertically. The other requirement was to center the text. Initially I set TextAlign to Center and VerticalAlign to Middle on the textbox, but when I ran the report the column text was still not perfectly centered. After playing around with the different properties, I set CanGrow to False; once this was done, the text centered in the column textbox. Interestingly, this took a while to get right; at first sight it seemed a very easy requirement. In summary, the following properties were set:

WritingMode    Vertical
VerticalAlign  Middle
TextAlign      Center
CanGrow        False

Tuesday, July 27, 2010

Page Breaks in SSRS...

I am posting this link in response to a reader's (Sumit's) comment about how to insert a page break after a specific number of rows. The requirement is interesting since normally page breaks are based on groups and are set at the group level. The following link talks about dynamic grouping in great detail, with very interesting ways suggested to achieve this result.
http://www.spacefold.com/lisa/post/2009/03/08/Dynamic-Page-Breaks-in-SSRS-Going-to-Eleven.aspx.
I have not run into this situation before; apparently, from the above link, other reporting tools offer conditional page breaks.
Also, this is taken from the MSDN forums:
BOL 2008:
Page Breaks
In some reports, you may want to place a page break at the end of a specified number of rows instead of, or in addition to, on groups or report items. To do this, create a group in a data region (typically a group immediately outside the detail), add a page break to the group, and then add a group expression to group by a specified number of rows.
The following expression, when placed in the group expression, assigns a number to each set of 25 rows. When a page break is defined for the group, this expression results in a page break every 25 rows.

=Int((RowNumber(Nothing)-1)/25)

Wednesday, July 21, 2010

Export to Excel - SSRS

As a follow-up to my previous post on exporting reports to Excel: there are 2 ways to export an SSRS report to Excel. The first way is to use the export dropdown provided in the Report Viewer. When you choose the option below, the Excel export retains the formatting of the data provided in the SSRS report.


There is a second option for exporting to Excel: when you right-click on the report, the pop-up menu in the IE browser has an option to export to Excel. When you choose this option, only the data in the SSRS report is exported; it loses the custom formatting. It is with this option that one tends to start seeing blank columns when interactive sort is turned on for the columns in the report. I should have been more explicit when I blogged about the Excel export option in my previous post. Here is the image for the second option:

Friday, July 16, 2010

Interactive Sort,Blank Columns,Excel Export

One of the constant/standard tasks when an SSRS report is developed is exporting the data in the report to Excel. I had developed a report which had interactive sort turned on for the columns. The user liked this feature, so we made it a standard requirement while developing reports. One of the issues I faced was that when the report (with interactive sort on the columns) was exported to Excel, blank columns were being added after each column. I went through forums and discussions, checked the alignment of each column in the report, made sure they were all lined up correctly and that no headers or footers overlapped the data area. Once these were all checked, I did the export and still saw the blank columns. Then I started turning off the interactive sort on a couple of columns, did the export to Excel, and the blank columns did not appear. So it turns out that interactive sort on SSRS reports can distort data when exported to Excel. Here is a link from Microsoft which states these features are not persisted in the export.
http://msdn.microsoft.com/en-us/library/ms345232.aspx
Interactive sorting:
Preview/Report Viewer: For tabular reports, users click sort arrows on columns to change how the data is sorted.
HTML: For tabular reports, users click sort arrows on columns to change how the data is sorted.
PDF: Not available in PDF.
Excel: Not available in Excel.

Monday, July 12, 2010

Managing Reports...

I am working on a project where I had to clean up unused and unneeded reports from our portal. As part of the project, unused subscriptions also had to be removed. The portal has several folders with quite a number of reports. To make the task easier, I utilised the tables available in the ReportServer database, which hold data regarding the reports and their schedules. Some of the important tables to be aware of are: 1. Catalog, 2. Subscriptions, 3. ExecutionLog, 4. ReportSchedule, 5. Schedule. One can write queries involving these tables to get a list of reports which have subscriptions, the reports which have been accessed by users, and when the reports were pulled. I essentially made an SSRS report out of the queries and used the report to manage my cleanup operation. I would like to thank Devin Knight for his blog entry on the same subject: http://www.bidn.com/articles/reporting-services/110/monitoring-reporting-services-subscriptions.

Here are some of the queries I used. The following query gives the report name, the last time it was run, the status of the run and the user who accessed the report. The LastRunTime and LastStatus come from the Subscriptions table.

SELECT Catalog.Name, UserName, MAX(Subscriptions.LastRunTime) AS LastRunTime, LastStatus
FROM Catalog
INNER JOIN ExecutionLog
    ON ReportID = ItemID
INNER JOIN Subscriptions
    ON Report_OID = ReportID
WHERE Catalog.Path LIKE '/Your Path/%' -- This is the report path on the report server.
GROUP BY Catalog.Name, UserName, LastStatus
GO

One can also use the following query to find out which user is getting subscriptions (there could be a better way):

SELECT Name, UserName, MAX(LastRunTime) AS LastSentDate, LastStatus
FROM Catalog
INNER JOIN ExecutionLog
    ON ReportID = ItemID
INNER JOIN Subscriptions
    ON Report_OID = ReportID
WHERE LastStatus LIKE '%FirstName.LastName%' -- One can enter an email address or a pattern here.
GROUP BY Name, UserName, LastStatus
 
The tables in the ReportServer database are helpful while managing and administering the report server.

Friday, July 9, 2010

SSIS 2008-Conditional Split

In the SSIS packages I developed initially, I used a lot of Execute SQL Tasks which had calls to stored procedures. Then business changes meant I had different data sources to pull data from, which led me to use different data flow transformations in the Data Flow Task. One of the ones I used was the Conditional Split transformation. This transformation goes through a dataset row by row, performs checks on different columns, and routes rows to different output paths. In my data flow, I had data coming from a SQL Server table with 5 columns. Based on the CustomerType column I had to direct the output to different destinations; in my case I had to create 2 different text files as outputs. When you right-click on the Conditional Split task and click Edit, the Conditional Split Transformation Editor opens up. In this editor one can create different output streams and give the conditions for each output; there is also a default output name. The editor provides different functions which can be used in conditions. Data flow transformations provide neat ways of performing transformations, but at the same time one has to test the performance of these tasks before deploying to production.
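As a rough sketch, the editor entries in a flow like mine would look something like this (the output names and the CustomerType values are hypothetical; only the CustomerType column comes from my actual flow):

Output name        Condition (SSIS expression syntax)
RetailFile         CustomerType == "Retail"
WholesaleFile      CustomerType == "Wholesale"
Default output     rows matching neither condition

Each named output is then wired to its own Flat File Destination.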



Wednesday, July 7, 2010

SSRS-Reference to Assemblies...

In one of my earlier posts I had written about the use of report templates to standardize the development of reports. Another feature available is the use of references to assemblies. In certain cases we have code as part of the report properties; in the Report Properties window there is a section called References, where one can add assemblies and the classes available in an assembly. Our classes were created in Visual C# and contain certain standard functions. This decreases code duplication in the reports for standard calls, for example formatting dates and names. The functions can also encapsulate standard business functions, and these calls can be made from reports to provide consistent use of business rules. The assembly is usually a compiled .dll file which needs to be referenced as shown in the figure below. The name of the .dll has been masked since these are used at work.
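As a rough illustration of the kind of class involved (all names here are hypothetical, not the masked ones from work):

namespace ReportUtils
{
    public static class Formats
    {
        // One standard date format shared by every report that references this assembly
        public static string ShortDate(System.DateTime d)
        {
            return d.ToString("MM/dd/yyyy");
        }
    }
}

After compiling this into a .dll and adding it under References, a report expression can call it with the fully qualified name, for example =ReportUtils.Formats.ShortDate(Fields!OrderDate.Value).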

Tuesday, July 6, 2010

SQL Server 2008 R2...

The new version of SQL Server, SQL Server 2008 R2, has been out for a while now. A lot of new features have been added to Reporting Services; they are listed at the link below, categorised into areas like data visualization and report layout and rendering, to name a few:
http://msdn.microsoft.com/en-us/library/ms170438.aspx#ReportAuthoringTools

Wednesday, June 30, 2010

Attribute RelationShips...SSAS 2008

One of the features that has been expanded in SSAS 2008 compared to SSAS 2005 is attribute relationships. SSAS 2008 has a GUI-based attribute relationship editor which allows the developer to explicitly define the relationships between the different attributes in a dimension. This concept is very important to understand while building a dimension, as it has an effect on the performance of the cube. As per MSDN, an attribute relationship provides the following advantages:

• Reduces the amount of memory needed for dimension processing. This speeds up dimension, partition, and query processing.
• Increases query performance because storage access is faster and execution plans are better optimized.
• Results in the selection of more effective aggregates by the aggregation design algorithms, provided that user-defined hierarchies have been defined along the relationship paths.

Attribute relationships are sometimes inherent and sometimes user-defined; the user-defined case comes up, for example, while defining ragged hierarchies.

Monday, June 28, 2010

Data Export/Import...

Usually I use SSIS to perform data import/export activities. Recently I ran into a situation where I had to use the bcp utility to perform the task. The schemas of the databases involved were identical; the databases were on different physical servers. Initially I used the bcp out command to export the data to text files (with a .txt extension); once this was complete I set up bcp in for the import. To my surprise I found there were errors while importing the data. I expected all of the data to go in since the table definitions were the same, and I tried using different flags on bcp but still did not get it to work; the -n flag was set on both the export and the import. At this point I tried a different command to perform the import: the BULK INSERT command. The data import worked fine, the counts of the source and the destination matched perfectly, and the content was intact as well. The import option I had to use was DATAFILETYPE = 'native'.

BULK INSERT Testdb.dbo.Test_Table
FROM 'c:\TestImport\Test\test_table.txt'
WITH (DATAFILETYPE = 'native')
GO
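For reference, the bcp pair I had attempted looked roughly like this (the server names are hypothetical; -n requests native format and -T a trusted connection):

bcp Testdb.dbo.Test_Table out "c:\TestImport\Test\test_table.txt" -n -S SourceServer -T
bcp Testdb.dbo.Test_Table in "c:\TestImport\Test\test_table.txt" -n -S DestServer -T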

Thursday, June 24, 2010

SSRS-Background Colors-Group Variable...

I was recently working on creating an SSRS report in SQL Server 2008 which required alternating background colors. Initially it seemed an easy requirement, but as the report got built out it had a row group and a column group, which made getting the background colors tricky. Initially I went with the RowNumber and RunningValue functions, but they did not give the required effect. In an earlier blog post I had mentioned the use of the DENSE_RANK() function in a stored procedure to get the background color effect. I recently came across another approach using group variables (I would like to acknowledge the MSDN forums and bloggers who have mentioned this approach), which works very well. First you need a custom code function which goes in the Report Properties code section:

' Toggles on each call; the group variable ensures one call per group instance
Public EvenRow As Boolean
Public Function AltRow() As Boolean
    EvenRow = Not EvenRow
    Return EvenRow
End Function

Once this is done, in the group properties (this would be the data point/column in the dataset for which we need the alternate row color), under Variables create a variable called AltRowColor; in the value section call the function =Code.AltRow().

Select the whole row for which you need the alternate row color, and in the BackgroundColor property enter the following expression: =iif(Variables!AltRowColor.Value,"Gainsboro","White"). Once this is completed, run the report and you should see the alternate row colors. This was done in SSRS 2008 with the tablix control, which has row and column groups.

Reports Drill Down

When developing reports, I usually have to develop summary and detail reports. On the summary report I have hyperlinks on certain data points to drill down into the detail of the data. For the drill-down I tend to use the JavaScript window.open function; I usually have a function which builds the detail report path along with the parameters required for the detail report. I ran into a situation where, when I clicked on a data point to open the detail report, I kept getting an error with the link being built. I investigated the issue; the function and the window.open syntax looked fine, and I did not get any errors while deploying the report, so it was taking a long time to fix. Then I started looking at the parameter values being passed, and the colleague working with me on the report stumbled onto the issue: the data for one of the parameters had a + sign embedded in it. The + sign has a special meaning in a URL (it decodes as a space), which was throwing off the report path. The source data for the parameter was fixed, the + was removed, and the drill-down started to work fine. While sourcing data it is important to profile and scrub it, removing unwanted special characters and keeping the data clean. In a nutshell, data validation and integrity are very important; otherwise one never knows where problems might arise.
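Another defensive option (untested in my setup) would be to encode the troublesome character inside the URL expression, since %2B is the URL encoding for +; the field name here is hypothetical:

=Replace(Fields!CustomerName.Value, "+", "%2B")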

Friday, June 18, 2010

Report Templates

One of the most important aspects of report development in a business/corporate setting is delivering reports with a consistent look and feel. To solve this problem, one can create standard report templates which have a standard company logo, page header and page footer. In these templates one can also include any custom code that is standard across all reports, as well as managed code/dll references with standard function calls for use during report development. These templates are usually stored in the following path (for Visual Studio 2008): C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies\ProjectItems\ReportProject. The use of templates really helps in standardized report development, bringing a consistent look and feel to reports. In SQL Server 2008 R2 there are features like the report gallery where one can store standardized images/logos and report parts.

Friday, June 11, 2010

SSRS-Background Colors...

I was recently working on creating an SSRS report in SQL Server 2008 which required alternating background colors. Initially it seemed an easy requirement, but as the report got built out it had a row group and a column group, which made getting the background colors tricky. Initially I went with the RowNumber and RunningValue functions, but they did not give the required effect. While researching this I found a discussion on MSDN where one of the moderators suggested the use of the DENSE_RANK() function in the SQL used to create the dataset. In my dataset there are multiple records for a single state spanning different months. The use of the DENSE_RANK() function allowed me to persist the rank across multiple occurrences of months for a single state. The Row_Num column was created as part of the result set and sent back to the dataset in SSRS. For example: SELECT a, b, c, DENSE_RANK() OVER (ORDER BY state) AS Row_Num.
In the report, on the row of the tablix control, I used the expression =iif(Fields!Row_Num.Value Mod 2, "Gainsboro", "White") in the BackgroundColor property. This gave me the required alternate color effect on a tablix report with a row and column group.

Tuesday, May 25, 2010

MSDB,Data Collector

Recently I decided to enable the data collection feature in SQL Server 2008. This is a very neat feature to capture important statistics about the databases and queries. When you enable this feature it also creates SSIS packages. One day, when we were cleaning up all the unwanted SSIS packages, we decided not to use the data collection feature and hence removed all of its SSIS packages. I encountered a problem while removing the SSIS packages: I got errors pertaining to foreign key constraints. This led me to delete the syscollector system tables in the msdb database, which I later realised was the wrong thing to do. Later I had to stop and start the dev server; once it restarted, I started getting errors while trying to expand the Management folder in SSMS. I realised there were views in the msdb database referencing the deleted system tables. This was a major problem since msdb houses a lot of important information related to jobs, SSIS packages and policies. Then, based on the article here: http://feodorgeorgiev.com/blog/2010/03/how-to-rebuild-msdb/
I located the msdb install script and used it to recreate all the data collector system tables; once this was done my errors with SSMS went away. The takeaway is to completely understand all the related objects created for a new feature (here, the data collector) and to plan the disabling of the feature in an organized way.

Thursday, May 13, 2010

Map Reports...

I have been working with Report Builder 3.0, trying to understand and build out map-based reports. Usually I have been using the shape file option or the USA maps provided in the map wizard to build out sample reports. Another option available is to build a spatial query. This is done by running a query against a table that has spatial data types defined; in my case I had a table with a location column and another geography-type column which held the LINESTRING between 2 points. I built out the following query:
DECLARE @GeoLocation GEOGRAPHY

-- The reference point, as (longitude latitude) in the WGS 84 coordinate system (SRID 4326)
SET @GeoLocation = GEOGRAPHY::STPointFromText('POINT(-80.7941 35.3189)', 4326)

SELECT Name, Distance, Location
FROM
(
    -- STDistance returns meters; divide by 1000 for km, multiply by 0.62 for miles
    SELECT Name,
           (Bank.STDistance(@GeoLocation) / 1000.00) * 0.62 AS Distance,
           Location
    FROM venue
) AS a
WHERE Distance <= 20
When I executed it, I got back the results. The results contained data like LINESTRING(-80.88 35.0228, -80.7941 35.3189); since there was LINESTRING data, Report Builder offered me 2 different kinds of line maps. The first one was a regular line-based map; the other was for displaying analytical data. To make these maps more meaningful, it would be good to add a tile layer using Bing Maps; this way the line maps make more sense visually. Here is the image without the Bing Maps background:

Monday, May 3, 2010

Geocoding...

I recently started work on a geocoding project which is going to use the Report Builder 3.0 mapping report features. To begin with, what does geocoding mean? Per the Wikipedia (http://www.wikipedia.com/) definition: geocoding is the process of finding associated geographic coordinates (often expressed as latitude and longitude) from other geographic data, such as street addresses or zip codes (postal codes). The idea is that we have a list of addresses and pass it to geocoding software; the one being used is Alteryx (http://www.alteryx.com/pages/default.aspx). The software in turn produces a list of geocoded locations, which can be transferred to a spreadsheet, or the output can be routed directly to a spatial database; in this context that means using the spatial data types in SQL Server to receive the data. This portion is still being tested out. The open issue is whether the geocoded locations should be stored as a geometry or geography data type, or whether to simply store the latitude and longitude as numeric columns. Once the geocoded locations are stored in a SQL Server table, they can be used for reporting purposes using Report Builder 3.0/SSRS.
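As a rough sketch of the spatial-storage option under evaluation (the table name, address and coordinates here are hypothetical; 4326 is the WGS 84 SRID):

CREATE TABLE dbo.GeocodedAddress
(
    AddressID INT IDENTITY(1, 1) PRIMARY KEY,
    StreetAddress VARCHAR(200),
    Location GEOGRAPHY
)
GO

-- One geocoded row: POINT takes (longitude latitude)
INSERT INTO dbo.GeocodedAddress (StreetAddress, Location)
VALUES ('100 Main St, Charlotte, NC', GEOGRAPHY::STPointFromText('POINT(-80.8431 35.2271)', 4326))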

Wednesday, April 21, 2010

Report Builder 3.0 Part 2...

In my earlier post on Report Builder 3.0, which is part of SQL Server 2008 R2, I had written about the basic features available. As I work on a geocoding project and play around with the Maps feature, there are basic elements of spatial terminology which need to be understood so that a better map-based reporting solution can be developed. The Microsoft TechNet page here:
http://technet.microsoft.com/en-us/library/ee240845(SQL.105).aspx goes over the basic terminology associated with map-based reports, like viewport, shape files, spatial dataset, polygon, point, line and tile layers; all of these in totality make up the map report. A better understanding of the concepts enables us to build better map-based reports. Here is a picture of a map-based report generated with Report Builder 3.0; one of the neat things that can be done is adding points (here in the shape of a push pin) to highlight locations.

Thursday, April 15, 2010

EXCEPT,INTERSECT

I have been working on a project where data needs to be archived from the main reporting database at the end of each month. Once the data has been archived, I run a validation stored procedure to check the counts in the archive and the main source tables. Once the validation is complete, a message is sent to the developers indicating how many matches and/or mismatches were found in the data. I noticed the validation process was taking too long to report the results, given that the checks were being made on a small set of data. Here is where I tried the INTERSECT and EXCEPT clauses available in SQL Server 2008. The results were generated very quickly since these are set-based operations; earlier I was using a cursor-based approach which was taking too long to report the comparison results. Here is an example of the INTERSECT and EXCEPT clauses. For the matches I used the INTERSECT clause:

SELECT SummaryTable, cnt FROM #tmp_Arch
INTERSECT
SELECT TableName, cnt FROM #tmp_sumry

For the rows which were not matching, I used the EXCEPT clause:

SELECT SummaryTable, cnt FROM #tmp_Arch
EXCEPT
SELECT TableName, cnt FROM #tmp_sumry

Tuesday, April 6, 2010

Virtual DB...

In today's IT world, one of the concepts gaining prominence in server and infrastructure management is virtualisation. It allows companies to consolidate servers, which in turn saves on expenses related to server management; virtualisation is also one of the sought-after skills in network/database administrators. Along the same lines is the product SQL Virtual Database brought out by Idera (http://www.idera.com/); the link for the product is http://www.idera.com/Products/SQL-Toolbox/SQL-virtual-database/. Pinal Dave (SQL Server MVP) has a pretty detailed demo of this product on his blog (http://blog.sqlauthority.com/). The features of this product are interesting and useful: when a virtual database is created from an existing database backup, the virtual database does not occupy any additional disk space. Regular SQL queries can be run against the virtual database, and these databases can be used to create effective test databases which mimic production databases.

Thursday, March 25, 2010

PowerShell 2.0 - Capture file names

PowerShell 2.0 is a very powerful tool and can certainly be used for SQL Server database tasks. One of the requirements I had was to get the list of file names and last-modified dates in a particular drive/folder, and use the most recent file as the Excel source for one of my SSIS packages. The approach, as mentioned here, was to create a table to contain the file name and the last-modified date; in this case all the files from different days have the same schema/layout. The first thing I did was use a PowerShell script to look for files with the given extension and store each file name and date in the SQL Server table. I would like to thank Sean McCown for showing how to use the dir command in PowerShell to find files, described on his blog (http://midnightdba.itbookworm.com/). Here is the script which finds the files; I loop through the collection and insert records into the SQL Server table. I developed this in the Windows PowerShell ISE editor version 2.0, and in this script I look for Excel spreadsheets.

# Load the snap-in that provides Invoke-Sqlcmd (SQL Server 2008)
Add-PSSnapin SqlServerCmdletSnapin100

Set-Location "c:\ssis\"

$query = "INSERT INTO Table_1 SELECT "

# Find all Excel files under the current folder, recursively
$files = dir -Recurse -Include *.xls

foreach ($file in $files)
{
    # Echo the file name and date, then build and run the INSERT for this file
    $file.name, $file.LastWriteTime
    $fquery = $query + "'" + $file.name + "'" + ", " + "'" + $file.LastWriteTime + "'"
    Invoke-Sqlcmd -Query $fquery -Database MyDb -ServerInstance MyServer
}
 
The above script inserts records into the Table_1 table in the MyDb database. In the next job step, I run a query to find the most recent file and use that file as the Excel source in the SSIS package.
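That lookup is a one-liner; a sketch, assuming Table_1 was created with FileName and LastWriteTime columns (the column names are hypothetical, since the script above does not name them):

SELECT TOP 1 FileName
FROM Table_1
ORDER BY LastWriteTime DESC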

Monday, March 15, 2010

PowerShell 2.0

PowerShell is slowly but steadily gathering steam, if one were to gauge by the number of PowerShell projects on CodePlex. PowerShell 2.0 is now available for download from the Microsoft site: http://support.microsoft.com/kb/968929. PowerShell 2.0 includes a script editor called the PowerShell ISE, which makes it a lot easier to write PowerShell scripts. Here is a snapshot of how the Windows ISE editor looks...

The script editor can open multiple script files using a tab interface. The key PowerShell cmdlets are highlighted in color, which improves readability, and the editor also has a help section with pretty good examples of PowerShell commands. With the advent of the editor, writing PowerShell scripts for SQL Server has become much easier in terms of coding and debugging. There are different ways to get into SQL Server PowerShell mode; one is using the Add-PSSnapin commands:
Add-PSSnapin SqlServerCmdletSnapin100
Add-PSSnapin SqlServerProviderSnapin100
The other way is to make SMO calls and instantiate SMO objects pertaining to SQL Server. There is a popular SQL Server PowerShell library available on CodePlex called SQLPSX; the latest version is 2.1. Here is the link:
http://www.codeplex.com/SQLPSX

Monday, March 8, 2010

SQL Saturday Warpup

The SQL Saturday session organised in Charlotte was a super successful event. The event had a lot of sessions by around 14 MVPs, with topics ranging from SQL Server internals to data mining. These events are now going to be owned by PASS. The topics I attended included Integrating Reporting Services With SharePoint and Adding Bells and Whistles to Reporting Services, both by MVP Jessica Moss; PowerShell for Data Professionals by MVP Aaron Nelson (http://www.sqlvariant.com/); Harnessing PowerPivot by Rushabh Mehta (Managing Director, Solid Quality India Private Limited); and Introduction to SSIS Packages by MVP Andy Leonard (http://sqlblog.com/blogs/andy_leonard/default.aspx). All the sessions were very informative and interesting, with a lot of tips thrown in by professionals. It was definitely an eye-opener for me with respect to the range and capabilities of PowerShell and PowerPivot. Apparently there are more than 240 PowerShell projects on CodePlex; it was amazing to see the power of PowerShell scripts. It was also evident from some of the presentations that Microsoft is pushing SharePoint as a major presentation platform for BI. All the sessions I attended were packed to capacity with folks eager to know what is coming up next, and of course the best thing was that the whole event was free. This was my first SQL Saturday camp experience, and I am hoping to attend more in the future.

Friday, March 5, 2010

Team Foundation Power Tools

I use Visual Studio 2008 with Team Foundation Server for all our projects, including .NET and BI based projects. Team Foundation Server is a very nice tool which allows for very good source control of the code base. One of my colleagues gave me an insight into the Team Foundation Power Tools. This is a really cool tool and helps out immensely when you are working as part of a development team. Prior to installing the tool, Visual Studio 2008 has to be completely closed down. Once the Power Tools are installed, open up Visual Studio and launch Team Explorer. Once Team Explorer comes up, it lists the team projects; within each team project there is a folder called Team Members, and when you expand it, it lists the team members who are online/offline. One can right-click on a team member and get options like Show Check-in History, Show Pending Changes and Show Shelvesets. The neat thing about this feature is that one can resolve conflicts in check-in/check-out and get information on the pending changes which are going to be rolled into the codebase. There is another feature called alerts, which has different classifications whereby one can get alerts; here is a list of events for which alerts can be set up. In a nutshell, the Team Foundation Power Tools provide features to manage projects and builds effectively.



Thursday, February 25, 2010

SQL Saturday...

SQL Server events and conferences are very helpful with respect to learning about SQL Server and getting to know your peers and experts in the SQL Server domain. These kinds of events also provide an opportunity to network and exchange best practices. The SQL Saturday events were pioneered by SQL Server Central; each is a one-day free event aimed at providing insight into different aspects of SQL Server. A SQL Saturday event is scheduled to be held on March 6th at Charlotte, NC. Here is the link to the event, providing registration and scheduling information; it has detailed information on all the tracks.
http://sqlsaturday.com/33/eventhome.aspx

Friday, February 12, 2010

Disaster recovery...

Recently we were asked to present disaster recovery plans for different parts of IT; the business intelligence and reporting group was one of them. There are lots of approaches to disaster recovery, starting from backups for databases and running all the way up to bringing up a failover server when the primary server crashes. These plans should be an integral part of the organisation and should not be made as an afterthought. One of the approaches we took as the recovery plan for our reporting database servers was to have a failover server configured exactly like the primary server in terms of memory, space and databases. Apart from the regular database backups on the primary server, we also had to make sure the failover server could perform as the production server in times of disaster. We did not choose log shipping or database mirroring in this case; instead we ran the same processes in parallel on both the primary and failover servers, an approach that worked best for our situation. One of the tasks we had to ensure was maintaining the standard databases required for SSRS reporting, since these databases contain the report server catalog. One of the key items to include in a disaster recovery plan is how quickly the failover server can be up and running so regular operations can continue.

Friday, February 5, 2010

SQL Search

I recently came across a mention of SQL Search, developed by Red Gate, on the SQL Server Central site. I decided to check it out and downloaded the software from the Red Gate site (www.red-gate.com). At the end of the installation there is a prompt to open SQL Server Management Studio. Once I opened SSMS and connected to the server, there was an icon in the toolbar called SQL Search. When you click SQL Search, it opens up a window like the one shown below.


The SQL Search window is split into two parts. In the top window, the term to be searched is entered in the textbox on the left-hand side; in the next dropdown you choose the kind of object (tables, stored procedures, views) where the term should be looked up, and in the next dropdown the database to search. As you start typing a word like "product", the tool starts displaying objects that contain the term. When you click on a detail row, if it is a table, control shifts to the object explorer where the table is present; if the row is a stored procedure or trigger, the text containing the term is displayed in the bottom window. In a nutshell, SQL Search is a very helpful and intuitive tool for database developers and administrators.

Monday, February 1, 2010

SSIS-Logging

When designing and developing ETL processes, one of the important aspects that needs to be factored in is auditing; it is very important to keep track of the flow of the process within the ETL. While working with SSIS, it is important to design an auditing feature for the SSIS package. There are several ways of auditing SSIS packages. One is to use the SQL job history to figure out if there was an error in the SSIS job step and then troubleshoot the problem, but this might not be an effective way to audit. Another option is to create an audit table within a SQL Server database and insert records into this table at the beginning and completion of each step; this can be tedious, since there could be a lot of calls to the same stored procedure using multiple Execute SQL Tasks. Alternatively, SSIS itself provides logging capabilities. The Logging option can be reached by right-clicking anywhere on the design surface of the SSIS package. When you click on the Logging option, a window pops up with the SSIS package and its various tasks on the left. On the right-hand side there are two tabs: Providers and Logs, and Details. When you choose the provider type, there are different types available:

SSIS log provider for SQL Server
SSIS log provider for Windows Event Log
SSIS log provider for XML Files
SSIS log provider for Text Files
SSIS log provider for SQL Server Profiler
For my project I chose the SSIS log provider for SQL Server. Check the box next to that provider, then choose the database connection where the logging will happen; when the SSIS package runs it creates a table called sysssislog. In the Details tab, choose the events that need to be tracked; look at the figure below for reference.

Once this is done, click the OK button and logging is now enabled for the SSIS package. Once the package is executed, look for the sysssislog table (under System Tables) in the database which was chosen in the Configuration column of the Providers and Logs tab. The sysssislog table has columns called source (the name of each task executed), message (a more detailed message about the task) and event (the event that was chosen to be audited). The table also has starttime and endtime columns which indicate the start and end time of each task.
SELECT * FROM dbo.sysssislog
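A slightly more targeted query is also handy; a sketch using the standard sysssislog columns, pulling just the failures with their timings:

SELECT source, event, starttime, endtime, message
FROM dbo.sysssislog
WHERE event IN ('OnError', 'OnTaskFailed')
ORDER BY starttime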
To summarise, using the Logging option in SSIS, one can audit SSIS packages very effectively.

Monday, January 25, 2010

SSIS 2008

I was working on a Script Task in SSIS 2008; the purpose of the task was to traverse a folder and list the files in it. The language used for the script was Visual C# 2008. In the script window I finished the script and saved it, but back in the control flow there was a red cross sign on the script task, and the error message said "binary code not found". I had come across this error in SQL Server 2005; there were fixes issued by Microsoft, and there was also a property on the 2005 script task called PreCompileBinaryIntoCode (something like that) which had to be set for the error to go away. In 2008 that property is not present, so I was initially puzzled as to how to fix the issue. I opened the script task and clicked the Edit Script button to get into the script editor. In the script editor's menu there is a Build menu option; when I clicked on it, I chose the script which had to be built.


The script name will be something like st_cxxxxxxxxxx. Once I built the script, there were compilation errors; I fixed the errors in the script and saved it. When I got back to the script task in the main control flow window, the binary code error had gone away. Based on this, it would be a good idea to build the script, fix any errors and save it; this could be one way to prevent the "binary code not found" error.

Tuesday, January 19, 2010

Wide Table...

I had an SSRS report in SQL Server 2008 which used a table with a lot of columns. Columns kept getting added to the table as requirements kept changing, and I hit a point where the report performance slowed, so I decided to investigate the root cause. Ideally this situation should not have arisen if the design of the table had been done properly: the table the SSRS report was using had more than 300 columns, because it had been designed as a flat table. I decided to break this table out into a category master table and a category metrics table. The master table was like a dimension table holding all the category IDs and the corresponding category names; the metrics (fact) table had the IDs and the metrics for each ID. To populate these tables I wrote a stored procedure, which also had to feed the data to my report. With the same data as in my initial flat table, the performance of the report using the dimension/fact setup was much faster: the report previously took close to a minute to run, and after the table split the run time was around 15 seconds. What I learnt from this exercise is that a wide flat table with a huge number of columns will not necessarily give better performance.
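A rough sketch of the split (all names here are hypothetical):

CREATE TABLE dbo.CategoryMaster
(
    CategoryID INT PRIMARY KEY,
    CategoryName VARCHAR(100)
)

CREATE TABLE dbo.CategoryMetrics
(
    CategoryID INT NOT NULL REFERENCES dbo.CategoryMaster (CategoryID),
    MetricDate DATETIME,
    MetricValue DECIMAL(18, 2)
)

The report query then joins the two tables on CategoryID instead of scanning a 300-plus-column flat table.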

Tuesday, January 12, 2010

Parameters in Report Builder...

Report Builder 2.0: when developing reports, there are situations where one needs to create parameters. The purpose of parameters is to allow the user to generate reports for a specific set of values. One of my colleagues had to build out a report using Report Builder 2.0 and wanted to pass an ID as a parameter. In this situation the SQL was going to be executed as a query. At first I tried to create a parameter in Report Builder and defined all the necessary attributes for it. The stumbling block was how to pass the parameter to the SQL statement in the dataset being created; again, we were not using a stored procedure. In the SQL query's WHERE clause, I changed the condition to something like WHERE CustomerID = @CustomerID. Using the placeholder @CustomerID caused Report Builder to automatically create a parameter called CustomerID. Once this was complete, at run time the report prompted the user to input the CustomerID. So the query in the dataset looked like:
SELECT CustomerName,CustomerCity,CustomerAddress
FROM Customer
WHERE CustomerID = @CustomerID

Saturday, January 9, 2010

BI Conference...

Hi folks,
There is going to be a Microsoft Business Intelligence Conference this year, the details are provided in the link below:
http://blogs.msdn.com/usisvde/archive/2009/12/28/mark-your-calendars-for-microsoft-bi-conference.aspx.
Conferences like the one mentioned are a good place to keep in touch with current and future developments in BI, and they also provide a good opportunity to network with other BI professionals. One of the things I try to focus on is how the features/developments mentioned at the conference can be used in the company/organization I work for.

Thursday, January 7, 2010

Replication...

I use replication in SQL Server 2008 extensively; the snapshot and transactional replication methods are used the most. The situation where it is used: I replicate production database tables to a destination database on a different server, and this destination database is used as the source for all the data transformations. This way the OLTP production (source) database is not overloaded. While replication was running recently, there were some errors. These errors caused replication to back up, meaning there were rows which were not being pushed down to the subscribers. In the Replication Monitor, under the All Subscriptions tab, right-click on the status column corresponding to the subscriber where the problem is occurring. In the window that opens up, click on the distributor-to-subscriber history. Click on the error line, and the error details appear in the bottom section of the window. The error message displayed here is not really helpful; all it shows is the command attempted, along with the transaction sequence number. Note the transaction sequence number, which can be seen in the window below:




There is a stored procedure to which we need to pass the transaction sequence number; its result set displays the actual SQL command that was being attempted on the subscriber database.
Here is the stored procedure (this has to be run on the server which serves as the distributor, against the distribution database):
The procedure expects a start and an end sequence number. Grab the value for [xact_seqno] and plop it into both parameters of the stored proc below; the start and end sequence numbers need to be the same.

EXEC Distribution..sp_browsereplcmds @xact_seqno_start = '0x0018B73A00003BCF000800000000', @xact_seqno_end = '0x0018B73A00003BCF000800000000'
The result set contains a column called command which has the actual SQL command.
I found the above command to be very useful for debugging errors that happen during replication.

Wednesday, January 6, 2010

Typography...

I was getting ready to blog about a SQL Server related topic, but incidentally got diverted into a totally different topic while going through the MS BI group on LinkedIn. It had to do with presentation design and how content needs to be laid out for reports and presentations. The link I am providing below has very useful ideas and tips in the area of typography. I found the topic very interesting, and it showed how one can miss out on an important aspect: presentation. This site discusses typography and also gives book suggestions on the same subject:
http://www.thedesigncubicle.com/2008/12/10-common-typography-mistakes/
As working professionals, care should be taken while presenting; the topics discussed on the above site also include logo design.