Automated SSRS Report Deployments from Octopus

Recently we started using the SQL Server Data Tools preview in Visual Studio 2015, and it's looking good so far. For the first time in a long while I don't have to "match" my Visual Studio version to my SQL Server version; I can use the latest VS and support SQL Server versions from 2008 to 2016, which is great. It isn't perfect, though: there are a lot of bugs in the latest Data Tools, including some that affect us and that Microsoft is refusing to fix.

One of the sticking points for me has always been deployment. We use Octopus internally and try to run everything through it if we can, to simplify our deployment environment.

SSRS is the big one, so I'm going to walk through how we deploy it.

We use a tool called RS.EXE: a command-line utility that comes pre-installed with SSRS for running VB.NET scripts that deploy reports and other objects to the server.

Be aware, though, that the available methods differ depending on which API endpoint you call with the "-e" command-line parameter. Out of the box it defaults to the 2005 endpoint, which differs significantly from the 2010 one.

A lot of the code I wrote was based on Tom's blog post on the subject from 2011; however, he was using the 2005 endpoint, and I have updated the code to use the 2010 endpoint.

The assumption is that you have a folder full of RDL files (and possibly some PNGs and connection objects) to deploy, so the VB script takes three parameters: the source folder, the target folder on the SSRS server, and the SSRS server address itself. I call it from a predeploy.ps1 file that I package up into a NuGet package for Octopus; the line in the pre-deploy script looks like the below.

rs.exe -i Deploy_VBScript.rss -s $SSRS_Server_Address -e Mgmt2010 -v sourcePath="$sourcePath" -v targetFolder="$targetFolder"

You will note the "-e Mgmt2010"; this is important for the script to work. Next is Deploy_VBScript.rss:

Dim definition As [Byte]() = Nothing
Dim warnings As Warning() = Nothing

Public Sub Main()
    ' Create the target folder(s) on the server if they do not exist
    Try
        Dim descriptionProp As New [Property]
        descriptionProp.Name = "Description"
        descriptionProp.Value = ""
        Dim visibleProp As New [Property]
        visibleProp.Name = "Visible"
        visibleProp.Value = True
        Dim props(1) As [Property]
        props(0) = descriptionProp
        props(1) = visibleProp
        If targetFolder.Substring(1).IndexOf("/") <> -1 Then
            ' Two-level target (e.g. /Parent/Child): create the parent first
            Dim level2 As String = targetFolder.Substring(targetFolder.LastIndexOf("/") + 1)
            Dim level1 As String = targetFolder.Substring(1, targetFolder.LastIndexOf("/") - 1)
            rs.CreateFolder(level1, "/", props)
            rs.CreateFolder(level2, "/" + level1, props)
        Else
            Console.WriteLine(targetFolder.Replace("/", ""))
            rs.CreateFolder(targetFolder.Replace("/", ""), "/", props)
        End If
    Catch ex As Exception
        If ex.Message.IndexOf("AlreadyExists") > 0 Then
            Console.WriteLine("Folder {0} exists on the server", targetFolder)
        Else
            Throw ex
        End If
    End Try

    Dim Files As String()
    Dim rdlFile As String
    Files = Directory.GetFiles(sourcePath)
    For Each rdlFile In Files
        If rdlFile.EndsWith(".rds") Then
            ' TODO: implement handler for RDS (shared data source) files
        End If
        If rdlFile.EndsWith(".rsd") Then
            ' TODO: implement handler for RSD (shared dataset) files
        End If
        If rdlFile.EndsWith(".png") Then
            Console.WriteLine(String.Format("Deploying PNG file {2} {0} to folder {1}", rdlFile, targetFolder, sourcePath))
            Dim stream As FileStream = File.OpenRead(rdlFile)
            Dim x As Integer = rdlFile.LastIndexOf("\")
            definition = New [Byte](stream.Length - 1) {}
            stream.Read(definition, 0, CInt(stream.Length))
            stream.Close()
            Dim fileName As String = rdlFile.Substring(x + 1)
            Dim mimeProp As New [Property]
            mimeProp.Name = "MimeType"
            mimeProp.Value = "image/png"
            Dim imageProps(0) As [Property]
            imageProps(0) = mimeProp
            ' On the 2010 endpoint, images are uploaded as "Resource" catalog items
            rs.CreateCatalogItem("Resource", fileName, targetFolder, True, definition, imageProps, warnings)
        End If
        If rdlFile.EndsWith(".rdl") Then
            Try
                Console.WriteLine(String.Format("Deploying report {2} {0} to folder {1}", rdlFile, targetFolder, sourcePath))
                Dim stream As FileStream = File.OpenRead(rdlFile)
                Dim x As Integer = rdlFile.LastIndexOf("\")
                Dim reportName As String = rdlFile.Substring(x + 1).Replace(".rdl", "")
                definition = New [Byte](stream.Length - 1) {}
                stream.Read(definition, 0, CInt(stream.Length))
                stream.Close()
                rs.CreateCatalogItem("Report", reportName, targetFolder, True, definition, Nothing, warnings)
                If Not (warnings Is Nothing) Then
                    Dim warning As Warning
                    For Each warning In warnings
                        Console.WriteLine(warning.Message)
                    Next warning
                Else
                    Console.WriteLine("Report: {0} published successfully with no warnings", reportName)
                End If
            Catch e As Exception
                Console.WriteLine(e.Message)
            End Try
        End If
    Next rdlFile
End Sub

I haven't yet added support for RDS and RSD files to this script; if someone wants to finish it off, please share. For now it handles PNGs and RDLs, which are the main things we update.

Now that all sounds too easy, and it was. The next issue I ran into was when deploying: the VS 2015 Data Tools client saves everything in 2016 format, and by "2016 format" I mean it changes the xmlns and adds a tag called "ReportParametersLayout".

When you deploy from the VS client it appears to "roll back" these changes before passing the report to the 2010 endpoint, but if you try to deploy the RDL file straight from source it will fail (insert fun).

To work around this I had to write my own "roll back" step in the Octopus pre-deploy PowerShell script, below:

Get-ChildItem $sourcePath -Filter "*.rdl" | ForEach-Object {
    [xml]$xml = [xml](Get-Content $_.FullName)
    # Roll the xmlns back so the 2010 endpoint will accept the RDL
    if ($xml.Report.GetAttribute("xmlns") -ne "http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition") {
        $xml.Report.SetAttribute("xmlns", "http://schemas.microsoft.com/sqlserver/reporting/2008/01/reportdefinition")
    }
    # Strip the ReportParametersLayout element the 2016 designer adds
    if ($xml.Report.ReportParametersLayout -ne $null) {
        $null = $xml.Report.RemoveChild($xml.Report["ReportParametersLayout"])
    }
    $xml.Save($sourcePath + '\' + $_.BaseName + '.rdl')
}

Now things were deploying w… no, wait! There's more!

Another error I hit was this next one. I mentioned before that Microsoft is refusing to fix some bugs; here's one. We work in Australia, so time zones are a pain in the ass: some states observe daylight saving, some don't, and some used to but have now stopped. Add to this that we work on a multi-tenanted application, so while some companies can rule "we are in Sydney time, end of story", we cannot. Using the .NET functions for time zone conversion is a must, as they take care of all this nonsense for you.

I make a habit of storing all date/time data in UTC; that way, if you need to do a comparison in the database or application layer it's easy, and I convert to the local time zone in the UI based on user preference, business rule, etc. Because of this we have been using System.TimeZoneInfo in the RDL files. As of VS 2012 and up, you get an error from this library in VS; the report will still run fine on SSRS once deployed, it just errors in the editor in VS and won't preview or deploy.

We've worked around this by using a CLR function that wraps TimeZoneInfo (example here). If you are reading this, please vote this one up on Connect; converting time zones in the UI is better.

After this fix, things ran smoothly.






SQL Query Slow? Some Basic MSSQL tips

A lot of developers I've worked with in the past don't have good experience with SQL out of the box, and by "good" I mean beyond knowing the basics. A lot of the systems I work on are high performance, so I end up training developers in SQL as a priority when they come onto the team. There are a few basic things I always show them that get them up to speed pretty easily.

Common Mistakes to Avoid

Backwards Conversions in WHERE clauses

A common mistake is a backwards WHERE clause. Converting the column makes the predicate non-sargable: the engine has to run the conversion against every row and cannot use an index on that column. You should always convert the parameter, not the column.

WHERE CONVERT(int,column1) = @param1
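The fix is to flip the conversion onto the parameter, so the column is compared as-is and an index on it can still be used. A sketch, with illustrative column and type names:

```sql
-- Non-sargable: the conversion runs against every row of the table
-- WHERE CONVERT(int, column1) = @param1

-- Sargable: convert the parameter once; an index on column1 stays usable
WHERE column1 = CONVERT(varchar(20), @param1)
```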

Cursors, they aren’t bad, just use them correctly

If you are using cursors for transactional workloads I will be scared and probably not talk to you again. If you are using them simply to iterate through a temporary table or table variable and do "something", you are probably using them correctly.

Just remember to use these two hints all the time: READ_ONLY and FAST_FORWARD. They can speed a cursor up by an order of magnitude.


Also, limit them to a few thousand rows if you can; don't use them for millions of rows if you can avoid it.
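An iteration-only cursor of the kind described above might look like this sketch (table and column names are invented for illustration; FAST_FORWARD already implies a forward-only, read-only cursor, which is the cheapest kind):

```sql
DECLARE @Id int

DECLARE workCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT Id FROM @WorkItems

OPEN workCursor
FETCH NEXT FROM workCursor INTO @Id
WHILE @@FETCH_STATUS = 0
BEGIN
    -- do "something" with @Id here
    FETCH NEXT FROM workCursor INTO @Id
END
CLOSE workCursor
DEALLOCATE workCursor
```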

INSERT is your friend, UPDATE is your enemy

Try to design your schema around INSERTing rather than UPDATing; you will mitigate contention this way. Contention for resources is what makes you build big indexes and will slow you down in the long run.
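As a sketch of what that looks like in practice (table and column names invented for illustration), append state changes as new rows instead of overwriting them in place:

```sql
-- Instead of updating a row in place, which takes contended locks:
--   UPDATE dbo.Orders SET Status = 'Shipped' WHERE OrderId = @OrderId
-- append a history row, and read the latest row back when you need it:
INSERT INTO dbo.OrderStatusHistory (OrderId, Status, ChangedAtUtc)
VALUES (@OrderId, 'Shipped', GETUTCDATE())
```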

Get some CTEs up ya

Common Table Expressions (CTEs) are useful for breaking out sub-queries, and I find the code reads more cleanly; the performance difference is arguable, though.
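A quick illustration (all names invented) of a sub-query pulled out into a CTE:

```sql
WITH RecentOrders AS (
    SELECT CustomerId, COUNT(*) AS OrderCount
    FROM dbo.Orders
    WHERE OrderDateUtc >= DATEADD(DAY, -30, GETUTCDATE())
    GROUP BY CustomerId
)
SELECT c.Name, r.OrderCount
FROM dbo.Customers c
INNER JOIN RecentOrders r ON r.CustomerId = c.CustomerId
```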

Table Variables over Temp Tables

Yes, I've said it; now I may get flamed. Temp tables have their place, primarily when you need to index your content, but if your temporary data is large enough to require an index, maybe it should be a real table.

Table variables also make debugging easier, because you don't have to drop them before hitting F5 again in SSMS.
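A minimal sketch (names invented): the table variable goes out of scope on its own at the end of the batch, so you can re-run it repeatedly without a DROP:

```sql
DECLARE @Staging TABLE (
    Id int PRIMARY KEY,
    Amount decimal(18, 2)
)

INSERT INTO @Staging (Id, Amount)
SELECT Id, Amount
FROM dbo.SourceTable
WHERE Amount > 0
```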

Missing Indexes

The number one cause of slow queries is bad indexing. If you are doing large amounts of UPDATE/INSERT on your tables, though, too much indexing can be bad too.

SSMS is your friend here. There are more advanced tools out there, sure, but you should start with SSMS; it will let you find your basic slowdowns.

Look at query plans

Hit this button in the toolbar and you're away.


If there is an obvious slowdown, SSMS will recommend an index to fix your problem.

You can see the green text below that it displays sometimes (not all the time); you can right-click on it and select "Missing Index Details…" and it will give you a CREATE INDEX statement that you can use to create your index.


Most of the index hints in here are pretty spot on, but there are a few things to consider before going "yeah! here's the index that will solve my problem":

  1. Don't index bit columns, or columns with low cardinality
  2. Look for covering indexes: what it suggests might match an index you already have plus one extra column, in which case a single index could do both jobs
  3. Think about any high-volume updates you have; adding more indexes can slow your updates down
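For reference, the generated statement has roughly this shape (the names here are invented; yours come from the hint itself). The INCLUDE list is where you check point 2 against your existing indexes:

```sql
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
ON dbo.Orders (CustomerId)
INCLUDE (OrderDateUtc, Amount)
```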

The query plan itself will give you more detailed info than the hints.

Each block is shown with its percentage of the entire statement (below is one block at 13% of the statement), and within each block it breaks down further: the three Index Seeks below account for 12% of the total cost each.


Looking at the above can get very confusing if you are not familiar with SQL; the interface gives you a lot of info when you mouse over each point. I think this is why some developers like to hide behind entity frameworks 🙂

The basic thing I tell people to look out for is the below:


Index Scans are usually the source of pain that you can fix, and when you get big ones, SSMS will generally suggest indexes based on them. Make sure they consume a fair chunk of the query before creating an index for them, though; the 2% in the example above is not a good candidate.

Maintain your Database

Your indexes will need defragmenting/rebuilding, and you will need to reclaim space by backing up the database and logs.

I won't go into this too much within the scope of this post; I might do another post about it, as it's a rather large subject. I recommend googling for recommendations, but at least use the wizard in SSMS to set up a "default" nightly maintenance plan job. Don't leave your database unmaintained; that will slow it down in the long run.
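As a bare-minimum sketch of the two jobs mentioned above (table, database, and path names are placeholders):

```sql
-- Rebuild fragmented indexes on a table (REORGANIZE is cheaper for light fragmentation)
ALTER INDEX ALL ON dbo.Orders REBUILD

-- Back up the log so SQL Server can reclaim and reuse the space in it
BACKUP LOG MyDatabase TO DISK = 'D:\Backups\MyDatabase.trn'
```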

People to watch and Learn from

Pinal Dave from SQL Authority is "the man"; he has come up in more of my Google searches for SQL issues than Stack Overflow.

Converting Between UTC and TimeZone in SQL CLR

I make a habit of storing in UTC time and converting when displaying or getting input from the user. It makes things a lot easier and more manageable. I have yet to try a large project with the new datetimeoffset data type, but I'm worried about how it will handle daylight saving time, which is a constant source of frustration for Australian developers (disclaimer: I come from Queensland, where our cows don't like it either).

Most of the time we handle the conversion in the application or presentation layer. I don't like to encourage developers to handle it in the SQL layer, because that is the most common place to do comparisons, which means more opportunity for developer error when one side of your comparison is in the wrong time zone.

However, there are a few cases where it is needed, so today I whipped up a project that is backwards compatible to SQL 2008 R2 (the earliest version running in the prod systems I work on).

GitHub Link here

It basically re-exposes the TimeZoneInfo object, wrapped up specifically for what I use: going back and forth between UTC and whatever time zone my users are in.

Most of my projects are either multi-tenanted, or have users in different states in Australia that need to view data in their own timezone.

Going to UTC below (i.e. from user input in a text box):

public static SqlDateTime ConvertToUtc(SqlString dotNetTimeZone, SqlDateTime theDateTime)
{
    var localDate = DateTime.SpecifyKind((DateTime)theDateTime, DateTimeKind.Unspecified);

    // create TimeZoneInfo from the time zone id string.
    var timeZoneInfo = TimeZoneInfo.FindSystemTimeZoneById(dotNetTimeZone.ToString());

    // convert the local date to a UTC date using that time zone.
    return TimeZoneInfo.ConvertTimeToUtc(localDate, timeZoneInfo);
}

Coming from UTC below (i.e. from data in a database table):

public static SqlDateTime ConvertToLocalTimeZone(SqlString dotNetTimeZone, SqlDateTime theDateTime)
{
    // create TimeZoneInfo from the time zone id string.
    var timeZoneInfo = TimeZoneInfo.FindSystemTimeZoneById(dotNetTimeZone.ToString());

    // convert the UTC date to the local time zone.
    return TimeZoneInfo.ConvertTimeFromUtc((DateTime)theDateTime, timeZoneInfo);
}
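Once the assembly is published, calling the functions from T-SQL might look like the below (assuming the functions are registered under dbo with the names above, and using a Windows time zone id):

```sql
SELECT dbo.ConvertToUtc('AUS Eastern Standard Time', '2016-01-15 10:30:00')
SELECT dbo.ConvertToLocalTimeZone('AUS Eastern Standard Time', GETUTCDATE())
```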

The Data Tools SQL projects these days are great: you just add a reference to your C# library from your SQL project and it'll be compiled into the SQL script that is generated for deployment. Examples are all in the GitHub project.

Just noting that you can't compile a C# class library for .NET 3.5 inside a SQL Data Tools project; you need to create an external C# class library project, because the 3.5 reference in SQL projects has some issue with mscorlib.


After adding the reference you also need to set it to "unsafe".


Then when you publish, the class is embedded with output like the below in your deployment script:

FROM 0x4D5A90000300000004000000FFFF0000B8000000000000004000...

I've also included a post-deploy script in the example code with the other settings you need to get your database going with CLR functions:

sp_configure 'clr enabled', 1
GO
RECONFIGURE
GO