Monday, June 3, 2019

Continuous Deployment (CD) with Azure DevOps, SSIS, and SQL Server Database Projects

What are we doing here?

This is the second part of a two-part series covering 1) CI (continuous integration/build) and 2) CD (continuous deployment/release) for Visual Studio SQL Server Database and Integration Services project types on the Azure DevOps platform.

Some background

The Visual Studio business intelligence (SSIS, SSRS, SSAS) and database project types have long been the red-headed stepchild of the Visual Studio ecosystem. Huge strides have been made in recent years, but there are still some impediments, one of which is creating CI/CD pipelines for these projects in Azure DevOps.

Don't fret

This step-by-step guide will walk you through creating a release pipeline for a solution containing both a SQL Server Database project and an Integration Services project.

Assumptions (you know what they say...)

The following steps assume that you already have a working Azure DevOps build definition. If not, you can refer to part 1 of this series.


Let's do it

  1. Navigate to /Project/Pipelines/Releases and select New Release Pipeline.
  2. Choose “Start with an Empty job”.
  3. Name the Stage. An appropriate name might be the environment you’re deploying to such as “Development”.
  4. Click “Add an Artifact”.
  5. Select “Build” as the source type and choose the Project and Source (build pipeline) that will be deployed. If you haven’t already created a Build definition, see Continuous Integration (CI) with Visual Studio Team Services (VSTS), SSIS, and SQL Server Database Projects.
  6. Click the lightning bolt on the newly created Artifact, enable the Continuous deployment trigger and add a branch filter for master or whatever branch will be deployed.
  7. Click on the “1 job, 0 task” link in your newly created Stage.
  8. Click the plus sign to the right of “Agent job” to add a new task.
  9. Search for “Command Line” and add a new task.
  10. Click on the new task and populate the properties as follows.
    Keep in mind that this example targets SQL Server 2017. If you’re using another version, you’ll likely have to modify the Tool path.

    Argument text: /a:Publish /sf:"replace_with_build_output_root_location\$(Build.DefinitionName)\$(Build.BuildNumber)\replace_with_appropriate_path\replace_with_dacpac_name.dacpac" /tsn:replace_with_server_name /tu:replace_with_user_name /tp:replace_with_password /tdn:replace_with_database_name
  11. Add another Command Line task and populate as follows.
    Argument text: /S /ModelType:Project /SourcePath:"replace_with_build_output_root_location\$(Build.DefinitionName)\$(Build.BuildNumber)\replace_with_appropriate_path\$(BuildConfiguration)\replace_with_ssis_ispac_name.ispac" /DestinationServer:"localhost" /DestinationPath:"/replace_with_appropriate_ssis_catalog_path"
  12. The previous tasks use a variable to tell us which build configuration to use. Add the variable by clicking on the “Variables” tab, then click +Add, enter “BuildConfiguration”, and set a default value that is the name of the build configuration that is appropriate for this Stage of the pipeline. This will depend on the names of the configurations that have been created in your Visual Studio solution.
  13. Always test! Create a release and make sure it deploys your projects correctly.
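For reference, with the placeholders filled in, the two Command Line tasks from steps 10 and 11 might end up looking something like the following. Every server, path, and object name here is made up for illustration only, and the Tool paths shown are the typical defaults for a SQL Server 2017 installation; adjust both for your environment.

```
REM Task 10 - deploy the database project (Tool path is typically
REM C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe)
SqlPackage.exe /a:Publish /sf:"\\my-build-share\MyBuild-CI\20190603.1\db\MyDatabase.dacpac" /tsn:MYSQLSERVER /tu:deploy_user /tp:MyP@ssword /tdn:MyDatabase

REM Task 11 - deploy the SSIS project (Tool path is typically
REM C:\Program Files (x86)\Microsoft SQL Server\140\DTS\Binn\ISDeploymentWizard.exe)
ISDeploymentWizard.exe /S /ModelType:Project /SourcePath:"\\my-build-share\MyBuild-CI\20190603.1\ssis\Release\MySsisProject.ispac" /DestinationServer:"localhost" /DestinationPath:"/SSISDB/MyFolder/MySsisProject"
```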

As always, if you have any questions feel free to drop me an email at david@elish.net.





Sunday, November 18, 2018

How to Share Power BI Reports On Your Website (for free)

Let's say you're a geek and you're traveling the country in your 30-year-old Airstream trailer, and you have a bunch of data you'd like to share with fellow digital nomads (oh wait, that's me). Data like how much money you've spent on gas and campground fees, the number of days you've been on the road, and the states you've traveled to. Wouldn't it be cool to do that with Power BI...for free? Well you can, and it's pretty easy. In this post I'll walk you through the steps necessary for "embedding" your Power BI reports in any web page/web site.

First things first

There are a couple of things we need to do to make this magic possible.

  1. Download Power BI Desktop. You're going to need something to author and publish your Power BI reports with. That's Power BI Desktop, which is a free download from Microsoft.
  2. Create a Power BI account. This account will allow you to publish your reports to the web, from where your web page will display them.
  3. Create a Power BI report that you'd like to share with the world.

Put yourself out there

Now that you have the proper tools and accounts and have created a deeply insightful report, it's time to publish your report to the web. Publishing your report puts it on the Power BI servers and makes it available to share with your millions of adoring fans.
  1. If you haven't already, open your Power BI report in Power BI Desktop.
  2. Navigate to File -> Publish -> Publish to Power BI.

  3. Choose your workspace and publish.

Expose yourself

Now that you've published your report to Power BI it is available for you to "embed" in your web page. I have been putting "embed" in double quotes because Microsoft uses this term for another, more feature-rich (and not free) approach to sharing Power BI reports that we're not going to get into here. The approach we're going to use here simply uses an iframe to expose your Power BI hosted report in your own web page.
  1. Open your report on the Power BI website.
  2. Navigate to File -> Publish to Web.
  3. Copy the embed code.
  4. Paste the embed code into your web page.
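For reference, the embed code Power BI generates is just an iframe, along the lines of the snippet below. The width, height, and r query-string value are placeholders here — use the exact code you copied in step 3, since the r value is a token unique to your report.

```html
<iframe width="800" height="600"
        src="https://app.powerbi.com/view?r=YOUR_REPORT_TOKEN"
        frameborder="0" allowFullScreen="true"></iframe>
```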

Bask in the glory

Start spinning up that HDFS cluster to handle all the web log data that will be coming in now that your website has fancy shmancy reporting for all the world to see ... for free (the reporting, not the HDFS).


Wednesday, May 9, 2018

How to Execute an SSIS Package on Azure from On-premise or IaaS Server

So you have SSIS packages deployed to Azure using ADFv2 and an Azure-SSIS Integration Runtime but you still have on-premise or IaaS SQL Server(s) and want to initiate the execution of your Azure hosted packages from your on-premise or IaaS server(s)...

As I like to say, don't fret! This is pretty easy to do using straightforward, tried & true on-premise SQL Server technologies:

  1. Linked Server - A Linked Server is used to execute Azure SQL Database queries using four-part names (server.database.schema.object).
  2. Stored Procedures - The SSISDB create_execution and start_execution stored procedures are used to initiate package execution.
  3. SQL Agent Job - A SQL Agent Job is used to schedule the execution of the SSISDB create_execution, start_execution, etc. stored procedures.
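Stitched together, the pieces look roughly like the T-SQL sketch below, which you would wrap in a SQL Agent Job step. The linked server, folder, project, and package names are all hypothetical, and the linked server needs RPC Out enabled for the remote procedure calls to work.

```sql
-- Create an execution in the Azure-hosted SSISDB catalog via the
-- linked server (four-part names), then start it.
DECLARE @execution_id BIGINT;

EXEC [AzureLinkedServer].[SSISDB].[catalog].[create_execution]
    @folder_name     = N'MyFolder',
    @project_name    = N'MyProject',
    @package_name    = N'MyPackage.dtsx',
    @use32bitruntime = 0,
    @execution_id    = @execution_id OUTPUT;

EXEC [AzureLinkedServer].[SSISDB].[catalog].[start_execution] @execution_id;
```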
There you have it. A rather painless way to execute Azure-based SSIS packages from your traditional SQL Server instance. This can be useful when migrating from on-premise and/or IaaS to PaaS.

Thursday, May 3, 2018

Integration Services Catalogs Node Not Visible in SSMS When Connecting to Azure Server w/ Azure SSIS Integration Runtime

So you went through all the steps to configure and deploy SSIS on Azure using an Azure SQL Server, Azure SQL Database (for SSISDB), Azure Data Factory v2 (ADF v2), and the Azure-SSIS Integration Runtime type but when you connect to the SQL Server using SSMS you don't see the familiar Integration Services Catalogs node. You see something like this:


Don't fret. You likely configured and deployed everything correctly and the service is eagerly awaiting your requests. All you have to do is explicitly specify the SSISDB when connecting. To do this, click the Options>> button in the connection dialog and enter SSISDB as the database name.


Click Connect and you should be good to go!


Tuesday, May 1, 2018

SSRS Continuous Deployment PowerShell Script using ReportingServicesTools

Way back in 2010 I wrote a post about how to use RS.exe to automate the publishing of SSRS objects such as reports, datasources, folders, etc. I got familiar with RS.exe at that time because I needed a way to include SSRS object deployment in a commercial software installer package. This time around I had a similar requirement, except that the deployment is now for a CD (continuous deployment) pipeline for an open source project that I am working on called SSIS Project Weaver. It is an SSIS execution framework that includes a number of SSRS reports used to monitor and administer ETL batches.

Instead of using my old tricks I decided to see if there was anything else new out there. Turns out there is. Microsoft was kind enough to release a PowerShell-based project called ReportingServicesTools which targets SQL Server 2012+. Once installed, this PowerShell module provides 40+ commands for interacting with an SSRS instance. Not unlike RS.exe, its commands are not always straightforward.

I won't go into detail about what each command does as there's plenty of existing documentation on that. What I will do is post the PS script I'm using specifically for my CD scenario.

My requirements were the following:
  1. Wipe all objects from SSRS instance. Keep in mind that the objective of this script is to test the build and deployment of the source code of my project, not deploy to a production environment. Just in case I have other projects being deployed to the same server I created a switch so that I can choose whether or not to execute this portion of the script.
  2. Create a folder in which to deploy the reports. This is not as straightforward as one would hope. The ReportingServicesTools command does not provide an option to "force" or overwrite if the folder already exists; if it does, the command fails.
  3. Deploy the datasource(s). Using PowerShell piping and object filtering we can deploy all of our datasources with a single reference to, but multiple executions of, the Write-RsCatalogItem command.
  4. Deploy all reports to a single folder. If you want to deploy to multiple folders you'll have to modify the script.
  5. Set all reports' datasources.
Before you can use the ReportingServicesTools commands and the script below, you must install the PS module by running the following command.
Install-Module -Name ReportingServicesTools

Write this script to a text file with a .ps1 extension.
# Script Parameters
Param(
  [string]$ReportServerUri = "http://localhost/ReportServer",
  [string]$RdlFolder = "C:\vsts-rel\SSISProjectWeaver-CI\44\reports",
  [string]$RsPath = "/",
  [string]$RsFolderName = "SSISProjectWeaver",
  [string]$DatasourceName = "SSISExecutionManager",
  [switch]$CleanReportServer
)

# Delete all objects
if ($CleanReportServer) {
  Get-RsCatalogItems -ReportServerUri $ReportServerUri -RsFolder '/' | Remove-RsCatalogItem -ReportServerUri $ReportServerUri -Verbose -Confirm:$false
}

# Create SSRS folder (only if it doesn't already exist, since New-RsFolder fails otherwise)
$targetFolder = Get-RsFolderContent -ReportServerUri $ReportServerUri -RsFolder $RsPath | Where-Object { $_.Name -eq $RsFolderName }
if ($null -eq $targetFolder)
{
  New-RsFolder -ReportServerUri $ReportServerUri -Path $RsPath -Name $RsFolderName -Verbose
}

# Create full path
$Destination = $RsPath + $RsFolderName

# Change local dir to location of .rdls
CD $RdlFolder

# Deploy datasource(s)
Dir $RdlFolder -Filter *.rds | Write-RsCatalogItem -ReportServerUri $ReportServerUri -Destination $Destination -Overwrite -Verbose

# Deploy all reports to a single folder
Write-RsFolderContent -ReportServerUri $ReportServerUri -Path $RdlFolder -Destination $Destination -Overwrite -Verbose

$DataSourcePath = $Destination + "/" + $DatasourceName

# Set each report's datasource reference
Get-RsCatalogItems -ReportServerUri $ReportServerUri -RsFolder $Destination | Where-Object TypeName -eq 'Report' | ForEach-Object {
  $dataSource = Get-RsItemReference -ReportServerUri $ReportServerUri -Path $_.Path
  if ($null -ne $dataSource) {
    Set-RsDataSourceReference -ReportServerUri $ReportServerUri -Path $_.Path -DataSourceName $dataSource.Name -DataSourcePath $DataSourcePath
  }
}

Here is a sample call of the script.
C:\deploy_ssrs_reports.ps1 -ReportServerUri "http://localhost/ReportServer" -RdlFolder "\\my-build-share\reports" -RsPath "/" -RsFolderName "SSISProjectWeaver" -DatasourceName "SSISExecutionManager" -CleanReportServer   

Monday, April 30, 2018

PowerShell PSSecurityException: UnauthorizedAccess

I just fought with this for close to an hour and the solution was REALLY simple so I thought I'd try and save someone else an hour of their life.

When trying to execute an unsigned PowerShell script (yeah let's not go there) from a VSTS release job I encountered the following error:



2018-04-30T20:35:34.8886378Z ##[error]. : AuthorizationManager check failed.
At line:1 char:3
+ . '\\exec-mgr-build\vsts-rel\SSISProjectWeaver-CI\38\deploy_ssis_proj ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : SecurityError: (:) [], PSSecurityException
+ FullyQualifiedErrorId : UnauthorizedAccess

2018-04-30T20:35:34.9115615Z ##[error]Process completed with exit code 0 and had 1 error(s) written to the error stream.

I was able to work around the error by using the local path (this wouldn't have worked if the PowerShell script was on another server). Even though I had a workaround, I wanted to be able to use a UNC path. Turns out all you have to do is add the path as a Trusted Site. You can do this within Internet Explorer's security options like so:


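If you'd rather not manage Trusted Sites on every build agent, a blunter alternative (assuming your security policy allows it) is to bypass the execution policy for just that one invocation. The share and script names below are hypothetical:

```
powershell.exe -ExecutionPolicy Bypass -File "\\my-build-share\scripts\deploy_ssis_project.ps1"
```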
Before I spend the rest of my day trying to explain why and how this works I am going to cut it short and get back to finishing my release definition. Hope this helps!

Thursday, April 26, 2018

Continuous Integration (CI) with Azure DevOps, SSIS, and SQL Server Database Projects

What are we doing here?

This is the first part of a two-part series covering 1) CI (continuous integration/build) and 2) CD (continuous deployment/release) for Visual Studio SQL Server Database and Integration Services project types on the VSTS (Visual Studio Team Services) platform.

Some background

The Visual Studio business intelligence (SSIS, SSRS, SSAS) and database project types have long been the red-headed stepchild of the Visual Studio ecosystem. Huge strides have been made in recent years, but there are still some impediments, one of which is creating CI/CD pipelines in Visual Studio Team Services.

Don't fret

While it's not as straightforward as, say, an ASP.NET web application, with just a little finessing it's not overly difficult to build and deploy SQL Server Database and Integration Services project types. If you're familiar with creating VSTS build and release definitions, it probably wouldn't take you too long to get it figured out. If you are new to this functionality, it can definitely be a bit overwhelming. Hence, this step-by-step guide to standing up a basic VSTS build definition that includes both database and SSIS projects.

Assumptions (you know what they say...)

The following steps assume that you have 1) a VSTS account, 2) a VSTS collection, 3) a VSTS project within the collection, and 4) a source control repository in VSTS Git, TFVC, GitHub, Subversion, Bitbucket Cloud, or Git (not necessarily hosted by VSTS).


This walkthrough will use an open source project called SSISProjectWeaver that is hosted on GitHub (using a different repository type requires little, if any, change). It's not necessary to know the details of the SSISProjectWeaver project other than that it contains a SQL Server Database project and two Integration Services projects.

<plug>If you're starting a new SSIS project it's worth checking out SSISProjectWeaver as it's a 100% non-invasive ETL framework that provides metadata-driven parallelism and dependencies between any two packages or groups of packages, extensive logging, reporting and dashboarding, alerting, and a bunch more. It's easy to set up and has been proven in large organizations with hundreds of packages running a day. You don't have to modify your existing code and if you don't like it you just stop using it. Nothing will have to change with your existing packages.</plug>

Let's do it

  1. Open your VSTS project and click on the Build and Release tab.
  2. Choose your source. For this example we are going to use GitHub.
  3. Once you've connected to the repository, select the branch that you want to build and click Continue.
  4. In the top right-hand corner under Select a template, click Empty Process. This will start us off with a blank build definition.
  5. Name your build definition appropriately and then choose the Agent queue. In this example we are using a Hosted VS2017 queue. Depending on your project you might want to use a local/on-premise build agent or VS2015. It all depends on your project types, versions, dependencies, etc.
  6. Before we define our build tasks let's create some variables. Click on the Variables tab and create the BuildConfiguration and BuildPlatform variables along with their corresponding default values.
  7. Navigate back to the Tasks tab and click the + next to Phase 1. Search for and choose the Visual Studio Build task. This task will use the out-of-the-box msbuild template for building SQL Server database projects with a .dacpac as the build output.
  8. Change the Display name to something meaningful.
  9. Choose the SQL Server database project or solution to build.
  10. Choose the version to build.
  11. Use the newly created variables for the Platform and Configuration properties.
  12. Save the task.
  13. Navigate back to the Tasks tab and click the + next to Phase 1. Search for and choose the Command Line task. We will use this task as a workaround for the lack of an msbuild template for SSIS projects by invoking devenv to build the projects.
  14. Change the version to 2.*(preview). This version allows multi-line commands similar to a .bat file.
  15. Change the Display name to something meaningful.
  16. Enter the following text into the Script box. The first 8 lines just echo (print line) info that is useful for debugging. The last line is the actual command that is being executed.
    echo BuildShare: $(BuildShare)
    echo Agent.BuildDirectory: $(Agent.BuildDirectory)
    echo Build.SourcesDirectory: $(Build.SourcesDirectory)
    echo Build.ArtifactStagingDirectory: $(Build.ArtifactStagingDirectory)
    echo System.DefaultWorkingDirectory: $(System.DefaultWorkingDirectory)
    echo BuildConfiguration: $(BuildConfiguration)
    echo BuildPlatform: $(BuildPlatform)
    echo Command: "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\Devenv.com" "$(Build.SourcesDirectory)\your_project_or_solution_name.(sln|proj)" /build $(BuildConfiguration)

    "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\Devenv.com" "$(Build.SourcesDirectory)\your_project_or_solution_name.(sln|proj)" /build $(BuildConfiguration)
  17. Save the task.
  18. Click on the Triggers tab and enable continuous integration. This will enable VSTS to execute this build definition each time code is committed to your source control repository.
  19. Test the build definition by clicking on Queue.
  20. Assuming your build was successful, you now have the CI half of your CI/CD pipeline and can auto-notify the world when one of your co-workers breaks the build!

Wrap Up

When it comes to technology, nothing is ever as simple as it appears. The SSISProjectWeaver solution is pretty straightforward in terms of code and dependencies. It's T-SQL and SSIS packages with no external dependencies. Building it is very straightforward, as you can see above. However, many projects are much more complex and require more complex configurations to set up a build definition. For example, the first time I created a build definition with an SSIS project it failed because I had references to 3rd party components. Of course Microsoft doesn't know I am using those components, so they're not on their hosted build servers. To make it work I had to give up the convenience of using a hosted build server and stand up my own build server with all the necessary components. Food for thought. Don't be discouraged if you hit a few stumbling blocks on your way to automated build bliss!

Stay tuned for part two which will cover CD (continuous deployment) and will walk you through getting your code deployment automated.

As always, if you have any questions feel free to drop me an email at david@elish.net.