Wednesday, December 21, 2011

TFS 2010 Can't Convert Branch to Folder

Branches can be removed from a TFS project by converting them back to a folder. This is usually done when a branch was created by accident and needs to be removed. The tricky part is when the Convert to Folder menu option doesn't appear. If you find yourself in this situation you're most likely using the context menu (right-clicking). Try navigating to File -> Source Control -> Branching and Merging. You should see Convert to Folder there.

I'm not sure if it's true, but a colleague of mine told me that Microsoft did this on purpose to help avoid accidental conversion of branches to folders by making the command harder to reach. It also makes the feature very confusing and frustrating - IMHO.

Thursday, December 15, 2011

SQL Server 2008 R2 Master Data Services: A Review

I recently had the opportunity to spend a couple of days working through a proof-of-concept using Microsoft's SQL Server 2008 R2 Master Data Services (MDS) application. I had been curious about the product ever since I had heard about it, as it is intended to solve a problem I have often encountered during the beginning stages of numerous data warehousing projects.

The problem I am referring to is finding a source for core organizational data such as products, customers, store locations, raw material types, and business units - the data that ultimately makes up the dimensional model's dimension tables. Oftentimes this data is spread across numerous systems or, better yet, lives in one or more spreadsheets that are maintained by numerous people, are missing data, and contain duplicates. In addition, the data that is there is often out of date or simply invalid.

A data warehouse is just about worthless if the dimensions by which the measures will be evaluated are of low quality. It could even be detrimental if business decisions are being made based on invalid data.

Before moving on to a technical solution to these data issues, it is very important that everyone involved in the success (or failure) of an MDM and/or data warehousing solution understand that technology will not make the problem of unreliable, disparate data go away. Technology acts as an enabling platform for well thought out, planned, and accepted data governance and stewardship. A brief definition of the two can be found here.

MDS assumes you have these things figured out and that you want to implement them on a technology platform that provides one or more of the following:
  1. Data modeling (entities, attributes, hierarchies, etc.)
  2. Data browsing
  3. Manual data maintenance (add, edit, delete) via a modern UI
  4. Bulk import and export
  5. Data and model versioning
  6. Audit capabilities (who changed what, when)
  7. Business rules for data validation
  8. Workflow-based data validation
  9. Ability to integrate (both ways) with other systems
  10. Fine-grained security
It's my experience that building an MDM database is easy. It's the UI, complex data validation, workflow, and integration with other systems that's the tough part. Lucky for me Microsoft now provides a platform that does all these things for me. How well it does them is what I spent a few days trying to figure out. Before I detail what I liked and didn't like about the product I will provide a general overview as well as an overview of the underlying architecture, technology, user interface, and integration points.

General Overview


The purpose of MDS is to provide a platform for the creation, maintenance, and integration of high quality master data throughout an organization.


MDS comes with the Enterprise and Data Center editions of SQL Server. It is licensed the same way as SSIS, SSAS, and SSRS in that it can be installed free of any additional license fees on a machine that already has a paid SQL Server license. However, installing it on a server without SQL Server requires paying for another license.


MDS can only be installed on a 64-bit server.


All model metadata and data (members) are stored in a SQL Server database.
The Master Data Manager web application and web service layer are built on ASP.NET and WCF, respectively, and are hosted by IIS. All web application methods ultimately pass through the WCF layer.
An optional Windows service acts as a broker between MDS and SharePoint so that MDS business rules can use SharePoint 2010 workflows (built on Windows Workflow Foundation).
A Configuration Manager web application, similar to that of SSRS, allows for such things as enabling the web service and attaching to an MDS database.

When new entities are created or modified via Master Data Manager, calls are made to the WCF based web service that ultimately result in the creation or modification of SQL Server tables, views, etc. Unfortunately the tables are created with a “code name” such as tbl_2_10_EN. However, this shouldn’t be an issue as all data access should be done via auto-generated views, exports, and/or web services. There is also a view that maps the table code names to meaningful entity names.
There are a number of CLR-based functions that are used for merging, de-duping, and survivorship based on rules created by the data stewards and/or administrators.
Service Broker is used in scenarios where many emails are being sent or large amounts of data are being loaded. This allows for asynchronous processing that frees up the users to continue using the web application.
Database Mail, which resides in the msdb database, is used to send e-mail alerts.
Staging tables are used when importing data into MDS. This allows for a load to be completed in a single batch once it has been staged without error.
IIS hosts an application named “MDS” by default. It also hosts a WCF service called Service.svc, which is located in the same directory as the application. It is highly recommended that all custom processes use the web service to interact with the database.
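To get a feel for what MDS generates under the covers, you can poke around the MDS database with ordinary catalog queries. This is just an exploration sketch; the tbl_ naming pattern is what I observed and could vary:

```sql
-- Explore the MDS database: list the generated entity tables
-- (code names like tbl_2_10_EN) and the views exposed for data access.
-- These are plain SQL Server catalog queries, nothing MDS-specific.
SELECT s.name AS schema_name, t.name AS table_name
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
WHERE t.name LIKE 'tbl[_]%'
ORDER BY t.name;

SELECT s.name AS schema_name, v.name AS view_name
FROM sys.views AS v
JOIN sys.schemas AS s ON s.schema_id = v.schema_id
ORDER BY s.name, v.name;
```

The second query is how you find the auto-generated views and the mapping view that translates table code names to meaningful entity names.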

User Interface (Master Data Services Manager)

Allows administrative users to create data models to hold entity members (the term member is used as in dimensional modeling, referring to a row/entity instance).
Allows users to add, edit, and delete members (entity instances).
Allows for the creation of hierarchical relationships between members.
Allows for the creation of collections, which group members in non-mutually-exclusive ways.
Copies of entity data and related metadata (models) can be archived to create a separate version of the data.
Includes a business rules engine that can be used to enforce data quality as well as assist with data stewardship via workflows. Both data validation and manipulation rules as well as workflows have the ability to send e-mail alerts.
A granular security model can be used to fine-tune specific users’ access to certain entities, attributes, and even members (rows).

Integration Points


When importing bulk data it is necessary to get that data into existing SQL Server staging tables. This can be done by making calls to out-of-the-box stored procedures and/or writing directly to the staging tables. For scheduled and programmatic imports, custom SSIS packages are recommended. Once these tables are populated you can invoke the import by using the API, web service, or the Master Data Manager user interface.
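As a rough illustration, a staging load might look something like the following. The table and column names are from my notes on the 2008 R2 staging schema (mdm.tblStgMember and mdm.tblStgMemberAttribute) and should be verified against your own MDS database; the model, entity, and member values here are made up:

```sql
-- Hypothetical sketch: stage two Product members for import into MDS.
-- MemberType_ID = 1 indicates a leaf member in the 2008 R2 schema (verify).
INSERT INTO mdm.tblStgMember
    (ModelName, EntityName, MemberType_ID, MemberName, MemberCode)
VALUES
    ('Product', 'Product', 1, 'Widget A', 'WID-A'),
    ('Product', 'Product', 1, 'Widget B', 'WID-B');

-- Stage an attribute value for one of the members.
INSERT INTO mdm.tblStgMemberAttribute
    (ModelName, EntityName, MemberType_ID, MemberCode, AttributeName, AttributeValue)
VALUES
    ('Product', 'Product', 1, 'WID-A', 'Color', 'Red');
```

Once the rows are staged without error, the batch can be processed from Master Data Manager or kicked off through the web service.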

Web Service

All MDS functionality can be executed via the IIS-hosted, WCF-based web service named Service. It can be used to build a custom UI and/or to integrate MDS with existing or new applications.


Although using the web service directly is recommended, MDS provides three .NET assemblies (Microsoft.MasterDataServices.Core, Microsoft.MasterDataServices.Deployment, Microsoft.MasterDataServices.Services). These are typically used for administrative functions such as creating and deploying update and deployment packages and cloning and deleting models.


I got started with the tool by installing it on a VMware Player virtual machine running Windows Server 2008 R2. The install was relatively easy, although I did have to do some post-install tinkering to get IIS configured correctly to serve the web app and WCF web service. MDS has a configuration utility that feels somewhat similar to the SSRS configuration utility. It was pretty easy to use, although even after selecting its option to enable the web service you still have to change the web.config file to actually enable it.

User Interface
The web-based user interface is on the right track, but it is unbearably slow. It can take 15-30 seconds to respond to a simple request. I have seen others complain, and responses saying that once the application caches some data things should speed up. I have seen some improvement, but it's still way too slow. There are also no indicators letting you know that the application is doing something, so many times I found myself wondering if I had even clicked something.

I also found myself trying to remember where to go to do certain tasks. I couldn't quite put my finger on it, but the UI just doesn't seem intuitive and is definitely inconsistent in some areas. For example, many pages have data grids that look like they are for display only, but if you click on a row you are presented with add, edit, and delete buttons. Additionally, some pages require the user to double-click on a cell in a grid to edit the value. It took a while for me to figure this out, as there was no indication that this was the expected behavior.

System Administration
The System Administration area of the application is where models, entities, attributes, etc. are created and maintained.

I found it easy enough to create entities and attributes, although there seems to be a lack of data types for attributes. The biggest shortfall for me was the lack of a Boolean data type. The workaround is to create a domain-based attribute with values of yes/no or true/false. Seems kind of lame to me.

There are also the concepts of derived hierarchies versus explicit hierarchies, as well as collections. It was unclear to me when to use which feature.
Attribute groups are a nice feature that lets you group related attributes (address attributes, for example) onto separate tabs for end users to maintain.

The business rules functionality is pretty nice. It lets you define complex, row-level validation and data maintenance but doesn't allow you to access any aggregate or related values in your logic. One of the actions that can be triggered is a workflow - a SharePoint 2010 WF-based workflow. My feeling is that there's not much special here, as you have to develop a custom WF component and deploy it to your SharePoint server to make it work. All that MDS provides is the ability to call this workflow, which can be done from just about any application. It falls short for me.

As much as I want a free tool to ease my workload, MDS just doesn't feel ready for prime time, although I'm on the fence as to whether I would use it in an environment that had no other tool and could install it on a machine whose SQL Server license was already paid for. It should be noted that a new version will be released with SQL Server 2012 and promises to be a substantial improvement over the current one.

Generate Insert Statements for SQL Server

Many times it is necessary to script the insert of rows into a SQL Server table as part of a release, bug fix, maintenance process, etc. I have done this through T-SQL, Excel, and I'm sure a few other ways.
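For a small lookup table, the T-SQL approach is simple enough: concatenate the statements yourself. A sketch (the table and column names are illustrative):

```sql
-- Build INSERT statements for a hypothetical dbo.StatusCode lookup table.
-- REPLACE doubles any embedded single quotes so the output is valid T-SQL.
SELECT 'INSERT INTO dbo.StatusCode (StatusCodeId, Description) VALUES ('
     + CAST(StatusCodeId AS varchar(10)) + ', N'''
     + REPLACE(Description, '''', '''''') + ''');'
FROM dbo.StatusCode
ORDER BY StatusCodeId;
```

Copy the result set into your release script. It works, but it gets tedious for wide tables or large numbers of tables.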

It turns out Microsoft provides a tool to do this for us with ease. It is called the Database Publishing Wizard. This tool can script both DDL and DML. The actual executable is named SqlPubWiz.exe and can be found in a path similar to C:\Program Files (x86)\Microsoft SQL Server\90\Tools\Publishing\1.4.

Give it a shot. It's a nice little tool to have in the toolbox.

BIDS (Visual Studio 2008 Shell) with TFS 2010

BIDS is not compatible with TFS 2010. It has to do with TFS 2010 having the concept of Project Collections whereas 2005 and 2008 do not. When adding a Team Foundation Server to Visual Studio Team Explorer 2008 (Tools -> Connect to a Team Foundation Server -> Servers -> Add) there is no way to reference a TFS Project Collection. However, it is possible to get your Visual Studio 2008/BIDS environment up to speed.

There are four things you need. 
1) Visual Studio Team System 2008 Team Explorer downloadable here.
2) Visual Studio 2008 SP1 downloadable here.
3) Visual Studio Team System 2008 SP1 Forward Compatibility Update for Team Foundation Server 2010 downloadable here.
4) Lastly you'll need to create a registry entry for each Server/Collection you want to use. Launch the registry editor (regedit at a command prompt) and create an entry in HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\TeamFoundation\Servers similar to below. 

    Value name: ServerName
    Value data: http://ServerName:PortNumber/tfs/CollectionName
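Equivalently, the entry can be scripted as a .reg file, which makes it easy to distribute to a team. ServerName, PortNumber, and CollectionName are placeholders for your own values:

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\9.0\TeamFoundation\Servers]
"ServerName"="http://ServerName:PortNumber/tfs/CollectionName"
```

Double-clicking the file (or running it with regedit /s) imports the entry for the current user.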

Wednesday, December 14, 2011


Microsoft strikes again by replacing a technology that was just beginning to gain a foothold with a product that has some nice improvements but does not replace all of the existing functionality. Here is a link to a long-awaited feature comparison between the SQL Server Data Tools SQL Server database project (SSDT) and its predecessor, the Visual Studio Database project (VSDB).

Dimensional Modeling Skills Assessment

As everyone knows, it's extremely hard to hire good people. It's equally hard, if not harder, to actually identify good people. I have previous posts with interview questions that dive into the details of T-SQL, SSRS, SSIS, SSAS, modeling, etc. Unfortunately, many of these questions can be answered by those who have done a lot of reading and memorizing. It's my belief that the best technical interviews uncover a candidate's thought process, problem-solving skills, and past experience. I recently put together a practical exam that attempts to do this. While I have only administered it once at this point, I am confident that it is a good start to a practical skills assessment, which I think should be part of any developer interview.

The assessment consists of business requirements, modeling requirements, an OLTP normalized, physical data model and some questions. The candidate is asked to review the requirements and data model and create a dimensional model that will serve as the basis for a data warehouse. There are also some follow-up questions.

The current version of the assessment asks for a dimensional model but has some questions that cross over into the physical world and assume a SQL Server environment. It can probably be improved, or at least made applicable to more organizations, by making it more generic. However, the position I created it for requires some knowledge of SQL Server, so I incorporated that into the test.

Below is the problem statement and the physical OLTP data model. You can download the entire assessment here. Please e-mail me at if you'd like the answers.

You are a consultant and have been engaged by a manufacturing company to develop a business intelligence solution to help them reduce their raw material costs. Understanding the Purchase Order process which includes both purchases and receipts against those purchases is key to any potential cost reduction program.
The project’s sponsor has requested a first iteration of a Kimball-style dimensional model representing the material acquisition process. This model will be the basis of a star-schema based data warehouse.  It is your task to develop this model.