Success in Releases

I work for a small company (150 or so people) with a limited IT budget.  When I started with the organization, emergency releases happened nearly every month, sometimes two or three times in a single month. We have now gone over a year without an emergency release. This is a huge success, and more than one change brought it about. As the architect of the group, I am proud of and very happy with the results of the changes.  Here are the key factors:

Requirements Gathering

We brought a Business Analyst into the department. This may seem like a huge investment for such a small company, but the person we brought into the process had worked in several departments within the organization. That intimate knowledge of business processes was the key to helping us technical people understand, in more depth, what the business was asking for and why. This helped us produce what the business wanted rather than merely what they asked for.

Improved Sprint Process

We limit the number of change tickets per department to two (more if another department has none) and never overcommit. This may sound like it wouldn’t work in the real world, but we are very successful at completing what the business needs in what they perceive as a reasonable time frame. Our old system was to commit to a huge number of items and complete a fraction of them, leaving the business with the perception that we could not get anything done. The pressure was high, the perceived failure was high and morale was low.  With the reduced workload, code quality is higher, satisfaction with the work is better and the business sees the IT department as successful because we complete what we say we will.

Code Reviews

Code reviews were hard to work into the development process, but the results are unquestionable. Every developer hates code reviews: they take away from coding time, they take away from getting stuff done. The truth is, they make coders more accountable, they catch bonehead mistakes before they end up in production, they help with writing code that adheres to standards, they help coders comply with the architecture of the application, they help the coders know all of the code; they just help. The time taken away from coding is well spent looking over new features and functionality with an eye on the prize: quality. Great code isn’t an accident; great code is crafted.

Definition of Done

One of the keys in Scrum is the definition of done. To a coder, done might be when the code passes unit or evaluation tests. To a business analyst, done is when the code is in test and appears to work as expected.  To the business, done is most often when the feature or fix is in production and can be used. Having all parties agree on what ‘Done’ means to the whole organization puts everyone at the same level of expectation. For us, a ticket is closed when it is validated in production.

I hope to have another year of no emergency releases. Improving the process of making software has been very beneficial to the organization and the customers we serve.

Set Windows Wallpaper from Bing

The other day, I was looking at a colleague’s beautiful wallpaper and thought it would be nice to set a daily wallpaper, but without the fuss of finding one every day. I found some sample scripts to get the Bing daily image and other scripts to set the wallpaper to an image.  Here’s my version, written in C# with LinqPad.
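The idea boils down to two steps: fetch today’s image URL from Bing’s `HPImageArchive` endpoint, then apply it with the Win32 `SystemParametersInfo` API. Here is a minimal sketch of that approach (not my exact script; the helper names and the temp-file path are illustrative, and the wallpaper call only works on Windows):

```csharp
using System;
using System.IO;
using System.Net;
using System.Runtime.InteropServices;
using System.Text.RegularExpressions;

static class BingWallpaper
{
    const int SPI_SETDESKWALLPAPER = 0x0014;
    const int SPIF_UPDATEINIFILE = 0x01;
    const int SPIF_SENDCHANGE = 0x02;

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern int SystemParametersInfo(int uAction, int uParam, string lpvParam, int fuWinIni);

    // The JSON from HPImageArchive contains a "url" field relative to bing.com.
    public static string ImageUrl(string json)
    {
        var match = Regex.Match(json, "\"url\":\"(.*?)\"");
        return "https://www.bing.com" + match.Groups[1].Value;
    }

    public static void Main()
    {
        using (var client = new WebClient())
        {
            var json = client.DownloadString(
                "https://www.bing.com/HPImageArchive.aspx?format=js&idx=0&n=1");
            var file = Path.Combine(Path.GetTempPath(), "bing-wallpaper.jpg");
            client.DownloadFile(ImageUrl(json), file);

            // Set the wallpaper and persist the change to the user profile.
            SystemParametersInfo(SPI_SETDESKWALLPAPER, 0, file,
                SPIF_UPDATEINIFILE | SPIF_SENDCHANGE);
        }
    }
}
```

Schedule it with Task Scheduler and the wallpaper refreshes itself every morning.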

NDepend Review version 2017.3.1 Professional Edition

NDepend is a static code analysis tool to improve code quality, visualize dependencies and reduce technical debt.

What’s included

What you get is a zip file containing the NDepend application suite, whose contents can be dropped into a folder of your liking.  Included in the package are the following files in the root folder:

  • Console.exe – Command-line version of NDepend for use by Continuous Integration (CI) tools and build servers.
  • PowerTools.exe – Interactive command line showing how the API can be used by developers; the source code is included in the NDepend.PowerTools.SourceCode subfolder.
  • VisualStudioExtension.Installer.exe – Installs the extension for Visual Studio.
  • VisualNDepend.exe – Standalone version of NDepend.

There is an Integration subfolder with detailed information on how to configure NDepend with your favorite Continuous Integration (CI) server.  (You are using a CI server, right?)  On the web site, there are extensions and configurations for many of the most popular CI servers, including TeamCity, CruiseControl.NET, FinalBuilder and others.  Visual Studio Team Services and TFS 2017 are supported via an extension downloaded from the Visual Studio Marketplace.  After a full analysis, a web page is created with the analysis details, which is great for inclusion in a CI system.

NDepend rules are written in a LINQ syntax, and each rule is displayed when looking at its issues.  Each rule also contains a description with suggestions on how to correct the issue, which I found helpful when the issue itself was not clear.  Rules can also be disabled on a per-project basis.

Here’s an example of the LINQ syntax (note the description comments):

// <Name>High issues (grouped per rules)</Name>
let issues = Issues.Where(i => i.Severity == Severity.High)
let rules = issues.ToLookup(i => i.Rule)
from grouping in rules
let r = grouping.Key
let ruleIssues = grouping.ToArray()
orderby ruleIssues.Length descending
let debt = ruleIssues.Sum(i => i.Debt)
let annualInterest = ruleIssues.Sum(i => i.AnnualInterest)
let breakingPoint = debt.BreakingPoint(annualInterest)
select new { r,
   Category = r.Category,
   ruleIssues,
   debt,
   annualInterest,
   breakingPoint }

// Issues with a High severity level should be fixed quickly, but can wait until the next scheduled interval.
// **Debt**: Estimated effort to fix the rule issues.
// **Annual Interest**: Estimated annual cost to leave the rule issues unfixed.
// **Breaking Point**: Estimated point in time from now, when leaving the
// rule issues unfixed cost as much as fixing the issues.
// A low value indicates easy-to-fix issues with high annual interest.
// This value can be used to prioritize issues fix, to significantly
// reduce interest with minimum effort.
// **Unfold** the cell in *ruleIssues* column to preview all its issues.
// **Double-click a rule** to edit the rule and list all its issues.
// More documentation:


Installation and First Use

The first thing I did was install the Visual Studio extension, which was not the best move. My favorite add-in is ReSharper, but the installation of NDepend broke ReSharper and caused my Visual Studio to become very unstable. After several install/uninstall rounds, I figured out that I could disable the automatic analysis in favor of a request-based analysis, which made VS more stable but still left issues.  With the newest version of NDepend, I was able to install the extension in Visual Studio 2017 without any instability. This review is based on the standalone version, which worked just fine for all the tests I ran.

For the purposes of this review, I will be using a project called SchemaZen, found on GitHub, as it was a project I was working on and it has unit tests, which I thought was important for the analysis.  I needed to make a few changes to the project: I wanted a command-timeout command-line parameter, and I wanted tables, foreign keys and other table schemas to be in the same folder.

The process followed for this review:

  • Pull the latest version from GitHub
  • Load in VS2017, Full Rebuild
  • NDepend – First Analysis
  • Perform the code changes
  • Rebuild
  • NDepend – Second Analysis

First Analysis


The information provided on the first analysis is pretty informative.  Let’s break down some of the key points.

Code counts and diffs between versions (more on that in the second analysis).


The Debt, Coverage, Complexity and overall code quality section.  This project is pretty good, with a C rating, but of course it could be better, and the time to get to a B is just a few hours.  The 258 issues include 26 ranked High.


A click on the High Issues produces this list:


Long functions and classes are a part of this solution so I understand the list of issues.

Second Analysis

I added my code changes with the following result:


It appears my changes were good for functionality but bad for the long term stability of the application.


From a first-use standpoint, one might conclude that NDepend is just for simple information, but the benefits come from long-term use.  The ability to see positive and negative changes over time helps make concrete plans to clean up the code; without such a tool, finding that time might be a full-time challenge. Incorporating NDepend into the build process would help the development staff understand how the smallest of changes affect the long-term stability of the application.

Architecture is a key factor in maintaining an application in the long term; the Dependency Matrix view helps show where the code is used and how much.



The analysis provided by NDepend is excellent even for smaller projects.  I found it most useful at first on a smaller project, until I better understood the interface and the information provided. When I analyzed our huge internal application, the results were overwhelming (over 2K types, 574K IL instructions, etc.). This is where NDepend could help over time, as that initial run would be the baseline for future analyses. The standalone NDepend application is fast and churned through our application in about a minute. For our application, the dependency matrix was very useful in determining how to refactor the code to reduce direct coupling of classes and methods.  From a code review perspective, NDepend can help get our team to focus more on direct coupling as well as object relationships.  I still believe in code reviews and will still push the development team to perform them; they are a good learning tool as well as a good way to keep the business benefit in view. NDepend is an excellent tool for wrangling code into best practices and bringing potential long-term code issues to light without spending weeks or months manually analyzing the code base.

Dispose in .NET

Recently we were working on merging PDFs on our web site when we ran into a strange problem. The user could run the merged report once, but the second time the server would get a “File Locked” error. Here’s the original code:

public static string CreateMergedPdf(string targetPdf, List<string> pdfFiles)
{
    using (FileStream stream = new FileStream(targetPdf, FileMode.Create))
    {
        Document pdfDoc = new Document(PageSize.LETTER);
        PdfCopy pdf = new PdfCopy(pdfDoc, stream);
        pdfDoc.Open();

        foreach (string file in pdfFiles)
            pdf.AddDocument(new PdfReader(file));

        if (pdfDoc != null)
            pdfDoc.Close();
    }

    return targetPdf;
}

Seems simple enough, but we obviously weren’t cleaning up our work, as the files remained locked.  We searched for all the places where we were referencing disposable objects and finally came up with this:

public string CreateMergedPdf(string targetPdf, List<string> pdfFiles)
{
    using (var stream = new FileStream(targetPdf, FileMode.Create))
    using (var pdfDoc = new Document(PageSize.LETTER))
    using (var pdf = new PdfCopy(pdfDoc, stream))
    {
        pdfDoc.Open();

        foreach (var file in pdfFiles)
        {
            var reader = new PdfReader(file);
            pdf.AddDocument(reader);
            reader.Close();
        }
    }

    return targetPdf;
}


The lesson learned was clear by the end: you must close and dispose whenever you are able. Another thing to note is that the line

pdf.AddDocument(new PdfReader(file));

came from an example, and the PdfReader it creates is never closed; that last remnant of the locking situation cost us the most time to find.

LinqPad – Get Disk Information

Getting hardware information can sometimes be hard in .NET; the Framework keeps many of the details away from the developer.  One day I needed to know whether a drive letter existed and what type of drive it was.  I found many ways to do this, but the one I liked best is extremely simple from a code perspective:


// Requires a reference to System.Management
var drive = "C";
var disk = new ManagementObject("Win32_LogicalDisk.DeviceID=\"" + drive + ":\"");
disk.Get();   // throws if the drive letter does not exist
disk.Dump();  // LinqPad: show all properties, including DriveType
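If you don’t need WMI, plain System.IO.DriveInfo can answer the same questions with no extra references; here is a sketch of that alternative (the `GetDriveType` helper name is my own):

```csharp
using System;
using System.IO;
using System.Linq;

class DriveCheck
{
    // Returns the drive type for a letter such as "C", or null if it doesn't exist.
    public static DriveType? GetDriveType(string letter)
    {
        var drive = DriveInfo.GetDrives()
            .FirstOrDefault(d => d.Name.StartsWith(letter + ":", StringComparison.OrdinalIgnoreCase));
        return drive?.DriveType;
    }

    static void Main()
    {
        Console.WriteLine(GetDriveType("C") ?? (object)"not found");
    }
}
```

The WMI route exposes far more properties (volume serial, file system, free space per logical disk), so which one fits depends on how much detail you need.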

LinqPad – Enum to String

Converting an enum item to a string in C# is pretty simple, but we had a situation where the enum value was not what we wanted to display to the user on the web.  Looking around, I found several examples of how to make the displayed version more user-friendly.  Here is an example:

public enum SysModeType
{
    Admin,
    User    // Admin is the value from this post; the other members are illustrative
}

We wanted to show “Administrator” on the web site, but internally we were using “Admin”. We could have done a search and replace for all Admin entries, but there were other implications to the enum question in my mind. What if I wanted to display “Student Teacher” or “Intern/Unpaid” or any other combination that is not compatible with C# identifier syntax?  I came up with several solutions and then went one step further to test the performance of each.  The linked file shows what worked and what didn’t.  I leave it to you to decide which one works for you.

LinqPad File: Utility – Enum to string.linq

LinqPad – Active Directory User Groups

The other day I needed to know which Active Directory groups a user was assigned to.  Not being an operations person, I couldn’t go use the tools on the server.  I decided there must be an easy way to get this done.  After a bit of searching, I came up with this LinqPad script, using some assemblies Microsoft provides.


// Requires references to System.DirectoryServices and System.DirectoryServices.AccountManagement
string username = "hlord";
string domain = "MyDomain";

var domainGroups = new List<string>();
var domainContext = new PrincipalContext(ContextType.Domain, domain);
var user = UserPrincipal.FindByIdentity(domainContext, username);
var authGroups = user.GetAuthorizationGroups();
authGroups.All(g => {
		if (!string.IsNullOrEmpty(g.Name) && !domainGroups.Contains(g.Name))
			domainGroups.Add(g.Name);
		return true;
});

domainGroups.Dump();  // LinqPad: list the group names

LinqPad File: LDAP – User Groups.linq

.NET Compiled in Debug? How do you know?

Recently I had a need to find out if a build was in Debug mode. Our build process had produced debug code in production, causing all kinds of issues, so I went out onto the web to find a way of determining whether an assembly (or executable) was built in debug.  It was much harder than I initially thought to get this information.

The article posted at The Black Box of .NET had the answer I was looking for.
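The usual technique (and, as I recall, the heart of that article) is to read the assembly’s `DebuggableAttribute` via reflection. A sketch of the check, assuming JIT tracking as the marker of a Debug build (the `BuildInspector` name is mine):

```csharp
using System;
using System.Diagnostics;
using System.Reflection;

public static class BuildInspector
{
    // A Debug build (csc /debug without /optimize) emits [Debuggable] with
    // JIT tracking enabled; Release builds either omit the attribute or
    // disable tracking.
    public static bool IsDebugBuild(Assembly assembly)
    {
        var attr = (DebuggableAttribute)Attribute.GetCustomAttribute(
            assembly, typeof(DebuggableAttribute));
        return attr != null && attr.IsJITTrackingEnabled;
    }
}
```

To audit a deployed server you can load each binary with `Assembly.LoadFrom(path)` and run it through `IsDebugBuild`.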

.NET 4 – Tuple instead of out parameters

.NET 4 introduced a tuple class which can be very useful.

Wikipedia defines a Tuple as “… a tuple is an ordered list of elements.”

MSDN defines a Tuple as “A tuple is a data structure that has a specific number and sequence of elements.”

You may be asking right now, “What am I supposed to do with that?”  I hope to give you a good idea by the time you are done reading this post.  I will provide a few examples of why, when and how to use them.  Let’s get to it.

One of my pet peeves is output parameters, also known as ByRef in VB.  I know there are many reasons to use them, but the default use seems to be when you need to return more than one value from a function.  Consider the following:

string outParameter1, outParameter2;
var v = OutParameterFunction("Test", out outParameter1, out outParameter2);
Console.WriteLine("return value: {0} :: out parameters: {1}, {2}", v, outParameter1, outParameter2);

public static int OutParameterFunction(string inParameter, out string outParameter1, out string outParameter2)
{
     var retValue = 0;
     outParameter1 = null;

     if (!string.IsNullOrEmpty(inParameter))
     {
         outParameter1 = inParameter;
         outParameter2 = "Kewl";
         retValue = 1;
     }
     else outParameter2 = "NOT Kewl";

     return retValue;
}
Simply put, multiple values are being returned: the integer return value plus the output string parameters.

It would be easy to circumvent the issue by creating a class to hold the multiple values and returning the class.  But what happens when you have a bunch of permutations of data?  You could end up with dozens of container classes, increasing the cost of maintenance.

Another option would be to use an array of object.  Although it can be done, it has a huge performance impact and forces the caller to cast every element, losing type safety.

The tuple option looks like this:

var t = TupleFunction("Test");
Console.WriteLine("return value: {0} :: {1}, {2}", t.Item1, t.Item2, t.Item3);

public static Tuple<int, string, string> TupleFunction(string inParameter)
{
     var retInt = 0;
     var retString1 = string.Empty;
     string retString2;

     if (!string.IsNullOrEmpty(inParameter))
     {
         retString1 = inParameter;
         retString2 = "Kewl";
         retInt = 1;
     }
     else retString2 = "NOT Kewl";

     return Tuple.Create(retInt, retString1, retString2);
}
The code performs the same overall function but does not rely on output parameters.  You may object that the source is the same length and seems to hide the values inside a strange object.  But the caller of the tuple function does not have to create any container variables except for the return value.  Consider the situation where the call needs to be extended to return a third value, or a fourth: the tuple can be extended with more values, and the caller can deal with the longer return set or not, as the caller chooses (if you like var), because the shape of the call is determined at compile time, not at run time.

There are some drawbacks to tuples:

  1. You can’t change the values once placed into a tuple.
  2. Unlike list types, you can’t iterate through the values.

I hope this helps.

Windows: Wired vs Wireless

This morning I started up my work laptop after placing it gently yet firmly into the docking station and saw that my connection to the network was over the wireless network.  Something inside of me thought, “The docking station has a wired connection, why not use it rather than a wireless connection?”  Now that I have considered it for more than a few minutes, I am a little upset about the wireless preference when a perfectly good wired connection exists.

Armed with a sense of righteous empowerment and a little noodle power, I think I have come up with a surefire way for Windows to choose the right connection given a set of scenarios.  Here goes:

  1. Speed advantage – If one connection is faster than the other, go with the fastest.
    • Pros – The fastest will result in the best experience for the user.
    • Cons – How often does the measurement need to take place?  Do we warn the user when switching, in case they are streaming or connected via VPN or any other connection where the adapter and the data flow matter?
    • Net – I don’t see this as a viable option when it comes to configuration because the cons are too many and too hard to interpret.
  2. All network connections – Have no preference and use them all at once.
    • Pros – Amazing throughput for applications that can use more than one connection at a time.
    • Cons – Incredibly complex implementation.  If wired and wireless point to the same network, there would be no advantage to the multiple connection points.  Possible application issues for apps that require a specific MAC or IP.
    • Net – Too many potential issues, but if it could be done, the result could be amazing.
  3. Use wired as primary and wireless as secondary, or allow the user to select a preference.
    • Pros – Simple and predictable; wired is usually better for speed and throughput; the user can override the preferential order; the functionality for wireless already exists.
    • Cons – The user needs to know which is faster; the user may get confused if too many networks are available.
    • Net – Easiest for Windows to support because half of the functionality already exists.

I don’t think making the third option work would be too hard, and third parties may be able to get the other options working.  I will make this request through the Microsoft support network, but I am also interested in your feedback.

The PowerShell Journey: Part 1

As a developer, I try to use the latest and greatest, and when something good comes along I try to pick up as much of it as I can.  The use of command or batch files has been a part of my world for as long as I have used computers.  Back in the day, there were batch menus for DOS computers so people with little or no computer experience could get to their applications (Lotus 1-2-3, WordStar, etc.).  The days of DOS menus have passed, but the power of the command shell remains.

The New Command Shell for Windows

I know I am late to the party when it comes to PowerShell, but when it comes to shells I am skeptical first and excited or dismayed second.  Recently there has been more and more need to run a shell with more power than VBScript could hope to provide; I was skeptical, but I now see the fruit of such a pursuit.  When PowerShell hit the street I hardly noticed, because I was very happy with my shell and could do anything with a cscript command that could not be done with old DOS commands.  Thanks, but no thanks.

Then something changed: I needed to run a script to start and stop an IIS application pool on another server.  I looked around and found this, which clearly uses .NET to control a remote IIS web application pool.  I was about to translate it to VBScript when I thought about what PowerShell is supposed to be able to do out of the box.  I pursued it and came up with the following script:

# Action : Start, Stop, Recycle
param([string]$server, [string]$AppPool, [string]$Action)
$sPath = "IIS://" + $server + "/W3SVC/AppPools/" + $AppPool
$de = New-Object System.DirectoryServices.DirectoryEntry($sPath)
$de.Invoke($Action)   # performs the requested action, e.g. "Recycle"
I know it isn’t sexy, but it performs the task I needed. This is just the beginning; I’m sure there will be more scripts with more meat in them. Stay tuned: I am starting a series, The PowerShell Journey.

Backup and Archive

I have been using USB-connected drives to back up the computer for a while now.  I can’t remember exactly when I started, but it was around the time my daughter was born (Jan 2006).   Before the USB drive backup system, I was using DVDs, several of them, and waiting a long time for the backup to complete.   Those backups were made with an application that would compress the data and write it out to the disks in a proprietary format.  One day I had a hard drive crash; I found the disks, but I had not backed up the application itself.  I found the program later, but the whole restore process was not pleasant.  In came USB drives, and I could back up the files intact without using a backup program, and the world was well again.

Four years later, I am backing up to an online storage service called BackBlaze, and I love it for the most part.  The service took a very long time to do the initial backup, but I have slept well at night since then, knowing I have a copy of all of my files safe and sound.  The service does not keep a history of the files backed up, though, nor a list of old files no longer on my primary or connected hard drives.

I still save an archive of old files and such to two USB drives.  The first is a 320GB drive used for the photos and videos we have taken over the years and the second is a 1.5TB drive for everything else.

Everyone needs a backup solution; make sure you find one that works for you.

Code Simplification: String Maximum Length

I was looking at some old code and ran across something that struck me as wrong.

string temp;
temp = "There can only be one!";
if (temp.Length > 15) temp = temp.Substring(0, 15);

Then I thought we could use a little math to handle the problem:

string temp;
temp = "There can only be one!";
temp = temp.Substring(0, Math.Min(15, temp.Length));
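This pattern also packages nicely as an extension method (a sketch; the name `Truncate` is my own):

```csharp
using System;

public static class StringExtensions
{
    // Returns the string capped at maxLength characters; shorter strings pass through.
    public static string Truncate(this string value, int maxLength)
    {
        if (value == null) return null;
        return value.Substring(0, Math.Min(maxLength, value.Length));
    }
}
```

With that in place, `"There can only be one!".Truncate(15)` gives the first fifteen characters, and short strings come back unchanged.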

I think Microsoft should have included this kind of method in the String class. Hope this helps.

AMD: Remember you are a technology company!

AMD seems to have lost itself somewhere along the way.  I have owned several AMD-based systems over the years, from a 386 clone to an Athlon several years ago.   Having a deep love for all things hardware and software, I like to know the technical specifications of just about anything I read about.  The type of hardware doesn’t matter, be it a hard drive, case, power supply or processor; it’s all good to me.  That is why I need to convey my deep disappointment with AMD.

I saw a new laptop on Newegg‘s web site and was intrigued to see a processor I had never seen before.  The processor is a mobile Phenom II quad-core, and the machine looks like an upper-middle-end machine, except that I knew nothing about the processor.  I assumed that going to the AMD web site would yield all kinds of information about it.  I was utterly flabbergasted to see it missing from the list of mobile processors on the home page.  I then searched the site for the phrase “mobile Phenom” with odd results: the first two links named the processor but offered no link to any technical documentation, just all kinds of marketing.  The third link took me to a page listing the classifications for the processors, with “AMD Notebook Platform for Home” and “AMD Notebook Platform for Work” listed, but when I clicked on those I got more of what I found in the initial search, marketing material.  Then I saw, on the far right, a link to “Compare Processors” with notebook processors listed, and there is a compare page.  I selected the first “AMD Phenom™ II Quad-Core Mobile Processors” option; there were two processors, I clicked on the P620, and no luck.  Just the basic technical facts about the design and manufacturing of the processor.  Don’t they think we want to know about the power management features, the memory controller, the memory type it uses, etc.?  Twenty minutes and nothing to show for it.

I gave up at this point and looked for more detail outside the AMD site, only to be disappointed again.  Another five minutes led me to all kinds of speculation on performance, because AMD has no material saying how the processor performs relative to the old line of mobile processors or even compared to the desktop version.

How can I buy a machine when I know virtually nothing about the processor?

AMD, I know I am just one guy but here are your action items:

  • Make the specifications of your processors easier to find and available to search engines.  I hate to point out that your direct competitor makes 90% of its specifications very easy to find.  Do the same.
  • Make a detailed specifications page with all of the details, including architecture, bus configuration and any other real selling points for a technology geek.  If your marketing department is feeling left out, include links to the marketing materials for the pseudo-techs or non-techs.
  • If it makes sense, have a technology-oriented version of your site where people like me can find the dirty details, like relative performance and technology factors, to make educated decisions on the validity of the processor as the platform for a tech geek.

You may just sell more of your product if more information is available.

Enterprise Library 5.0 – Notes

The new version of Enterprise Library has some interesting implied additions and constraints.  I have used the frameworks before and like them for the most part; they keep the structure of a large application from looking like slop.  I took a good look at this release and am impressed by how much better the library gets each time.  In our organization, however, we have chosen not to use the framework because we used the first few versions, and converting from 1.0 or 1.1 to 2.0 to 3.1 was a royal pain.  Several applications used 3 and 4, but they soon converted to the Castle framework because the Castle team seemed to be making more progress.  Both have pros and cons, which you may assess for yourself if your interest level is high enough.

What I do want to note is the lack of expressed Windows XP support in Enterprise Library 5.  Here is the beginning of the system requirements section:

System Requirements Screenshot

I saw a comment asking if Windows XP was supported and the reply was that if it works, it works.

There are a ton of breaking changes this time, but none appear to be outside the realm of making things better in the long run.  There is also a Migration Guide, as usual, to help with the transition.

Great job on the release and I hope to see more good things from the Enterprise Library in the future.