1. Machine.Specifications under TFS 2012

    About a year ago we started to use Machine.Specifications (screencast) on one of our projects. It was really nice to have executable specifications and well-organized tests. The only issue was that TFS 2010 Build wasn’t able to run Machine.Specifications, so eventually I added some T4 templates to create MSTest wrappers for the MSpec code. That way we were able to run our tests and see the coverage on the CI server. What’s amazing is that TFS 2012 supports test runner extensions, so now we can easily configure our CI server to run Machine.Specifications and show us the code coverage! A step-by-step guide for NUnit is available at http://blog.accentient.com/2012/04/30/RunningNUnitInATFS11BuildOn64BitServer.aspx. For Machine.Specifications it’s pretty much the same; the only change is that you’ll need to install the MSpec Test Adapter instead of NUnit. Also, please make sure you select the Visual Studio Test Runner in the build configuration. …
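    For readers who haven’t seen MSpec, a spec looks roughly like this (a made-up Cart example, not from our project, just to show the executable-specification style):

    ```csharp
    using Machine.Specifications;
    using System.Collections.Generic;

    // Hypothetical spec: the Subject, Establish/Because/It delegates are
    // standard MSpec; Cart is a minimal supporting type defined below.
    [Subject("Shopping cart")]
    public class when_adding_an_item_to_an_empty_cart
    {
        static Cart cart;

        Establish context = () => cart = new Cart();

        Because of = () => cart.Add("book");

        It should_contain_one_item = () => cart.Items.Count.ShouldEqual(1);
    }

    // Minimal supporting type for the spec above.
    public class Cart
    {
        public List<string> Items = new List<string>();
        public void Add(string item) { Items.Add(item); }
    }
    ```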


  2. TFS Build - pass build and revision numbers to the MSBuild script.

    If you are using TFS Build as your CI server, it’s really simple to pass the build and revision numbers to MSBuild. Just create a copy of the build process template you are currently using, and then search for the following (there should be 2 occurrences):

    [String.Format("/p:SkipInvalidConfigurations=true {0}", MSBuildArguments)]
    
    And replace it with the following text:
    [String.Format("/p:SkipInvalidConfigurations=true /p:BuildNumber={1} /p:RevisionNumber={2} {0}", MSBuildArguments, BuildDetail.BuildNumber.Substring(BuildDetail.BuildNumber.LastIndexOf(".") + 1), BuildDetail.SourceGetVersion.Substring(1))]
    
    The MSBuild script should look similar to this:
    <PropertyGroup>
      <BuildNumber Condition="'$(BuildNumber)' == ''">0</BuildNumber>
      <RevisionNumber Condition="'$(RevisionNumber)' == ''">0</RevisionNumber>
    </PropertyGroup>

    <PropertyGroup>
      <Major>1</Major>
      <Minor>0</Minor>
      <Build>$(BuildNumber)</Build>
      <Revision>$(RevisionNumber)</Revision>
    </PropertyGroup>
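    These properties can then be composed into the full version number used by the build, for example like this (FullVersion is an illustrative property name, not something TFS requires; you’d typically feed it into an AssemblyInfo-stamping step):

    ```xml
    <PropertyGroup>
      <FullVersion>$(Major).$(Minor).$(Build).$(Revision)</FullVersion>
    </PropertyGroup>
    ```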
    


  3. NuGet + Package Restore = Failed Build?

    Recently we faced a weird issue with NuGet. When we built our solution (with Package Restore enabled), the build failed due to a NuGet error. We have some packages on an internal NuGet server, hence in NuGet.targets we had this:

    <!-- Package sources used to restore packages. By default the registered sources under %APPDATA%\NuGet\NuGet.Config will be used -->
    <PackageSources>"https://nuget.org/api/v2/;http://ournuget.com/nuget"</PackageSources>
    
    The NuGet output was “One or more errors occurred”. The command was:
    "c:\ProjectPath\.nuget\nuget.exe" install "c:\ProjectPath\Source\ProjectName\packages.config" -source "https://nuget.org/api/v2/;http://ourinternalnugetserver.com/nuget" -o "c:\ProjectPath\packages"
    
    What’s interesting is that if you run NuGet with an empty -source, you get a nice warning:
    WARNING: The schema version of 'Rx-PlatformServices' is incompatible with version 1.8.30524.9000 of NuGet. Please upgrade NuGet to the latest version from http://go.microsoft.com/fwlink/?LinkId=213942.
    I got the above warning after executing this command:
    "c:\ProjectPath\.nuget\nuget.exe" install "c:\ProjectPath\Source\ProjectName\packages.config" -source "" -o "c:\ProjectPath\packages"
    
    The next step was obvious; the following command did the trick:
    nuget update -self
    


  4. Migration to TFS 2012

    Over the weekend I worked on a migration from TFS 2010 to TFS 2012 + SharePoint 2013. The migration process was pretty straightforward, thanks to the TFS and SharePoint teams! During the migration I found that some TFS databases were really huge. It turned out that TFS saves all test binaries, results and code coverage data right inside the TFS database (!). For me it was a big surprise, so I had to use the Test Attachment Cleaner tool to clean up all that unnecessary data from TFS:

    "C:\Program Files\Microsoft Team Foundation Server 2010 Power Tools\Test Attachment Cleaner\tcmpt.exe" attachmentcleanup /collection:"http://tfs.example.com:8080/tfs/Collection" /teamproject:"Team Project" /settingsfile:tfscleanup.settings.txt /outputfile:tfscleanup.logs /mode:delete
    
    tfscleanup.settings.txt contained the following settings:
    <DeletionCriteria>
        <TestRun>
            <AgeInDays OlderThan="30" />
        </TestRun>
        <Attachment>
            <Extensions>
                <Include value="tr_"/>
                <Include value="tlk"/>
                <Include value="dll"/>
                <Include value="exe"/>
                <Include value="pdb"/>
                <Include value="xml"/>
                <Include value="coverage"/>
                <Include value="config"/>
                <Include value="iTrace"/>
                <Include value="wmv"/>
            </Extensions>
        </Attachment>
        <LinkedBugs>
            <Exclude state="Active" />
            <Exclude state="Resolved" />
        </LinkedBugs>
    </DeletionCriteria>
    
    It did the trick. But I had to shrink the database afterwards, because test attachments were more than 70% of its size. Next I used a script from MSDN to rebuild all the indexes. So if your build servers run unit tests, it makes sense to review your TFS databases ;) …


  5. TypeScript - first impression

    Finally got some time to watch Anders Hejlsberg’s Introducing TypeScript. And you know, TypeScript is just great. I started my career as a PHP developer and soon moved to .NET because I just needed that strong typing :) It’s a question of comfort.

    • If you have a compiler that can help you discover issues with your code early, it’s much better than discovering them at run time.
    • Also you can easily refactor your solution.
    So what did Anders Hejlsberg and his team do? They brought comfort to JavaScript developers. If you haven’t had a chance yet, please go and check out TypeScript! …
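    As a tiny illustration of that comfort (a made-up greet function, nothing from the talk):

    ```typescript
    // The type annotations let the compiler reject a bad call
    // before the code ever runs.
    function greet(name: string): string {
        return "Hello, " + name;
    }

    const message: string = greet("World");
    // greet(42);  // compile-time error: number is not assignable to string
    console.log(message);  // prints "Hello, World"
    ```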


  6. Windows Identity Foundation (WIF)

    I’m really excited about Windows Identity Foundation (WIF). It’s now part of the .NET Framework, so it can be used in any .NET application. The team did great work by providing us with awesome “plumbing” code, so now we can build truly decoupled systems very easily. WIF also provides 100% backward compatibility with the old (IIdentity/IPrincipal) approach! Claims allow us to describe an identity using key/value pairs, so we can deal with claims such as name, email, etc. WIF supports Claims Transformation, so the application only has to deal with domain-specific claims and won’t have a dependency on the underlying authorization mechanism. For example, we can transform AD groups into domain-specific roles. Sure, we can add any custom Claims Validation as well. Claims-based Authorization is a piece of cake; again, you deal with domain-specific terms. Here’s a simple example:

    [ClaimsPrincipalPermission(SecurityAction.Demand, Operation = "Add", Resource = "Customer")]
    public void AddCustomer() { ... }
    
    As you can see, the business logic just says: “Add” is the operation and “Customer” is the resource. And we have a single method in which to implement the authorization logic:
    public override bool CheckAccess(AuthorizationContext context)
    {
    	// please note that we deal with arrays here, so we can use more complex expressions in describing our business logic requirements.
    	var resource = context.Resource.First().Value;
    	var action = context.Action.First().Value;
    	if (action == "Add" && resource == "Customer")
    	{
    		return context.Principal.HasClaim("http://myclaims/someclaim");
    	}
    	return false;
    }
    
    As you can see from the above example, we could move the authorization logic to a separate assembly or even a Web Service. It’s worth mentioning that WIF supports claims caching; for example, you can use AppFabric Cache to store transformed claims instead of making the transformation on each request. And it supports SAML tokens for seamless third-party authentication, so you can use AD FS or even Windows Azure. …
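    The AD-groups-to-roles transformation mentioned earlier can be sketched like this (a hypothetical ClaimsAuthenticationManager; the group SID placeholder and the role name are made up):

    ```csharp
    using System.Security.Claims;

    // Hypothetical claims transformation: incoming AD group claims are
    // replaced with domain-specific role claims, so the rest of the
    // application never sees the underlying authorization mechanism.
    public class DomainClaimsTransformer : ClaimsAuthenticationManager
    {
        public override ClaimsPrincipal Authenticate(string resourceName, ClaimsPrincipal incomingPrincipal)
        {
            var identity = new ClaimsIdentity("DomainTransformed");

            // "S-1-5-21-..." stands in for a real AD group SID;
            // "CustomerManager" is an illustrative role name.
            if (incomingPrincipal.HasClaim(ClaimTypes.GroupSid, "S-1-5-21-..."))
            {
                identity.AddClaim(new Claim(ClaimTypes.Role, "CustomerManager"));
            }

            return new ClaimsPrincipal(identity);
        }
    }
    ```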


  7. Visual Studio 2012 Page Inspector

    For years I used Firebug to inspect HTML and CSS. But the Visual Studio team did great work, and now we’ve got even better functionality right inside our favorite IDE. What’s so awesome about it? You can “Inspect” any element in your HTML or CSS, and Page Inspector will show you the line in the source code that produced that element! A 4-minute introduction video can be found at http://www.asp.net/vnext/overview/videos/visual-studio-2012-page-inspector (thanks to Scott Hanselman). …


  8. Git on windows

    Finally got some time to watch Phil Haack’s talk Git and GitHub for Developers on Windows. It was really interesting. GitHub for Windows has a really nice, clean UI. The fun thing is that we recently used the same model to deliver our software as the GitHub folks did: ClickOnce as the deployment technique. Although we went a little further with it: our installer doesn’t show any standard ClickOnce windows; it just uses the API to streamline the installation process. We also build a personal installer for each user who gets an invitation or license, because each copy can be installed only once. …


  9. NuGet without committing packages to source control

    Several days ago I switched to a new approach of working with NuGet by enabling the NuGet Package Restore feature. From now on, packages aren’t stored in source control; instead, NuGet downloads the missing packages during the build. So far so good. …
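    For reference, enabling Package Restore adds a .nuget folder to the solution and wires each project up to NuGet.targets; the generated project-file pieces look roughly like this (a sketch of what the feature produces, paths and surrounding elements trimmed):

    ```xml
    <!-- In a PropertyGroup of each .csproj -->
    <RestorePackages>true</RestorePackages>

    <!-- Near the bottom of each .csproj -->
    <Import Project="$(SolutionDir)\.nuget\NuGet.targets" />
    ```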


  10. Entity Framework (EF), UnitOfWork & Repository patterns

    There are some debates on the web about whether we need an abstraction layer on top of Entity Framework, or whether it’s redundant. Sure, I have my own opinion as well :) The Entity Framework (EF) object context gives us the ability to do whatever we want with any table in the database, so if our business logic uses EF directly, data access code will be spread throughout it. Personally, I like the Persistence Ignorance pattern: business logic shouldn’t be aware of data layer details. Furthermore, I like it when each repository contains only related methods; usually it’s something like UserRepository, ChangeRepository, PostRepository, etc. It helps to structure the code, since we don’t end up with huge repositories. EF implements the UnitOfWork pattern, and that’s amazing. So what we did in our last project was use the EF object context as the UnitOfWork and pass it to our repositories (it was wrapped in an IUnitOfWork; I’ll probably post some technical details later). We used the Code First approach, and all our POCOs were located in the Model, so the business logic worked with the Model, not with the data layer. That worked like a charm: we got a clean data layer and clean business logic, which is very easy to unit test and maintain. …
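    A rough sketch of the shape described above (not the actual project code; type and member names are illustrative):

    ```csharp
    using System.Data.Entity;
    using System.Linq;

    // The EF context hides behind IUnitOfWork; repositories work against
    // the abstraction, so business logic only ever sees Model types.
    public interface IUnitOfWork
    {
        IDbSet<T> Set<T>() where T : class;
        void Commit();
    }

    public class EfUnitOfWork : DbContext, IUnitOfWork
    {
        public IDbSet<User> Users { get; set; }  // Code First POCO set

        IDbSet<T> IUnitOfWork.Set<T>() { return Set<T>(); }
        void IUnitOfWork.Commit() { SaveChanges(); }
    }

    public class UserRepository
    {
        private readonly IUnitOfWork _unitOfWork;

        public UserRepository(IUnitOfWork unitOfWork) { _unitOfWork = unitOfWork; }

        public User FindByEmail(string email)
        {
            return _unitOfWork.Set<User>().SingleOrDefault(u => u.Email == email);
        }
    }

    // A Model POCO; business logic depends on this, not on EF.
    public class User
    {
        public int Id { get; set; }
        public string Email { get; set; }
    }
    ```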