Archive for the ‘Technical’ Category

I think this deserves a short blog post of its own, just in case you are (like me!) in the midst of deploying a production farm with SP 1 into the wild.

Microsoft just announced the following information on their SP 1 page:

We have recently uncovered an issue with this Service Pack 1 package that may prevent customers who have Service Pack 1 from deploying future public or cumulative updates. As a precautionary measure, we have deactivated the download page until a new package is published.

This is pretty heavy information, not least since everyone (Microsoft included) promoted SP1 as a top requirement to install in your SharePoint 2013 farms. With that said, shit happens, I understand. However, I just wanted you to know that you might want to hold off for a while until they've found a fix for people who currently have SP1 deployed, or published a new installation package for SP1.

All the details can be found here:


As per a conversation with Bill Baer from Microsoft, there’s now additional info available. And I quote:

Slipstreams not affected!

P.S. Just received confirmation that the slipstream builds are not affected.

A fix for customers who have already deployed is in progress!

For existing customers who have deployed Service Pack 1 we’ll have an update available (ETA TBD) that resolves the issue.

With that additional info, I (and my clients) can feel a bit more relaxed.

/ Tobias.


In one of my previous blog posts about The future of Forms with SharePoint and Office 365 I talked about what is going on related to Forms. In this article I will show you the power and ease of using Access 2013 to create a simple App Form that you can use in your Office 365 deployment. Simply put, it's one of the easiest ways to create an App for SharePoint without breaking out any code at all.

Say what you want about Access, but these features are pretty slick and I’m going to stick out my chin and say that it’s here to stay.

First, the technical background

I was pretty skeptical when Microsoft announced the use of Access to build Apps and Forms for SharePoint Online, which meant I really had to dig into it myself and try it out. No sooner said than done; here's what my research tells us.

The big news here is something called Access 2013 Web Apps, which enables you to publish your app and host it in Office 365. There are three main areas in an Access 2013 Web App:

  • The Web App Model
  • Deployment to SharePoint
  • Data storage based on SQL in Azure

    The Web App Model

    With the new web app model we can design the forms for this app easily. There's no exhausting design process to work through when creating apps using Access. You've got ready-to-go table templates as a boilerplate to start your work on.

    This is truly a no-code solution where, with just a few clicks, you have created a complete application.

    Deployment to SharePoint

    You deploy your app to SharePoint just like any other app you create. An Access App is just like any other SharePoint App. Simple as that.

    Data storage on SQL Azure

    When you deploy your App to Office 365, the data you store in the app is hosted in SQL Azure.

    Simply put, this is it

  • Design your App using Access 2013
  • View your app using the browser
  • Access Services in Office 365 hosts the presentation of the app
  • SQL Azure hosts the data storage of the app

    And what about reporting?

    I'm glad you asked. Since it's all based on SQL Azure (an actual, reliable relational database), you can pull out all types of information through any means you like, including:

  • Power View
  • Access Reports (desktop)
  • Crystal Reports
  • Custom-built tools that gather and display the data
  • Excel
  • etc etc.

    Show me the money, step by step

    In order to show how easy it is to get started with building Apps using Access 2013, I’ll show you a step by step guide here for the entire process.

    Get things started

    Launch Access 2013 and choose a Custom Web App

    When you start up Access you are presented with the option to create various types of projects using different templates. There’s one called Custom Web App that you will choose:


    Select a name and location

    In this step you’ll simply enter a name and a location (SharePoint Online site url) and click Create:


    Sign in and authenticate

    If you are not signed in to the site already, you'll have to fill in your credentials and re-authenticate. This is a mandatory step to ensure you can deploy the App later.


    Designing the application

    The design overview view

    Okay, so now we have opted to create a new App for Access Services in Office 365, but have yet to create the logic for it. In the next few steps I’ll create a very simple application that will be a way to create and manage small events.


    Create a new data table

    The easiest thing to get you started is to search for a specific type of template in the search box on the front page of Access. I’m searching for Event.


    Select the Event template

    Select the first template in the list, called Event. You will see that it will create a bunch of tables for you which you can review on the left-hand side of the Access design surface:


    Launch the App

    In order to just test this app out, you will click the very large and nice button in the upper left section of the ribbon called "Launch App". Once you’ve done that, it will be published to Office 365 and displayed in your oh-so awesome favorite web browser:


    Filling in some data

    I’ll go ahead and create some data in my application just to demonstrate the various built-in views I get with my forms here.

    I normally switch over to Datasheet view instead of the default List view, which enables me to quickly fill in new data:


    Review the data in the List view mode, a fully functional application


    Where can I find the App?

    Well, after you clicked "Launch App" from Access, it was pushed to the SharePoint Online site that you entered when you created the Access App. So, going to your portal, you'll find the App right there:


    Update: What about on-premises?

    Right after publishing the article I got a question whether this was available for on-premises SharePoint 2013 as well. It requires Access Services to be configured, which my friend Kirk Evans has a great overview of on his blog here:

    That’s the starting point for taking the discussion to on-premises, check it out!


    This was a very quick introduction to creating Apps using Access 2013. I only wanted to scratch the surface so you can get an introduction to the concept of App Forms with Access Services in Office 365.

    There’s a lot of additional resources available on this topic that I would urge you to check out.

    So as you can see, the information isn't really new. The reason for the extreme hype over the last few days is that when Microsoft announced the long-awaited information about Forms, Access Apps (App Forms) were one of the categories moving forward. So there you have it: an easy way to create business applications without even touching a single line of code.

  • Visual Studio 2013 March 2014 Update has been released

    March 3rd, 2014 by Tobias Zimmergren

    Today, Microsoft announced the release of the Visual Studio 2013 March 2014 Update, which adds features, project types and support for some of the enhancements that come with Office 2013 SP1 and Office 365. So if you're a developer targeting Office 365, Office 2013 or SharePoint Server 2013, it may be a good idea to go grab the update and install it now.

    SAP Data Source in your Cloud Business Application


    As quoted from the Visual Studio blog:

    The March 2014 Update also offers first class support for SAP Netweaver Gateway. When connecting CBA to SAP, it honors SAP annotations for Phone, WebAddress, and Email; speeding up your configuration of entities consumed from SAP Netweaver Gateway.

    While this may seem like a trivial update, I know this is a popular requirement from a lot of my clients. I don’t think this is the last data source type we’ll see in the list of available types, which is ever growing:


    Integrating new document features in your Cloud Business Applications

    One of the cool features, which I lacked in one of my CBA projects in the past, is the integration of native controls for managing documents in a library that your CBA is connected to. With the latest update, you get the ability to create new documents in your library from the CBA interface.

    When you attach your application to a SharePoint host-web document library, the application will be integrated with a set of new document controls, which allows your users to create new Office documents (either blank documents or from document templates that are available in the attached document library), open documents in Office Web App or Office client, and upload existing documents. All of these are provided by the tools without requiring any extra coding from you.

    Picture from the article mentioned in the introduction.

    To add such a control and work with these types of controls, you simply have a new option here in the CBA Screen design interface:


    Apps for Office Development enhancements

    Some of the additions to the Office App development toolset are mentioned:

    Office 2013 SP1 and Office 365 supports PowerPoint content apps, Access Web App content apps and allows your mail app to be activated in compose forms (i.e. when users are writing a new email message or creating a new appointment). Office Developer Tools for Visual Studio 2013 – March 2014 Update supports all of these new app types across the development cycle from project creation, manifest editing, programming, debugging to publish.

    Apps for SharePoint Development enhancements

    There’s some additions to the experiences for developers when creating SharePoint Apps.

    With the latest release of the tools, we can now switch the App project to target Office 365 or an on-premises installation of SharePoint. It's a new property in the SharePoint project settings dialog:


    The app for SharePoint tooling now allows you to target your app exclusively to SharePoint Online in Office 365, or target to an on-prem SharePoint Server 2013 (which can also run in SharePoint Online). Through a simple switch in the app for SharePoint project properties page, the tools will update the SharePoint version number and the SharePoint Client Components assembly references used in your project accordingly.

    Another pretty neat thing is the support for MVC in client web parts:

    To enhance the support with ASP.NET MVC web applications, we added MVC support for client web part pages in this update. If your app for SharePoint project is associated with an ASP.NET MVC application, when you add a client web part and choose to create a new part page in the client web part creation wizard, a client web part controller and a default view will be added, following the MVC pattern.

    Wrap up

    I've been trying these things out really quickly now, and I must say the small but constant flow of enhancements and additions with the new update cadence for all Microsoft products is pretty impressive. Visual Studio 2013 hasn't been around for that long, but we're already seeing new updates published frequently. It is easier than ever to keep up with new development techniques and tools, with no need for exhausting waiting periods.

    Read the full story on MSDN Blogs: Announcing Office Developer Tools for Visual Studio 2013 – March 2014 Update

    Microsoft just released Service Pack 1 (SP1) for SharePoint Server 2013 and Office 2013 (and a bunch of other products that don't quite relate to my focus).

    As per request from a few clients and friends, here’s a quick link list with download details.

    SharePoint Server 2013 SP1 improvements

    In this spreadsheet you can get a full list of improvements with SP 1. For SharePoint, these are the immediate changes:

    Description of fixes

    Metadata is lost when documents that use a custom content type with a "Description" field are opened for editing.

    When an item is deleted, restored from recycle bin, and then deleted again, there is a primary key constraint error.

    An error occurs when files are moved between document libraries and the web time zone is behind that of the server.

    Metadata filtering at list level always lists all metadata terms.

    The hyperlink popup window drops the selected word to be linked when there is a delay of more than one second in opening the window.

    Multiple-column, SummaryLinkWebParts with a group heading style of "Separator" are rendered incorrectly.

    A hash tag that contains a full width space does not get created successfully.

    Search schema compression is now enabled by default to allow larger search schemas.

    Highlighting for FQL queries is now enabled for FQL as well as KQL.

    Opening a custom SharePoint list in datasheet view and applying multiple custom filters, where each filter has more than one condition, can result in an incomplete set of list items.

    When the "Export to Excel" button is clicked in a SharePoint document library that has the Content Type field displayed, the Content Type field does not appear in the Excel workbook.

    An error occurs after changing the "Manager" property in EditProfile.aspx page when the My Sites WebApp is not in the same farm as the UPA.

    SharePoint REST API does not return a well-defined error response for a duplicate key exception.

    Developers are unable to specify a Content Type ID when creating Content Types in the client object model.

    On list views in SharePoint sites, the Connect to Outlook button in the ribbon may be erroneously disabled.

    In some non-English languages of SharePoint, the text displayed in the callout UI for a document or list item, describing who last edited the item, may not be grammatically correct.

    Copy and Paste in a datasheet does not work correctly with Internet Explorer 11.

    Pages do not render in Safari for iPad when private browsing mode is used.

    When editing rich text fields in SharePoint, if the editing session exceeds 30 minutes, the edits may not be saved.

    An error that says "SCRIPT12004: An internal error occurred in the Microsoft Internet extensions" may occur intermittently when users visit their SkyDrive Pro or other pages on their personal site.

    InfoPath may crash when a form that points to a SharePoint list, with a lookup to another SharePoint list, is opened.

    An InfoPath form with extended characters in its name fails to open.

    An error that says "Security Validation for the form has timed out" may occur when an InfoPath form is digitally signed and hosted in a SharePoint site collection that uses the SharePoint version 2010 user experience.

    "Show document icon" remains unchecked and the document icon does not show in Edit Properties for a list item.

    A "Failed tagging this page" error occurs when the "I like it" button is clicked.

    The wrong term is removed when manually editing a multi-valued taxonomy field.

    When tagging list items using a language that is different from the term store default language, suggestions for labels are offered in multiple languages. The suggestions appear confusing because both language suggestions are listed without any identification of the language.

    An error that says "There was an error processing this request" may appear when editing the user profile.

    Times are missing from Date/Time results in certain filtered list web service calls.

    Minimal and no metadata are now enabled as supported JSON formats.

    Actions4 schema workflow actions can’t be deployed to SharePoint.

    Using Client Object Model, Stream.Seek() to seek to a particular position doesn’t seek at the proper offset.

    Refreshing a workflow status page generates the following error: "System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary."

    Setting custom, non-English outcomes in web pages on tasks in a workflow fails to set the value.

    Configurations of SharePoint using Azure Hybrid mode and Workflow Manager together can cause workflow callbacks to fail.

    Workflow task processes on wiki pages won’t start.

    Workflows won’t wait for changes to content approval status fields.

    E-mails generated by workflow cannot be disabled for approvals in SharePoint workflows.

    Workflows may fail to send an e-mail or send too many e-mails.

    Association variables do not update correctly for auto-start workflows.

    A KeyNotFoundException error may occur in a workflow when the associated task list uses unique permissions.

    Incomplete tasks are deleted when workflow task activities complete.

    Task activity is suspended when the task is completed using app-only credentials.

    An error that says "This task could not be updated at this time" occurs when trying to complete a workflow task using the "Open this task" button in Outlook.

    A workflow doesn’t respond properly when waiting for changes in specific types of list columns, such as Boolean, Date Time, and User.

    Details & Download

    Slipstream install SharePoint Server 2013 with Service Pack 1

    Yesterday (28th of February 2014), Microsoft released the slipstream installation media on MSDN. So if you’re aiming to install a new SharePoint 2013 environment, you can get the one with SP1 bundled in the installation.


    Please note: as per Microsoft's recommendations, it is UNSUPPORTED to create your own slipstream installation. You should get the prepared slipstream installation as per above (MSDN, MPN or VLSC) instead. As quoted from Stefan Goßner:

    A couple of people already tried to create their own slipstream version using RTM and SP1. Please avoid doing this as this is unsupported due to a change in the package layout which was introduced with March 2013 PU. Due to this change in the package layout it is not guaranteed that the manually created slipstream version will update all modules correctly. Ensure to download the official SharePoint 2013 with SP1 from one of the above listed sites.

    In the past I’ve written some technical articles about "TFS Preview" which was the name of Visual Studio Online while in beta. Now that it’s officially launched (and awesome), I’d like to make a small wrap-up of the features I really love with Visual Studio Online. And of course the pricing which I’m getting tons of questions about each week.


    Visual Studio Online comes in a few different models and prices. Tailored to meet your needs, you can choose from the various alternatives and subscribe to them. Yes, the key here is that you subscribe. Instead of buying an expensive license for your developers, you can now subscribe for a monthly fee – all courtesy of the oh-so amazing cloud model.

    Visual Studio Online Basic

    The basic plan is awesome, because the first 5 users are FREE!

    The normal price for the basic plan is $20 / user / month.

    Visual Studio Online Professional

    The normal price for the Professional plan is $45 / user / month.

    Visual Studio Online Advanced

    The normal price for the Advanced plan is $60 / user / month.

    MSDN Subscribers, see here!

    There’s a white paper published called Visual Studio and MSDN Licensing White Paper which describes the different options and subscriptions. If you’re an MSDN subscriber you can access some of the benefits of Visual Studio Online without additional cost.

    Comparison chart

    Microsoft has provided us with a great comparison chart of the features in the various editions of Visual Studio Online.



    It doesn't have to be expensive to develop great software anymore. We all know that in the past it cost a lot of cash to get a license for Visual Studio or MSDN. Now you can purchase a subscription that suits your needs, and it comes at a pretty low price in my opinion. My organization uses MSDN Ultimate, which gives us access to essentially every feature in the package. Some of my clients purchase subscription licenses for Visual Studio Professional, which is quite sufficient for the most part.

    The point of this post is to merely highlight the benefits of the new subscription model for Visual Studio since I’ve been getting a lot of questions about it lately.

    Head on over to to read the rest – it’ll be worth it!

    Tools for your SharePoint 2013 development toolbox

    January 18th, 2014 by Tobias Zimmergren

    Author: Tobias Zimmergren


    Some of my students, clients and community peers have asked about my favorite tools for working with SharePoint. So with that in mind, I quickly noted down some of my favorite tools that every developer should have in their toolbox. I’ve written posts about this before but as times and technology moves forward, new tools and techniques are evolving. I may post about the tools I recommend for IT-Pros at some point, but right now let’s stay on track with the dev-focus.

    If you think I’ve missed out on some great tools, let me know by commenting on the article and I’ll make sure to include it in the post!

    Recommended development tools

    Here’s a brief list of tools that I would recommend for any SharePoint solution developer. At the very least, I would assume that you already have Visual Studio 2012 or Visual Studio 2013.

    CAML Designer 2013

    Historically I've been using some kind of CAML query generator ever since the U2U CAML Query Builder tool was released. There's really no more efficient way to build your CAML queries than to have the stubs auto-generated with a tool. For SharePoint 2013 I am now solely using the CAML Designer tool available from

    The application is much more polished than any of its predecessors, and I highly recommend that you download it right away unless you've already done so. There are some tweaks that have to be made in the tool before it's feature complete, but hey, it's a community tool and the best one I've used thus far in the area. The best part is the auto-generated code samples that go with your query. Yum!

    Quick Highlights:

  • Autogenerate the actual CAML Query
  • Autogenerate Server OM code
  • Autogenerate CSOM .NET code
  • Autogenerate CSOM REST code
  • Autogenerate Web Service code
  • Autogenerate PowerShell code

    If you investigate the tool you'll see that it not only generates the CAML query itself, but also code snippets for various technologies for you to copy and paste into your code as a stub to work with. Simple, elegant, and so far it only crashes every now and then ;-)
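    To give a feel for what such a generated stub looks like, here's a hand-written sketch (not actual CAML Designer output) of assembling a simple CAML query string from JavaScript; the field name and value below are made up for illustration:

```javascript
// Hand-written sketch of a simple CAML query builder -- the kind of markup
// CAML Designer generates for you. Field names/values here are illustrative.
function buildCamlQuery(fieldName, value) {
  return (
    "<View><Query>" +
      "<Where>" +
        "<Eq>" +
          "<FieldRef Name='" + fieldName + "' />" +
          "<Value Type='Text'>" + value + "</Value>" +
        "</Eq>" +
      "</Where>" +
      "<OrderBy><FieldRef Name='Title' Ascending='TRUE' /></OrderBy>" +
    "</Query></View>"
  );
}

// A string like this is what you would typically hand to a CSOM CamlQuery
// via set_viewXml(), or send to SharePoint's GetItems REST endpoint.
console.log(buildCamlQuery("Status", "Active"));
```

    The point of the tool, of course, is that you never have to hand-assemble strings like this yourself.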



    SharePoint Manager 2013

    Ever since a friend of mine, Carsten Keutmann, started working on the SPM (SharePoint Manager) tool, I have been impressed with it. It has been in the community for a long time, and the 2013 version is pretty slick. It has a simple and intuitive interface which allows you to quickly and easily navigate down the farm and investigate settings, properties, schema XML and so on. Most of the things in your SharePoint environment can be investigated from this tool.

    So unless you have downloaded this tool already, go get it now. Pow!



    ULS Viewer

    I still keep getting the question about what tool I use to parse my ULS logs. Honestly, there's no alternative to the ULS Viewer. I've written about logging previously, including the ULSViewer tool. In SharePoint 2013 I use this tool on a daily basis, but I also use the built-in ULS viewer in the Developer Dashboard.

    ULS Viewer Windows Application Screenshot:

    ULS Viewer in the Developer Dashboard:

    For information about how to enable the developer dashboard and launch it, check out my mate Wictor’s blog:

    Windows version download:

    CKS Dev

    CKS:Dev is a plugin for Visual Studio with killer features you can't live without. A team of awesome folks in the community have put together this amazing extension to Visual Studio, and it now has support for 2013 as well. It allows you to manage your development routines more efficiently while you are on a coding adventure, adds a bunch of new project items for your SharePoint projects, and contributes to an overall satisfactory SharePoint developer story. You need this extension. Period.


    Color Palette Tool for Branding

    With SharePoint 2013 comes new possibilities for branding. A lot of people are accustomed to whipping up their own custom CSS files and having a designer do most of the branding parts. If you're just looking to create new composed looks for SharePoint 2013 without too much effort, you should use the SharePoint Color Palette Tool, provided by Microsoft!



    Debugger Canvas

    A few years ago I blogged about a tool called Debugger Canvas, a tool that can aid you in the debugging process. I'm not using it every day, but when I switch it on it really nails it! What can I say: if you hate the tedious normal debug execution and you want a better and more hierarchical way of displaying your trace in real time, enjoy Debugger Canvas awesomeness. All the code in your current calls displayed in one view. You've got to check it out.

    Note: Debugger Canvas is for VS 2010 Ultimate. I'm not sure if they've gotten around to porting it to VS 2012 or VS 2013 yet, but if you're lingering with 2010 Ultimate, you should get this now. Period.

    (Screenshot is from the Visual Studio Gallery)


    SharePoint 2013 Search Tool

    As we all know, search is one of the biggest things in SharePoint 2013. This tool lets you learn and understand how queries can be formatted, and makes it easy to configure a Search REST query. Pretty slick if you ask me. Use the tool to create the queries for you; then you can analyze them and better understand how to tweak and modify the output. Great job with the tool!
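    As a rough illustration of what the tool helps you compose, here's a hand-rolled sketch of building a SharePoint 2013 Search REST query URL. The /_api/search/query endpoint is the real one; the site URL, query text and parameters below are just assumptions for the example:

```javascript
// Builds a SharePoint 2013 Search REST query URL. The /_api/search/query
// endpoint is the documented one; everything passed in below is illustrative.
function buildSearchQueryUrl(siteUrl, queryText, rowLimit) {
  var params = [
    "querytext='" + encodeURIComponent(queryText) + "'",
    "rowlimit=" + rowLimit,
    "selectproperties='Title,Path'"
  ];
  // Strip a trailing slash from the site URL so we don't end up with "//_api".
  return siteUrl.replace(/\/$/, "") + "/_api/search/query?" + params.join("&");
}

console.log(buildSearchQueryUrl("https://contoso.sharepoint.com/", "sharepoint", 10));
```

    The tool builds and runs queries like this for you, which makes it much easier to see what effect each parameter has on the results.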



    Fiddler. Always use Fiddler!

    For most experienced web developers, Fiddler has been a constant tool in the basket. It is an addition to many of the existing tools you can use, but it’s extremely slick for analyzing SharePoint requests on the client side. I’ve saved countless hours by using this awesome tool to analyze the requests and responses from SharePoint. Download it, learn it, use it.



    SPCAF – SharePoint Code Analysis Framework

    My friend Matthias Einig created a tool called SPCAF, which analyzes your solutions and code. Truly a beneficial tool in your toolbox that will aid you in the direction of awesomeness. If you've built crappy solutions, you'll know it before you ship them off to production environments. It integrates with Visual Studio, there's a stand-alone client application, and you can even hook it up to your build process – something I'm doing with my Iterative Development Processes.

    (Image from


    .NET Reflector from Red Gate

    It's no secret that we want to peek into other people's code. With .NET Reflector from Red Gate you can do just that. It's an awesome reverse-engineering tool which allows you to peek into the code of a compiled assembly! I use it to debug Microsoft.SharePoint*.dll assemblies and to investigate third-party assemblies.



    F12 Debugging experience in your browser

    As Anders mentions in the comments, I forgot to mention the most obvious one: the F12 experience in your web browser. It enables you to debug and investigate HTML, CSS, JavaScript and other resources on your web pages on the client. Internet Explorer, Google Chrome and Firefox all have some type of developer tools available. Personally, I use Chrome as my main debugging tool and IE for verification. I seldom use Firefox anymore, to be honest.

  • How to use F12 Developer Tools to Debug your Webpages
  • How to access the Chrome Developer Tools

    PowerShell Tools for Visual Studio

    As Matthias points out in the comments, there’s another great extension for Visual Studio called PowerShell Tools for Visual Studio. It allows you to get syntax-highlighting on your PowerShell files directly in Visual Studio.




    SPFastDeploy

    Are you developing Apps for SharePoint 2013? Steve Curran commented about the SPFastDeploy tool that he has created. It's a great extension for quickly pushing changes to your dev site without having to re-deploy the entire app. Pretty neat!

    (Image from Steve Curran’s blog)


    Advanced REST Client plugin for Google Chrome

    As pointed out by Peter in the comments, there's an awesome plugin for Chrome called Advanced REST Client, which allows you to investigate REST calls and configure your queries quite simply through the UI. You see the results and the request times directly in the browser, and you can easily play with the parameters until you get it just right. Great tip!



    Postman – REST Client plugin for Google Chrome

    The previous REST client I mentioned above is awesome, and here's another really great tool that AC tipped us off about: the Postman REST Client plugin for Google Chrome. It's similar to the previous plugin, but slightly different for those who prefer that tool instead. A good idea is to try them both and figure out which one you like best.



    SharePoint 2013 Client Browser

    As pointed out in the comments by André, the SharePoint 2013 Client Browser is a tool similar to SharePoint Manager, which I've mentioned above in this article. With this tool you can connect remotely to a SharePoint environment and investigate the data through the client APIs. In my screenshot I'm connected from my laptop to my Office 365 SharePoint Online dev account for development. Pretty sweet!


  • Download:

    Note: There's a similar tool available called SharePoint Explorer 365, which also allows for connecting to Office 365. I prefer the previously mentioned one though, the SharePoint 2013 Client Browser, but that's a matter of preference.


    Smtp4dev

    I can't believe I originally forgot to put this in. Thanks to the tip in the comments from Caroline, I got around to adding it to the list here. Smtp4dev is an awesome tool for testing whether SharePoint is sending its e-mails properly. Instead of actually sending the e-mails to the recipients (which may not be wanted if you're testing on real data, for example), it catches all e-mails sent through the SMTP server and allows you to view them directly in the tool's UI. It's pretty neat, and I use it a lot when working with things related to e-mail, specifically automated processes where e-mails may be sent at various points in time and you still need to verify the logic and correctness.




    So there you go. For everyone who asked what common tools I keep in my toolbox: there are the most common ones.

    If I missed a tool, feel free to enlighten me and perhaps I can add it to the post.


    Author: Tobias Zimmergren

    Introduction to the problem

    It's not uncommon when upgrading to SharePoint 2013 from a previous version of SharePoint that you'll see claims-encoded usernames in places where you previously saw normal usernames (DOMAIN\Username). In my case it was a matter of finding a ton of custom code and having it check whether the username was a claims-encoded username or not.

    This is what we saw (a claims-encoded login along the lines of i:0#.w|domain\username):


    However, what we really wanted our code to output was the plain DOMAIN\Username format:


    There's a good reason why the claims-encoded username looks the way it does: the format tells us what type of claim it is. Wictor has done a nice breakdown and explanation of the claims format here:
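    To make the encoding a bit more concrete, here's a tiny client-side sketch that strips the Windows identity claim prefix ("i:0#.w|") from a login, purely to illustrate the format. This is exactly the kind of manual parsing you should avoid in real server-side code (use the API instead), and the account name is made up:

```javascript
// Illustration of the Windows claims encoding only. In real SharePoint
// server-side code, use SPClaimProviderManager rather than string parsing.
// Roughly: "i" = identity claim, "0#" = user logon name, "w" = Windows auth.
function stripWindowsClaimPrefix(login) {
  var windowsIdentityPrefix = "i:0#.w|";
  if (login.indexOf(windowsIdentityPrefix) === 0) {
    return login.substring(windowsIdentityPrefix.length);
  }
  return login; // not a Windows claims-encoded login; leave it untouched
}

console.log(stripWindowsClaimPrefix("i:0#.w|contoso\\tobias")); // contoso\tobias
```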

    The solution to this problem

    There's a pretty simple solution to this that a lot of people seem to be missing out on. The code snippets I've seen in my last project all parse the string manually with custom logic, trying to determine with a string.Split() whether it is a claim and what type of claim it is.

    Instead of going down that dark and horrible road, you should take a look at the built-in functions in the API that do this just fine for us:

    private string GetUsernameFromClaim(string claimsEncodedUsername)
    {
        using (new SPMonitoredScope("GetUsernameFromClaim method start"))
        {
            try
            {
                SPClaimProviderManager spClaimProviderManager = SPClaimProviderManager.Local;
                if (spClaimProviderManager != null && SPClaimProviderManager.IsEncodedClaim(claimsEncodedUsername))
                {
                    // Return the normal domain\username without any claims identification data
                    return spClaimProviderManager.ConvertClaimToIdentifier(claimsEncodedUsername);
                }
            }
            catch (Exception ex)
            {
                // You should handle (at least log) the exception here instead of ignoring it!
                // Logger.Log("An exception occurred in the GetUsernameFromClaim() method: " + ex.Message);
            }
            // Return the original value if it couldn't be resolved as a claims username
            return claimsEncodedUsername;
        }
    }


    Since I’ve seen so many places in the previous few projects where people reference custom string-split methods to turn claims usernames into the default domain\username format, I thought you’d benefit from knowing that there’s a built-in method for that. Nothing fancy, but worth knowing about.

    Check out these resources for additional and more in-depth information about related things:

    Programmatically converting login name to claim and vice versa, by Waldek Mastykarz

    How claims work in SharePoint 2010, by Wictor Wilén

    Enjoy this quick tip.

    Author: Tobias Zimmergren | | | @zimmergren


    When dealing with SharePoint development, there are tons of things to consider: aspects of code quality, aspects of proficient testing, and of course having a reliable process to re-do and fix anything that comes up along the way. Most important, in my opinion, is to not do all iterative work manually over and over. Most of these things are part of any common ALM (Application Lifecycle Management) cycle in a modern-day development team. In this article I’ll talk briefly about how I’ve set up most of my recent projects with routines for the daily iterative development work. This includes Continuous Integration, Nightly Builds, Automated SharePoint Deployments and Automated UI Testing. Pretty slick.

    Background: Why automate?

    One of the most common questions I’m getting today when I introduce the concept of code and deployment automation is "Why?". Why would you invest time to set up an automated environment for iterative builds, deployment and automated tests? For obvious reasons (to me) I shouldn’t need to answer that question because the answer lies within the question itself. That’s just the point – it’s automated and you don’t have to do it yourself over, and over, and over.. and over and over and over again!

    As a quick go-to guide for automation benefits, this should do:

    • Continuously finding problems and fixing them in time.
    • Revert the entire code base back to a state where it was working, in case trouble arises.
    • Early warnings of things that could be a potential danger to the success of the project.
    • Frequent check-in policies force developers to validate their code more often.
    • We can iteratively have an up-to-date testing environment for stakeholders to verify our work continuously.
    • Automate testing by PowerShell.
    • Automate UI Testing.
    • Etc etc etc.

    The list can grow pretty long, and I’ll try to cover a pro/con list of things to consider in the ALM process, but for now I think it’s important to focus on what matters for your team. For me and my teams, the most important thing is to make the processes more efficient and to handle some of the daily tasks automatically. In one of my latest projects I’ve taken the time to enforce a new type of development routine for the team, which did not exist before – an iterative development process. This has been key to our development in the upgraded SharePoint solutions, where we’ve upgraded older projects from SharePoint 2010 to SharePoint 2013.

    See my previous article about Iterative Upgrade Process. It discusses another type of automation that I set up – namely a full SharePoint 2010 to SharePoint 2013 upgrade – every week – before the real upgrade finally approaches.

    So if the "why’s" are clear, let’s move on to an overview of what the process could look like.

    Implementation: How are we enforcing code- and test automation?

    First of all, our main process of automatic development looks something like this:


    This is similar to one of the processes I’ve introduced in one of my projects, but as always it’s individual for each project and can vary quite widely with your requirements and setups. Once we’ve done enough iterations and fixes in this simple ALM cycle, we move on to pushing this out to the production environments. But before we get into that, let’s break these steps down a bit:


    During the development phase we focus on getting things done: taking the collected requirements, nailing them down into a technical implementation and finally checking the code into our code repository. This is the first step of the technical ALM cycle, where we actually start to implement the code to fulfill the requirements. But it can also be the last step in the cycle, when bugs reported in the "Test & Verification" step result in additional code fixes – causing the cycle to start all over. For most people this is common sense, and most experienced teams have some type of ALM routine in place.


    When you commit or check in your code, we can set up rules for the actual check-in. In my case there are a few things I always tend to do:

    • Build projects and package SharePoint solutions to make sure there are no basic issues in the project files etc.
    • Execute tools to check the quality of the code, for example the SPCAF tool, which does a good job of analyzing the packages and resources.
    • Automatically deploy the new packages to a dev-test farm for quick tests by the developers themselves.

    Automatic code verification

    So as mentioned in the earlier step, we perform automatic verification of various things in our code and packages. You can use the code analysis tools for this, the SPCAF tools and so on. I fail the build instantly if any errors are reported, which causes the build server to e-mail the entire team saying "the latest build broke, here’s why…". The reason for having the build server notify everyone is of course that we should all be aware of any issues caused by our recent changes – better that everyone knows about it than nobody.

    In short, we can perform these types of tasks here:

    • Unit Tests (if any)
    • Code analysis
    • SPCAF analysis for SharePoint artifacts
    • etc etc.

    Automatic deployment to dev-test farm

    When we check in, we perform a new deployment on the developer test environment, which is a separate farm set up only for the developers to verify their changes and code. Each time a check-in is found in TFS, we trigger a new deployment to this environment. If a developer is running tests in the environment for some reason, he/she can flag to the build server that no builds should push any deployments until the tests are done. The build server will then pick up right where it left off before pausing.
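    To make this concrete, here is a hypothetical sketch of what such a check-in-triggered deployment step could run. The solution name and drop path are placeholders, not taken from our actual setup – only the cmdlets themselves are standard SharePoint 2013 PowerShell:

    ```powershell
    # Sketch: deploy (or upgrade) the latest build output on each check-in.
    # "Contoso.Intranet.wsp" and the drop path are hypothetical placeholders.
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $wspPath = "C:\Drops\Latest\Contoso.Intranet.wsp"
    $solution = Get-SPSolution -Identity "Contoso.Intranet.wsp" -ErrorAction SilentlyContinue

    if ($solution -eq $null) {
        # First deployment: add the package and deploy it to all web applications
        Add-SPSolution -LiteralPath $wspPath
        Install-SPSolution -Identity "Contoso.Intranet.wsp" -GACDeployment -AllWebApplications
    }
    else {
        # Subsequent check-ins: upgrade the already deployed solution in place
        Update-SPSolution -Identity "Contoso.Intranet.wsp" -LiteralPath $wspPath -GACDeployment
    }
    ```

    The build server simply executes a script like this (via PowerShell Remoting) against the dev-test farm after every successful build.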

    Automatic deployment to pre-production farm

    After the code is automatically verified and the solution packages are flagged as OK, we proceed with deploying the packages to a test environment. This is also a fully automated step, but instead of triggering it on every check-in I’ve chosen to trigger it once per day in my current setup. The reason is that we generally want better uptime on this environment so we can do some actual tests, both automatically and manually. I’ve set up rules that first check that everything went fine when deploying to the dev-test environment. If not, the build is flagged as failed – otherwise it continues with the full deployment to the pre-production environment as well.

    Our pre-production environment is a place where more of the UAT and QA tests happen. We have this environment connected to the proper Domain (AD), proper services and the exact same setup (and even content) as in our production environment. Read more about how we duplicate the content in my previous article, more precisely on the "Copy fresh databases" section.

    Summary of the pre-production environment: An almost exact replica of the actual production environment. Proper, life-like tests can be performed here.

    Automatic SharePoint & UI Tests

    One of the coolest things we’ve got set up is something called coded UI tests. If you haven’t seen these before, I urge you to take a look here: MSDN – Verifying Code by Using UI Automation.

    With our coded UI tests, we can simply record logical tests in the UI from our web browser and have Visual Studio convert these recorded actions into a sequence of C# commands that are executed automatically whenever the tests run. We’ve made sure that our build server executes the UI tests on a remote machine, enabling us to test graphical changes and things in the user interface automatically so we don’t have to do this every time ourselves. This is awesome. I promise this is very, very awesome.

    On top of the UI tests that we can have automated, we can also have some logical tests performed by PowerShell. In this step we also conduct generic tests to make sure we can create all types of sites based on our templates, that we can change all settings that we have designed in our applications, that we can create all types of lists and so on. In other words: Automating the boring dirty work that otherwise may be very redundant and you normally fall asleep while doing ;-)
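    As an illustration of such a PowerShell verification test, here’s a minimal hypothetical sketch that checks we can still create a site from a template. The URL, owner account and template are placeholders – the point is simply that the script throws (and thereby fails the build) if site creation stops working:

    ```powershell
    # Sketch: verify that site creation still works after deployment.
    # The URL, owner alias and template are hypothetical placeholders.
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

    $url = "http://devtest/sites/verify-$(Get-Date -Format yyyyMMddHHmmss)"
    $site = New-SPSite -Url $url -OwnerAlias "CONTOSO\svc-build" -Template "STS#0"

    if ($site -eq $null) { throw "Site creation failed - failing the build." }

    # Clean up the throwaway site so the test farm stays tidy
    Remove-SPSite -Identity $url -Confirm:$false
    ```

    The same pattern extends to creating lists, flipping settings and verifying properties – each assertion failing loudly so the build server notices.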

    Technicalities: Team City as a build server

    All of the aforementioned tasks are things that happen automatically by utilizing a build environment. Personally I am pretty hooked on Team City these days. Best part is that it’s free for up to 20 configurations. Cor blimey!

    Why use this specific build server? Some people argue that using a TFS build server is the best, some argue that Jenkins beats all else, some argue that Team City is the most awesome tool in the world and so on. They are all good, and they all serve a purpose. For me, the core benefits of Team City are:

    • Installation is done in less than 1 minute (pow!)
    • Configuration and setup of new builds is as easy as 1-2-3
    • Build agent statistics to keep track of build performance etc.
    • It works with multiple version control systems out of the box

    TeamCity supports a broad range of version controls out of the box, from the most popular ones to somewhat deprecated but still used systems.

    For a single build TeamCity can take source code from several different VCS repositories. You can even mix different types of VCS.

    I’m not going to sell you either TeamCity or any other build server, you can just have a look for yourself at the various options there are out there:

    My obvious recommendation falls on TeamCity and TFS. It all comes down to where you are, what the projects and budgets look like, and which features you want included. Some of my projects are on TFS Build and some are on TeamCity – most of them on TeamCity, actually.

    In a future post I might describe in more detail how I’ve technically set everything up for my build environments.

    Build Server Configuration: Setting up proper routines

    It doesn’t really matter which build server configuration you prefer, as long as it fulfills your requirements. My normal requirements in any long-running project, for example an intranet, may look like this:

    Continuous Integration

    • Configure a trigger to automatically execute a build, package and code analysis on each check-in.
    • Deploy to dev-test on each check-in

    Nightly Deployment

    • Configure a trigger to automatically execute a build, package and code analysis each night.
    • Configure a trigger to automatically execute a deployment to dev-test and pre-production each night.

    Weekly Deployment

    This is a special configuration that doesn’t really tie into just this ALM cycle, but also to my previously described "Iterative Upgrade Process" cycle. Every week in a current project, we’re tearing down the entire SharePoint environment and we’re building it up again automatically and performing an upgrade. This configuration can trigger that full scheme of things, but it can also be used to just deploy all the latest artifacts. Hence, it’s a pretty special step that we manually trigger today in order to keep the chaos to a minimum.

    PowerShell Verification Tests & Nightly UI Tests

    By using PowerShell to verify that artifacts exist, can be created and seemingly contain the right properties and by using coded UI tests to perform recorded tests of UX interaction tasks, we’re pretty confident that we’re continuously finding out any issues along the way with less manual work.

    As always though; The tests are only as good as the person who wrote them – if you create bad tests, the results will be bad. Keep this in mind and don’t just trust the green lights :-)

    Visual overview: Common setup in my projects

    My thoughts are along the lines of "the less work you have to do manually, the more work you get done", which is a pretty common mindset if you’re into making routines more efficient. I’ll try to describe from a bird’s-eye perspective what our routines are technically doing by showing you an awesome visualization of my most common type of setup. Following is an illustration of my setup, and below the illustration I’ll describe each section more precisely:


    Description of the legend

    In the legend (the gray box) you can see that I’ve marked some of the areas in different colors.

    Developer Area is where the devs are playing around. Nobody else touches those environments.

    Build Master Area is where the build master (usually me) and potentially a few more designated build administrators hang out. Nobody else is allowed in this environment – it is strictly locked down to the few people with the required skillset to configure it. If we make a mistake here, we could easily screw up our project deadlines.

    Dev Test Farm is where the developers do their initial tests. A rather simple environment with only one goal: Successfully deploy the packages and activate the required artifacts.

    Pre-Production Farm is where the artifacts are deployed right after they are verified in the dev-test environments. When we push things to this environment we enable the business decision makers and product owners to get in the game and finally try out what we’ve created. Normally this means that they perform initial requirement matching (that is; did we create what they wanted) and then we’ll connect parts of the organizations users in order to perform the UAT (User Acceptance Tests) properly. Once that is done, and we’ve iterated the cycles enough times to fix all the bugs (yay!) then we’ll push it on to the production environment.

    Production Farm is where everything goes live. This is the real deal, and nobody has any access to this environment except the farm administrators. The exception is that the build environment has permission to remotely deploy SharePoint solution packages and execute the required PowerShell commands in order to successfully (and automatically) ensure that the proper artifacts are deployed in a safe manner. We don’t want to allow people to perform manual deployments here! Hands off please :-)


    Well that’s a short story about what my current ALM adventure looks like. I’ve been rigging up some pretty cool environments lately, and I’m loving every bit of it. The confidence of automation is unbeatable – not to mention how much time we actually save.

    I could easily hand any of my clients a presentation saying how many man-hours they’ve saved by investing the hours needed to set this up. Priceless!

    In the future I might write about the technical implementation of the build servers and so on, but that’ll be a post for another time.


    Author: Tobias Zimmergren | | | @zimmergren


    Every cycle of SharePoint comes with challenges around upgrades and migrations. In one of my current projects I’ve been part of designing an iterative upgrade process – as I like to call it – which means we’ll be upgrading our Farm (all content databases) from SharePoint 2010 to SharePoint 2013 every week. Yes, that’s right – we upgrade SharePoint 2010 to SharePoint 2013 every single week with the latest content from the production environments. This of course happens on a separate SharePoint 2013 farm setup specifically for this purpose.

    In this article I’ll talk about the benefits of my “Iterative Upgrade Process” and what it means for the teams and people involved with your intranet, extranet, public site or whatever you are using your SharePoint farm for.

    Please do note that this is not an article describing the steps to upgrade your SharePoint environment – this is a process for iterative development and iterative testing in environments that we tear down, build up and upgrade from SharePoint 2010 to SharePoint 2013 every week.

    Background: Everyone is affected by an upgrade or migration

    It’s not uncommon to bump into a lot of problems while upgrading your farms from one version to the other. Common problems include customizations, faulty configurations and general bad practices implemented in the original farm. For most organizations an upgrade doesn’t “just happen” overnight with everything working flawlessly afterwards – on the contrary, there’s always a bunch of problems brought to light that need to be taken care of.

    Working a lot on SharePoint Intranets like I currently do, we often see problems before, during and after upgrades. Not only technical issues that we can handle, but issues with people in the organization not reaching the right information or not being able to perform their daily tasks. This can have very complicated impacts on the organization if the migration fails to run smoothly and everything isn’t up and running after the service windows you’ve specified.

    The end users are affected in terms of downtime and possibly being hindered from performing their tasks, which in the end hurts the organization, since they are the ones who keep the organization running! The IT departments (or the technical folks in your organization involved with SharePoint) may be affected if the migration or upgrade doesn’t go as planned. The business as a whole relies on the system functioning, and for every minute or hour that the systems aren’t fully available, the organization may lose both time and money.

    So in order to minimize any pain in upgrading from one version of SharePoint to another, we need to consider the implications of a troublesome upgrade. With the iterative upgrade process we’ve got in place right now at one of my clients you can test and verify all your changes and customizations and whatever you want to assure the quality of – over and over again before the real deal.

    Implementation: How am I doing this?

    So boiling down the steps in our iterative upgrade process gives something similar to this:


    In a nutshell this is what the process looks like from a bird’s-eye perspective, even though some of the steps require an extensive amount of preparation work and time to get done. Below is an explanation of all these steps in more detail, to give you an understanding of what this really means.

    Setup SP 2013 Farm

    The very first step that we need to do is to setup and configure the SharePoint 2013 Farm where our upgraded content will eventually land. In our case we’ve set up this as a one-off configuration, re-using the 2013 farm on every new iteration. You could – as an alternative – argue that it would be beneficial to tear down the entire farm and have it set up again. It would be theoretically possible, but in our specific case it simply doesn’t work that easily – too many dependencies rely on things outside of my and my team’s control.

    Uninstall any custom solutions

    This step is of course only necessary if you’ve already upgraded at least once in the new farm. By the time your next iterative upgrade is scheduled, you’ll need to uninstall any and all old solutions in order to clean up the farm a bit before we proceed.

    Remove any content databases

    Again, this step is only necessary if you’ve already upgraded at least once in the new farm. If you have, there’ll be upgraded content databases that you need to get rid of before we proceed to the next step. We’re doing this with the PowerShell cmdlet Remove-SPContentDatabase.
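    As a sketch (the web application URL is a placeholder), clearing out the previous iteration’s databases can be as simple as piping every content database of the web application into the cmdlet:

    ```powershell
    # Sketch: remove all content databases from last iteration's upgrade.
    # "http://intranet" is a hypothetical web application URL.
    Get-SPContentDatabase -WebApplication "http://intranet" |
        Remove-SPContentDatabase -Confirm:$false
    ```

    Note that this detaches and deletes the databases from SharePoint’s point of view, which is exactly what we want before the fresh clones arrive.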

    Deploy SP 2010 Solutions

    Deploy your old 2010 solutions. The reason we want to do this is that when we later perform the actual mount of the databases, it’s pretty nice if the mounting process can find the references to the features, web parts and any other resources within those solutions. This is a temporary deployment, and the 2010 packages will soon be removed again.

    Copy fresh databases

    The next step is to ensure that the SQL Server in our new 2013 farm is up to date with the latest content databases from the SharePoint 2010 farm. This is why we’re using FlexClone (described in more detail further down in this article). FlexClone makes virtual copies, which are true clones, without demanding additional storage space. Pow! Totally awesome.

    Attach databases

    After the databases are copied to the SQL Server, we’ll have to attach them to SQL Server as you would normally do.

    Mount databases

    The next step is where we mount the actual databases to SharePoint. The databases are still in SharePoint 2010 mode, since the copies of our databases come from the SharePoint 2010 environment. This is why we need to have our 2010 WSP solutions in place before we perform the mount – otherwise reading the mount logs will be… well, not so fun ;)

    We do this with the PowerShell cmdlet Mount-SPContentDatabase.
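    A sketch of what that mount could look like, with placeholder database and server names:

    ```powershell
    # Sketch: mount a cloned 2010-mode content database against the 2013 web application.
    # Database name, server and URL are hypothetical placeholders.
    Mount-SPContentDatabase -Name "WSS_Content_Intranet" `
        -DatabaseServer "SQL01" `
        -WebApplication "http://intranet"
    ```

    The cmdlet runs the database schema upgrade as part of the mount, which is why the mount logs are worth reading afterwards.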

    Uninstall SP 2010 Solutions

    When the mounting is done, we’ll need to uninstall the 2010 version of the old solutions and move on to the next step.

    Deploy upgraded SP 2013 Solutions

    Yay, finally we’re at a much cooler step – deploying SharePoint 2013 solutions. So, to give a little background on what these solutions should be:

    You should’ve already upgraded your SharePoint projects to SharePoint 2013 solutions, have them ready to go and use in this step.

    Notes:  This is probably the most time-consuming step if you have custom solutions. Anything you’ve built in SharePoint 2010 and customized there needs to be upgraded to SharePoint 2013 and work there as well. Good thing we’ve got an iterative upgrade process so we can fine-tune this iteratively every day and just hit a button to re-upgrade the farm with the latest builds in our test- and pre-production environments. Yay!

    Upgrade-SPSite with all Site Collections

    Once the new and freshly upgraded 2013 packages have been deployed, we’ll continue by upgrading the actual Site Collections from SharePoint 2010 mode to SharePoint 2013 mode.

    We’ll be using the PowerShell cmdlet Upgrade-SPSite for each and every one of our Site Collections.
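    As a sketch, upgrading every site collection still running in 2010 (compatibility level 14) mode could look like this:

    ```powershell
    # Sketch: upgrade all site collections still in SharePoint 2010 (14) mode
    # to SharePoint 2013 (15) mode.
    Get-SPSite -Limit All -CompatibilityLevel 14 |
        ForEach-Object { Upgrade-SPSite $_ -VersionUpgrade -Unthrottled }
    ```

    -VersionUpgrade performs the actual 2010-to-2013 mode upgrade rather than a build-to-build upgrade, and -Unthrottled bypasses the upgrade queue throttling, which is reasonable in a controlled test farm like this one.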

    Misc automated configuration scripts

    We’ve got a lot of custom scripts that run after the upgrade, as part of the finalization of the actual upgrade. This includes custom re-branding scripts, re-creation of My Sites and moving content between old and new My Sites, custom scripts to disable and remove artifacts that aren’t used in SharePoint 2013 projects and solutions anymore, modifications to removed or altered Content Types, etc. The list can be made long – if you’re reading this you’ve probably already understood that each scenario is unique, but this process can be applied to most scenarios with a tweak here and there.

    Tools: What tools are we using to make this happen?

    Obviously things don’t get done by themselves, so I’ve automated much of the process with various tools and techniques, described below.

    Deployment Automation with Team City

    There are tons of ways to automate things in any ALM cycle. Be it a development lifecycle or an infrastructural upgrade lifecycle like this one – something to automate the process will be your best bet. Since we’re doing this every week, and the process in itself is pretty complex with plenty of steps that need to be done properly, I’ve chosen to go with Team City for all of the automation work.

    I’ve gotten the question why use Team City instead of TFS Build or Jenkins or any other available build automation tools. Simply put: Team City is free for up to 20 configurations, easy (very very easy) to configure, works with multiple data sources and repositories and it just works – every time. But that’s a discussion for another day.

    Database copies with Flexclone

    In order to easily get set up with the databases in our environments, we’ve been using NetApp’s FlexClone software very successfully over the last year. As quoted from their own website:

    NetApp® FlexClone® technology instantly replicates data volumes and datasets as transparent, virtual copies—true clones—without compromising performance or demanding additional storage space.

    So in essence, the usage of FlexClone allows us to, with (almost) a single click, replace all of the databases in our test and pre-production environments and get real copies of the actual production environments in a matter of minutes. There’s no denying that this is awesomeness in its true form.

    Iterative code upgrades with Visual Studio 2013

    In order to maintain and upgrade the new codebase (upgraded from SharePoint 2010), we’re using Visual Studio 2013 like most professional Microsoft-related developers do today. You can use VS 2012 as well, should you like – but do try out 2013 if you can, it’s multiple times faster than previous versions of Visual Studio.

    I have pushed hard for implementing a real ALM process in the team, and we’ve finally got that in place and it’s working pretty nicely right now. We’re using TeamCity to automate builds with continuous integration, nightly builds and scheduled and on-demand deployments to our environments. I will cover code automation more thoroughly in another post in the future, as it would be too much info to cover in this single post.


    So this is a process we follow every week. Once a week I tear down the entire SP 2013 test farm and rig up a new snapshot of the databases on the SQL environment. Then I re-iterate this upgrade process (Team City, PowerShell and PowerShell Remoting to the rescue). This means we can literally try what the real upgrade will be like once we get there. Every week. Also we can have a nice agile iterative way of handling bugs that appear in the environments.

    Oh yeah, should we break something – we click a button or two and we’ve got a freshly upgraded environment with the latest builds from the SP 2013 dev rigs.

    It simplifies the overall process:

    • When time comes for the real upgrade, everything including upgraded code base and automated upgrade scripts is in place!
    • Find and report errors early in the upgrade process of your project
    • Find compatibility errors in code and solutions
    • Find out what will upgrade, and what will not, before it’s too late
    • Be confident that once we reach the point of upgrade, we’ve done it so many times already that we know what might go wrong
    • The Product Owners, Project Managers, Testers and any other people involved have already verified the state of the system, so once we hit the button in the production environment – we’re pretty much in an “accepted release” state already.

    I hope you enjoyed this little read about my iterative upgrade process. It’s pretty darn good if you ask me – it requires some time to set up initially, but in larger projects it’s definitely worth it!


    Author: Tobias Zimmergren | | @zimmergren


    So recently, while working with the (awesome!) Work Management Service Application in some of our environments, we got the common problem of not receiving any actual tasks on our My Sites. The reason is that we see this message instead:

    Last updated at 1/1/1901 12:00 AM

    Now, throw a google query and you’ll find plenty of resources and fixes for how to configure the permissions of your Service Applications in order to make this service work.

    My Solution

    Due to various policies, restrictions and IT related stuff we couldn’t just configure permissions in any way we wanted. So we needed to figure out another way to fix this simple problem.

    The solution is simple, for us:

    • Delete your Work Management Service Application
    • Re-create a new Work Management Service Application
      • Create a new Application Pool, but use the same account as for the Application Pool hosting your My Sites/Social or Portal.
    • Run a crawl
      • Incremental, continuous or full crawl should suffice.
    • Bingo.
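    The re-creation steps above can be sketched in PowerShell roughly as follows. The service application names are placeholders, and the managed account is hypothetical – the key point is to use the same account as the application pool hosting your My Sites:

    ```powershell
    # Sketch: re-create the Work Management service application.
    # Names and the account below are hypothetical placeholders.
    $account = Get-SPManagedAccount "CONTOSO\svc-mysites"  # same account as the My Sites app pool
    $pool = New-SPServiceApplicationPool -Name "WorkManagementPool" -Account $account

    $wmsa = New-SPWorkManagementServiceApplication -Name "Work Management Service" `
        -ApplicationPool $pool
    New-SPWorkManagementServiceApplicationProxy -Name "Work Management Service Proxy" `
        -ServiceApplication $wmsa
    ```

    After this, kick off a crawl (incremental, continuous or full) and the tasks should start aggregating.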

    In some scenarios this may work, in others it may not work. For our various farms (Test, Pre-Production, Production) it works great, and given it works in 3 different environments (with different accounts et al) it’s pretty neat.

    After the crawl did its job, I could start engaging the Tasks list on my My Site with collective tasks from throughout my entire farm:


    Looks like it did the trick, and the tasks are now working like a charm including all data related to the task.


    Other options

    If this still doesn’t work, check out this TechNet article about configuring the service permissions. Doing the above and configuring the permissions should definitely do the trick.

    And here’s another tip if you’re still having issues:


    Instead of messing about with permissions (for various reasons), we’ve managed to get it started and working by simply configuring the same Application Pool account. Should that not suffice, a combination will more likely work.

    Author: Tobias Zimmergren | | @zimmergren


    In one of the projects I’m currently involved in, we’re upgrading from SharePoint 2010 to SharePoint 2013. One of the problems we faced was that we had some orphaned content databases in our production environments – the problem didn’t surface in SharePoint 2010 but was brought to light in 2013. So this short post talks about how I fixed those issues, which was a bit of a pain to be honest.

    In the environments we’re working in, we’ve set up a scheduled upgrade that takes place once every week. The reason for this is to re-iterate the upgrade process as many times as we can, with production data, before the actual upgrade, which will take place later down the road when all bugs, code tweaks/customizations and other random problems have been taken care of. One of the problems that surfaced recently was that we couldn’t create any new Site Collections, where the ULS spat out the unfortunate message:

    Application error when access /_admin/createsite.aspx, Error=Object reference not set to an instance of an object.  
    at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(IEnumerable`1 contentDatabases, Guid siteIdToAvoid, Guid webIdToAvoid, SPContentDatabase database, SPContentDatabase databaseTheSiteWillBeDeletedFrom)

    While it took some time to boil down what was actually going on, here are the details in case you end up with the same issues.

    Cannot create new Site Collections

    So the problem we faced of not being able to create new Site Collections surfaced in the ULS logs, stating this message:

    Application error when access /_admin/createsite.aspx, Error=Object reference not set to an instance of an object.  
    at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(IEnumerable`1 contentDatabases, Guid siteIdToAvoid, Guid webIdToAvoid, SPContentDatabase database, SPContentDatabase databaseTheSiteWillBeDeletedFrom)    
    at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(SPSiteCreationParameters siteCreationParameters, Guid siteIdToAvoid, Guid webIdToAvoid, SPContentDatabase database, SPContentDatabase databaseTheSiteWillBeDeletedFrom)    
    at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(SPSiteCreationParameters siteCreationParameters)    
    at Microsoft.SharePoint.Administration.SPSiteCollection.Add(SPContentDatabase database, SPSiteSubscription siteSubscription, String siteUrl, String title, String description, UInt32 nLCID, Int32 compatibilityLevel, String webTemplate, String ownerLogin, String ownerName, String ownerEmail, String secondaryContactLogin, String secondaryContactName, String secondaryContactEmail, String quotaTemplate, String sscRootWebUrl, Boolean useHostHeaderAsSiteName, Boolean overrideCompatibilityRestriction)    
    at Microsoft.SharePoint.Administration.SPSiteCollection.Add(SPSiteSubscription siteSubscription, String siteUrl, String title, String description, UInt32 nLCID, Int32 compatibilityLevel, String webTemplate, String ownerLogin, String ownerName, String ownerEmail, String secondaryContactLogin, String secondaryContactName, String secondaryContactEmail, Boolean useHostHeaderAsSiteName)    
    at Microsoft.SharePoint.ApplicationPages.CreateSitePage.BtnCreateSite_Click(Object sender, EventArgs e)    
    at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)    
    at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

    With some reflector magic and investigation I found out that the specific method causing the problem was looking for the best Content Database to put the new Site Collection in. While doing this, it obviously wants to balance the Site Collections so that they’re evenly distributed over the Content Databases.

    We got this error message because of invalid references in our Config database pointing to Content Databases that no longer exist, for whatever reason. As a result, the method tried to create the new Site Collection in a Content Database that doesn’t really exist, even though SharePoint thought it did.
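    To make the failure mode concrete, here’s a hedged Python sketch (not SharePoint’s actual implementation; all names below are made up): a routine that picks the “best” database by comparing free site-collection capacity falls over as soon as one entry in the collection is a dangling null reference.

```python
# Hypothetical sketch (not SharePoint's real code): picking the "best"
# content database means comparing how full each one is, so a dangling
# null entry in the collection blows up the comparison.

def find_best_content_database(content_databases):
    # Prefer the database with the most remaining site-collection capacity.
    return max(content_databases,
               key=lambda db: db["max_sites"] - db["current_sites"])

healthy = [
    {"name": "WSS_Content_01", "max_sites": 5000, "current_sites": 120},
    {"name": "WSS_Content_02", "max_sites": 5000, "current_sites": 80},
]
print(find_best_content_database(healthy)["name"])  # WSS_Content_02

# An orphaned reference behaves like a None in the collection:
broken = healthy + [None]
try:
    find_best_content_database(broken)
except TypeError as e:
    print("Site creation fails:", e)
```

    The same shape of failure is what surfaces in the ULS as “Object reference not set to an instance of an object”.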

    Steps to find and kill the broken/invalid references to the non-existent content databases

    After some SQL magic, finding the null references was rather easy. Following these steps allowed me to figure out the details of the broken databases:

    Step 1: Get the Web Application ID

    Either use SharePoint Manager or a quick PowerShell statement to figure out the GUID of the Web Application where the problem persists:

    $wa = Get-SPWebApplication "http://your-web-application-url"
    $wa.Id

    Obviously you should note/save this ID for reference in the next steps.

    Step 2: Query your Config database for the appropriate information

    With this ID saved, head on over to your SQL Server and run this query (replace GUID with the ID of your Web Application):

    USE SP13_Config
    SELECT ID, CAST(Properties as XML) AS 'Properties'
    FROM Objects
    WHERE ID = 'GUID' -- GUID of the Web Application

    As you can see, using the CAST(Properties as XML) bit of the query gives you a clickable link in the results window, giving you an awesome overview of the XML. Thanks to a SQL friend of mine for pointing that out; it saved the day :-)

    Here’s what the results look like (1 row):


    Step 3: Investigate the returned results (XML) and find your null-values

    Click the XML link, find the section containing Microsoft.SharePoint.Administration.SPContentDatabaseCollection and see if there’s any place where the fld value is null, something like this:


    As you can see, most of the databases in our environment have an sFld and a fld XML node where the GUID of the database is stored. However, in some cases (in 2 places in our environment!) you may find that it says null instead. That is essentially your invalid reference, pointing to nothing at all. So SharePoint tries to create the Site Collection in the Content Database with the null fld.

    As with the previous steps, make a note of the GUIDs from your broken database references.
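    The null-hunt above can be sketched in a few lines. Note this is a mock-up: the element names and sample XML below only approximate the shape described (an sFld node with the GUID followed by a fld node, which is null for broken references); the real Config-database XML is more involved.

```python
# A sketch of the step above: scan the Properties XML for database entries
# whose fld node is null. The XML is a simplified, made-up shape -- treat
# the element names as assumptions about the real Config-database XML.
import xml.etree.ElementTree as ET

properties_xml = """
<object>
  <sFld type="Guid">8f4de362-4f40-4040-9880-5b10e55e8b3d</sFld>
  <fld type="SPContentDatabase">8f4de362-4f40-4040-9880-5b10e55e8b3d</fld>
  <sFld type="Guid">1a2b3c4d-0000-4040-9880-5b10e55e8b3d</sFld>
  <fld type="null" />
</object>
"""

def find_broken_references(xml_text):
    """Return the GUIDs (from the preceding sFld) whose fld entry is null."""
    root = ET.fromstring(xml_text)
    broken, last_guid = [], None
    for node in root:
        if node.tag == "sFld":
            last_guid = (node.text or "").strip()
        elif node.tag == "fld" and node.get("type") == "null":
            broken.append(last_guid)
    return broken

print(find_broken_references(properties_xml))
# ['1a2b3c4d-0000-4040-9880-5b10e55e8b3d']
```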

    Step 4: Delete the database(s) using PowerShell

    The best way we found to delete these databases was by using PowerShell. At first I didn’t think it actually worked, but after re-running the SQL query it occurred to me that the command had actually removed the invalid reference. The reason I didn’t think it worked is that PowerShell throws some errors on the screen, but it actually works the right magic under the hood, leaving us with an intact and working farm again.

    So, make sure you’ve got the IDs of your broken databases, and first and foremost make sure that you haven’t copied an incorrect GUID (!). What I did was simply query my Web Application, filtering the query to give me the IDs and Names of all Content Databases, so I could make sure I didn’t delete an actual Content Database by mistake.

    Verification command:

    $wa.ContentDatabases | ft ID, Name

    After running this command we got a list of databases where we could make sure that the GUIDs we’d copied didn’t represent any of our real, intact databases:
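    The safety check in this verification step boils down to making sure the two sets of GUIDs don’t overlap; a trivial sketch (all names made up):

```python
# The safety check from the verification step, as plain set logic:
# none of the GUIDs we plan to delete may match an intact content database.

def safe_to_delete(orphan_ids, real_db_ids):
    overlap = set(orphan_ids) & set(real_db_ids)
    return (len(overlap) == 0, overlap)

real_dbs = {"guid-content-01", "guid-content-02"}
orphans = {"guid-broken-aa", "guid-broken-bb"}
ok, clash = safe_to_delete(orphans, real_dbs)
print(ok)  # True -- safe to run the delete command
```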


    Great. Now that I’m sure the IDs I copied don’t belong to an intact production DB but represent my broken ones, I can execute the delete command on those buggers!

    In order to do that, I simply ran a PowerShell delete against the Web Application’s content database collection, along these lines:

    $wa.ContentDatabases.Delete("GUID") # GUID of the broken database reference

    The results looked as follows, causing a lot of nice error messages. However, the magic under the hood still worked:


    Step 5: Verify by running the SQL query again

    So PowerShell throws an error message stating “Object reference not set to an instance of an object.” However, under the hood the magic has been applied properly, and in my Config database the invalid values are now deleted, as can be verified by re-running the SQL query:



    Well, I’ve learnt a lot this week about the Config database and playing around with the GUIDs within. The scary part was that these errors didn’t surface in SharePoint 2010, but they did in 2013 once we upgraded. Another good reason to get a good, iterative upgrade routine in place before an actual upgrade is attempted.

    Speaking of iterative upgrade processes, I might discuss that in a future post; namely how we commence our upgrades every week without (almost) lifting a finger :-)


    Author: Tobias Zimmergren | | @zimmergren


    In one of my previous articles we investigated some of the new and awesome delegate controls in SharePoint 2013; it walks you through some of the interesting DelegateControl additions that I was playing around with. On top of that, I got a comment about how you could extend it further by adding the current site title to the Suite bar:


    Sure enough, I took a dive into Seattle.master to take a look at how the title is rendered, and found the control called SPTitleBreadcrumb. This is the control responsible for rendering the default out-of-the-box title in a normal SharePoint team site, like this:


    So, to follow the question through and provide an answer to get you started, we’ll take a quick look at how we can build further on the sample from the previous blog post and add the site title (including or excluding breadcrumb functionality) to the Suite bar.

    Investigating the out of the box Seattle.master

    In the OOTB Seattle.master, the title is rendered using markup along these lines (abbreviated to the title part, which is what we care about here):

      <h1 id="pageTitle" class="ms-core-pageTitle">
        <SharePoint:AjaxDelta id="DeltaPlaceHolderPageTitleInTitleArea" runat="server">
          <asp:ContentPlaceHolder id="PlaceHolderPageTitleInTitleArea" runat="server">
            <SharePoint:SPTitleBreadcrumb
              runat="server"
              RenderCurrentNodeAsLink="true"
              SiteMapProvider="SPContentMapProvider"
              CentralAdminSiteMapProvider="SPXmlAdminContentMapProvider">
              <PathSeparatorTemplate>
                <SharePoint:ClusteredDirectionalSeparatorArrow runat="server" />
              </PathSeparatorTemplate>
            </SharePoint:SPTitleBreadcrumb>
          </asp:ContentPlaceHolder>
        </SharePoint:AjaxDelta>
      </h1>

    What we can see in this file is that there’s a lot of action going on just to render the title (or title + breadcrumb). You can play around with this in tons of ways, both server-side and client-side. In this article we’ll take a look at how we can extend the Suite bar delegate control from my previous article in order to, using server-side code, modify the title and breadcrumb and move them around a bit.

    Should you want to get the title using jQuery or client side object models, that works fine too. But we can save that for another post.

    Adding the Title Breadcrumb to the Suite bar

    I’m going to make this short and easy. The very first thing you should do is head over to my previous article “Some new DelegateControl additions to the SharePoint 2013 master pages”, take a look at the “SuiteBarBrandingDelegate Delegate Control” section and make sure you’ve got that covered.

    Once you’ve got that set up, here are some simple additional tweaks you can add to your Delegate Control in order for the breadcrumb to be displayed in the top row of SharePoint. Modify the content of the code-behind file from my previous sample (in your case the file name may differ) to look something like this:

    protected void Page_Load(object sender, EventArgs e)
    {
        // Register any custom CSS we may need to inject, unless we've added it previously through the masterpage or another delegate control...
        Controls.Add(new CssRegistration { Name = "/_layouts/15/Zimmergren.DelegateControls/Styles.css", ID = "CssReg_SuiteBarBrandingDelegate", After = "corev5.css" });

        // Render the logo link (the url/image/alt-text values below are placeholders - use your own)
        BrandingTextControl.Controls.Add(new Literal
        {
            Text = string.Format("<a href='{0}'><img src='{1}' alt='{2}' /></a>",
                SPContext.Current.Site.Url,
                "/_layouts/15/Zimmergren.DelegateControls/Logo.png",
                "My Company")
        });

        // Create a new Title Breadcrumb Control
        SPTitleBreadcrumb titleBc = new SPTitleBreadcrumb();
        titleBc.RenderCurrentNodeAsLink = true;
        titleBc.SiteMapProvider = "SPContentMapProvider";
        titleBc.CentralAdminSiteMapProvider = "SPXmlAdminContentMapProvider";
        titleBc.CssClass = "suitebar-titlebreadcrumb";
        titleBc.DefaultParentLevelsDisplayed = 5;

        // Add the Title Breadcrumb Control to the Suite bar
        BrandingTextControl.Controls.Add(titleBc);
    }
    As an indication, the end result might look something like this when you’re done. What we’ve done is simply copy the logic from Seattle.master into the code-behind file of our delegate control and set the “DefaultParentLevelsDisplayed” property to a number higher than 0 so it’ll render the actual breadcrumb. If you set this value to 0, only the title will be displayed.


    Then if you want to hide the default title, you can do that by using this small CSS snippet (targeting the pageTitle element we saw in Seattle.master):

        #pageTitle {
            display: none;
        }

    And it’s gone:


    From there you should be able to take it onwards and upwards in terms of styling. I haven’t really put any effort into making it pretty here :-)


    With these small additions and changes to my original code samples, you can make the title, including or excluding the breadcrumb, appear in the top bar instead of in the default title area.

    Additional important comments:

    You may need to ensure that the Site Map datasource is available on every page, including system pages. If it isn’t, or you land on a page that doesn’t want to render your breadcrumb/title, it may not be able to properly render your navigation as you would expect. That’s something to look into further from this point, however.

    For example, by default the “Site Contents” link will not render the full breadcrumb properly, but rather just say “Home”. In order to fix smaller issues like that, we can extend the code logic a few lines and take care of those bits.

    My recommendation:

    Always make sure you consider the approach you take for any type of customization and development. In this specific case we’ve already used some code-behind to display our logo in the top-left corner, so we’ve just built some additional sample code on top of that to render the breadcrumbs. However, if we only wanted to do that and not take the logic any further, I would most likely suggest doing it with jQuery/CSOM instead, to be “Office 365 compliant” and keep your customizations to a minimum.

    Hope you enjoyed this small tweak. And keep in mind: Recommendations going forward (however hard it’ll be to conform to them) are to keep customizations to a minimum!

    Cheers, Tob.

    Author: Tobias Zimmergren | | @zimmergren


    SharePoint 2013 comes with tons of enhancements and modifications over previous versions of the product. One of the cool features I’ve played around with lately is the Geolocation field. Back in 2010 I wrote a custom-coded solution for displaying location information in our SharePoint lists, integrating some fancy-pants Google Maps – in SharePoint 2013, a similar field exists out of the box.

    In this article I’ll cover what this field does, show a sample of creating and using it, and walk through getting and setting the Bing Maps key to make sure our maps work and display properly.

    Update 2013-09-14: As pointed out by Leon Zandman in the comments, there are some updated prerequisites required in order to view the geolocation field value or data in a list. Information from Microsoft:

    An MSI package named SQLSysClrTypes.msi must be installed on every SharePoint front-end web server to view the geolocation field value or data in a list. This package installs components that implement the new geometry, geography, and hierarchy ID types in SQL Server 2008. By default, this file is installed for SharePoint Online. However, it is not for an on-premises deployment of SharePoint Server 2013. You must be a member of the Farm Administrators group to perform this operation. To download SQLSysClrTypes.msi, see Microsoft SQL Server 2008 R2 SP1 Feature Pack for SQL Server 2008, or Microsoft SQL Server 2012 Feature Pack for SQL Server 2012 in the Microsoft Download Center.

    Introduction to the Geolocation Field

    I’ll showcase what the Geolocation field can do for us in a SharePoint list. In the sample below I’ve used a list called “Scandinavian Microsoft Offices” which contains a few office names (Sweden, Denmark, Finland and Norway). What I want to do in my list is to display the location visually to my users, not only the name and address of the location. With the new Geolocation field you can display an actual map, as I’ll show you through right now – skip down to the “Adding a Geolocation Field to your list” section if you want to know how to get the same results yourself.

    A plain SharePoint list before I’ve added my Geolocation field


    As you can see, no modifications or extra awesomeness exist in this list view – it’s a vanilla SharePoint 2013 list view.

    The same list, with the Geolocation field added to it

    When we’ve added the Geolocation field to support our Bing Maps, you can see that a new column is displayed in the list view and you can interact with it. In my sample here I’ve filled in the coordinates for the four Microsoft offices I’ve listed in my list.


    Pressing the small globe icon brings up a nice hover-card kind of dialog with the actual map, with an option to view the entire map on Bing Maps as well (essentially just a link that takes you onwards to the actual Bing map):


    Viewing an actual list item looks like this, with the map fully integrated by default into the display form:


    And should you want to add or edit a list item with the Geolocation field, you can click either “Specify location” or “Use my location“. If your browser supports the usage and tracking of your location, you can use the latter alternative to have SharePoint automagically fill in your coordinates for you. Compare it with how you check in on Facebook and it recognizes your current location and can put a pin on the map for you.


    In my current setup I don’t have support for “Use my location” so I’ll have to go with the “Specify location” option – giving me this pretty dull dialog:


    As you can see, you don’t have an option to search for your office on Bing Maps and then select the search result to have it automatically insert the correct Lat/Long coordinates. But that’s where developers come in handy.
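    For those developers: a hedged sketch of a search-to-coordinates helper. The URL and JSON shape below follow the Bing Maps REST Locations API as I understand it, so verify them against the official documentation before relying on this; the parsing runs against a canned response so no network call is needed.

```python
# Hedged sketch: geocode a search string to Lat/Long for the Geolocation
# field. The endpoint and JSON shape are assumptions modelled on the Bing
# Maps REST Locations API -- check the official docs before using them.
import json
from urllib.parse import urlencode

def build_geocode_url(query, bing_maps_key):
    params = urlencode({"q": query, "key": bing_maps_key})
    return "http://dev.virtualearth.net/REST/v1/Locations?" + params

def parse_first_coordinates(response_text):
    data = json.loads(response_text)
    point = data["resourceSets"][0]["resources"][0]["point"]
    lat, lon = point["coordinates"]
    return lat, lon

# Canned response in the assumed shape, so the sketch runs offline:
canned = json.dumps({
    "resourceSets": [{"resources": [{"point": {"coordinates": [59.3293, 18.0686]}}]}]
})
print(build_geocode_url("Microsoft Office, Stockholm", "<your-bing-maps-key>"))
print(parse_first_coordinates(canned))  # (59.3293, 18.0686)
```

    The returned pair is exactly what you’d write into the Geolocation field’s latitude/longitude values.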

    Create a new Map View

    Let’s not forget about this awesome feature – you can create a new View in your list now, called a “Map View”, which will give you a pretty nice map layout of your tagged locations with pins on the map. Check these steps out:

    1) Head on up to “List” -> “Create View” in your List Ribbon Menu:


    2) Select the new “Map View”


    3) Enter a name, choose your fields and hit “Ok”


    4) Enjoy your newly created out of the box view in SharePoint. AWESOME


    Adding a Geolocation Field to your list

    Right, let’s move on to the fun part of actually adding the field to our list. I’m not sure it’s possible to add the field through the UI in SharePoint, but you can definitely add it using code and scripts, which is my preferred way to add stuff anyway.

    Adding a Geolocation field using PowerShell

    With the following PowerShell snippet you can easily add a new Geolocation field to your list:

    Add-PSSnapin Microsoft.SharePoint.PowerShell
    $web = Get-SPWeb "http://tozit-sp:2015"
    $list = $web.Lists["Scandinavian Microsoft Offices"]
    $list.Fields.AddFieldAsXml(
        "<Field Type='Geolocation' DisplayName='Office Location'/>",
        $true,
        [Microsoft.SharePoint.SPAddFieldOptions]::AddFieldToDefaultView)

    Adding a Geolocation field using the .NET Client Object Model

    With the following code snippet for the CSOM you can add a new Geolocation field to your list:

    // Hardcoded sample, you may want to use a different approach if you're planning to use this code :-)
    var webUrl = "http://tozit-sp:2015";
    ClientContext ctx = new ClientContext(webUrl);
    List officeLocationList = ctx.Web.Lists.GetByTitle("Scandinavian Microsoft Offices");
    officeLocationList.Fields.AddFieldAsXml(
        "<Field Type='Geolocation' DisplayName='Office Location'/>",
        true,
        AddFieldOptions.AddToAllContentTypes);
    ctx.ExecuteQuery();

    Adding a Geolocation field using the Javascript Client Object Model

    With the following code snippet for the JS Client Object Model you can add a new Geolocation field to your list:

    function AddGeolocationFieldSample() {
        var clientContext = new SP.ClientContext();
        var targetList = clientContext.get_web().get_lists().getByTitle('Scandinavian Microsoft Offices');
        var fields = targetList.get_fields();
        fields.addFieldAsXml(
            "<Field Type='Geolocation' DisplayName='Office Location'/>",
            true,
            SP.AddFieldOptions.addToDefaultContentType);
        clientContext.load(fields);
        clientContext.executeQueryAsync(Function.createDelegate(this, this.onContextQuerySuccess), Function.createDelegate(this, this.onContextQueryFailure));
    }

    Adding a Geolocation field using the Server Side Object Model

    With the following code snippet of server-side code you can add a new Geolocation field to your list:

    // Assumes you've got an SPSite object called 'site'
    SPWeb web = site.RootWeb;
    SPList list = web.Lists.TryGetList("Scandinavian Microsoft Offices");
    if (list != null)
        list.Fields.AddFieldAsXml("<Field Type='Geolocation' DisplayName='Office Location'/>",

    Be amazed, it’s that easy!

    Bing Maps – getting and setting the credentials in SharePoint

    Okay now I’ve added the fields to my lists and everything seems to be working out well, except for one little thing… The Bing Map tells me “The specified credentials are invalid. You can sign up for a free developer account at“, which could look like this:


    Get your Bing Maps keys

    If you don’t have any credentials for Bing Maps, you can easily fetch them by going to the specified URL and following these few simple steps.

    1) First off (after you’ve signed up or signed in), you’ll need to click on the “Create or view keys” link in the left navigation:


    2) Secondly, you will have to enter some information to create a new key and then click ‘Submit’:


    After you’ve clicked ‘Submit’ you’ll be presented with a list of your keys, looking something like this:


    Great, you’ve got your Bing Maps keys/credentials. Now we need to let SharePoint know about this as well!

    Telling SharePoint 2013 what credentials you want to use for the Bing Maps

    Okay – so by this time we’ve created a Geolocation field and set up a credential for our key with Bing Maps. But how does SharePoint know what key to use?

    Well, that’s pretty straightforward: the Property Bag on the SPWeb object has a property called “BING_MAPS_KEY” which allows us to configure our key.

    Since setting a property bag value is so straightforward, I’ll only use one code snippet to explain it; it should be easily translated to the other object models, should you have the need for it.

    Setting the BING MAPS KEY using PowerShell on the Farm

    If you want to configure one key for your entire farm, you can use the Set-SPBingMapsKey PowerShell cmdlet.

    Set-SPBingMapsKey -BingKey "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG"

    Setting the BING MAPS KEY using PowerShell on a specific Web

    Add-PSSnapin Microsoft.SharePoint.PowerShell
    $web = Get-SPWeb "http://tozit-sp:2015"
    $web.AllProperties["BING_MAPS_KEY"] = "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG"
    $web.Update()

    Update 2013-03-31: More examples of setting the property bag

    I got a comment in the blog about having more examples for various approaches (like CSOM/JS and not only PowerShell). Sure enough, here comes some simple samples for that.

    Setting the BING MAPS KEY using JavaScript Client Object Model on a specific Web

    var ctx = SP.ClientContext.get_current();
    var web = ctx.get_site().get_rootWeb();
    var webProperties = web.get_allProperties();
    webProperties.set_item("BING_MAPS_KEY", "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG");
    web.update();
    // Shoot'em queries away captain!
    ctx.executeQueryAsync(function () {
        alert("Success! The Bing Maps key was set.");
    }, function () {
        alert("Fail.. Doh!");
    });
    Setting the BING MAPS KEY using .NET Client Object Model on a specific Web

    // Set the Url to the site, or get the current context. Choose your own approach here..
    var ctx = new ClientContext("http://tozit-sp:2015/");
    var siteCollection = ctx.Site;
    var web = siteCollection.RootWeb;
    ctx.Load(web, w => w.AllProperties);
    ctx.ExecuteQuery();
    // Set the Bing Maps Key property and persist it
    web.AllProperties["BING_MAPS_KEY"] = "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG";
    web.Update();
    ctx.ExecuteQuery();

    So that’s pretty straightforward. Once you’ve set the Bing Maps key, the text in your maps disappears and you can start utilizing the full potential of the Geolocation field.


    The Geolocation field is pretty slick to play around with. It leaves a few holes in terms of functionality that we’ll have to fill ourselves – but of course that depends on our business requirements. One example is that I rarely want to enter the coordinates into the Geolocation field myself; I might just want to search for and select a location, which is then added with the coordinates populated into the field automatically, or use the built-in “Use my location” functionality. Good thing we’ve got developers to fine-tune these bits and pieces :-)


    Author: Tobias Zimmergren | | @zimmergren


    Recently someone asked me how to attack the major pain of upgrading their custom-coded projects and solutions from SharePoint 2010 to SharePoint 2013. Given that question and my experiences thus far, I’ll try to pinpoint the most important things to consider when upgrading. There’s TONS of things to think about, but we’ll touch on the most fundamental ones just to get up and running. After that I’m sure you’ll bump into a few more issues, and then you’re on your way ;-)

    Keep your developer tools updated

    Visual Studio 2012 Update 1

    The first step is to make sure that you’re running the latest version of Visual Studio 2012. As of this writing that means you should be running Visual Studio 2012 and then apply the Visual Studio 2012 Update 1 pack (vsupdate_KB2707250.exe) if it isn’t installed on your system already.

    Download Visual Studio 2012 Update 1 here:

    Visual Studio 2012 Office Developer Tools

    The second step is to make sure you’ve got the latest developer tools for SharePoint installed. The package comes as an update in the Web Platform Installer which I urge you to have installed on your dev-box if you for some reason don’t already have it installed.

    So, launch the Web Platform Installer and make a quick search for “SharePoint” and you should see the new developer tools there (note that the release date is 2013-02-26, which is the release date for the RTM tools):


    Select the “Microsoft Office Developer Tools for Visual Studio 2012” and click “Add“. It will ask you to install a bunch of prerequisites which you should accept if you want to continue:


    Let the tools be installed and the components updated. This could take anywhere from a few seconds to a few Microsoft minutes. It took about 5 minutes on my current development environment, so that wasn’t too bad.


    Once the tools are installed, you are ready to get going with your upgrade.

    Open your projects/solutions after upgrading Visual Studio 2012 with the latest tools

    When the tools have been successfully installed and you open your solution the new tools will be in effect. If you’re opening a SharePoint 2010 project that you wish to upgrade to SharePoint 2013, you’ll get a dialog saying “Do you want to upgrade <project name> to a SharePoint 2013 solution? Once the upgrade is complete, the solution can’t be deployed to SharePoint 2010. Do you want to continue?”


    Hit Yes if you get this dialog and want to upgrade your project to SharePoint 2013.

    Once the project is loaded and the tooling has made all the necessary changes to the project files (which it now does automatically, unlike the beta/preview tools where we had to do some manual tweaks), you should get an upgrade report telling you how things went. Hopefully there’ll be no errors, only warnings and messages.


    If you check out the assembly references in your project that are pointing to any SharePoint assemblies, note that they have automatically been updated to the correct version of the SharePoint 2013 assembly:


    Additional notes

    If you upgraded without the latest version of the developer tools, you only had the option to launch your projects in 2013 mode if you manually went into the .csproj file to modify (or add, if one of them was missing) the following two lines:


    This was true when the developer tools were in Preview/beta. But now when they’re released to RTM you shouldn’t be doing those manual hacks anymore. Trust the tools!

    Tip: Some general code updates that may be required

    When you deploy artifacts to the SharePointRoot folder in SharePoint 2013, they are now deployed to the /15 folder instead of the older /14 folder. SharePoint 2013 has much better support for upgrade scenarios than previous versions of SharePoint (2010), which is why we’ve got the double hives. So, if you want to properly upgrade your solution you should also make sure to replace all the paths in your project from:

    Path to the Images folder

    From the images folder:




    Path to the layouts folder

    Make sure to not forget the general layouts path:




    Path to the ControlTemplates folder

    Also make sure to replace the following paths:




    Well, you get the general idea. Should you find paths pointing to your old 14 hive instead of the new 15 folder, make sure to change the path/URL.
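    That idea boils down to a search-and-replace pass over your project files. Here’s a sketch of that pass; the old-to-new pairs below are the common 14-to-15 hive mappings (layouts, images under layouts, controltemplates), but double-check them against the paths your own solution actually uses.

```python
# A sketch of the upgrade tip above: rewrite old 14-hive URLs to the new
# /15 equivalents. Replacing "/_layouts/" also covers "/_layouts/images/",
# and the guard avoids double-applying to already-upgraded paths.

MAPPINGS = [
    ("/_layouts/", "/_layouts/15/"),
    ("/_controltemplates/", "/_controltemplates/15/"),
]

def upgrade_hive_paths(text):
    for old, new in MAPPINGS:
        if new in text:          # already upgraded -- don't double-apply
            continue
        text = text.replace(old, new)
    return text

print(upgrade_hive_paths("/_layouts/images/logo.png"))
# /_layouts/15/images/logo.png
print(upgrade_hive_paths("/_controltemplates/MyControl.ascx"))
# /_controltemplates/15/MyControl.ascx
```

    Run something like this over each file’s contents (or just use your IDE’s find-and-replace with the same pairs); the per-file guard here is deliberately crude, so review the diff before committing.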


    As always, you will not be an efficient developer without the proper tools at hand to make the daily tasks easier.

    If you enjoyed using CKS Dev for SharePoint 2010 development, you’ll still be able to enjoy some of that awesomeness by simply installing the CKS Dev tools for SharePoint 2010 on your Visual Studio 2012/SP2013 box. They seem to work fine on Visual Studio 2012 as well – so until there’s a proper update of the tools, you’ll be able to knacker some of your code with the old tools.

    Do note that there are certain features of CKS Dev that don’t work fully, so should you encounter issues with the tool in various scenarios, that’ll most likely be because it isn’t engineered for Visual Studio 2012 (yet).


    After you’ve done enough tinkering you’ll be ready to rock this baby up on SharePoint 2013.


    Author: Tobias Zimmergren | | @zimmergren


    Okay so this will be a pretty short one, but hopefully help some folks that are upgrading their solutions from SharePoint 2010 to SharePoint 2013.

    While developing fields, content types and the like in SharePoint 2010, there are always a few good rules and practices to follow. A good rule of thumb I tend to stick to is to never use a reserved or system name for the fields you create. In this quick post I’ll talk about how to fix the "Duplicate field name was found" error after you upgrade from 2010 to SharePoint 2013 and try to deploy and activate your feature(s).

    In one of my projects that I am involved in, I was tasked to upgrade their existing SharePoint 2010 solutions to SharePoint 2013 – and this is where these problems hit us, hard.

    A duplicate field name "Name" was found

    If you have upgraded your solution from SharePoint 2010 to SharePoint 2013 and you deploy your code, only to find out that you are getting the notorious error message saying "A duplicate field name ‘fieldname’ was found" you might think you did something wrong in the deployment steps or otherwise failed to successfully upgrade your solution. What actually might have happened is a case of the "don’t use any reserved or system names, please" fever.

    After some digging around our 30 projects, I found the features, and finally the fields, it was complaining about. While investigating the XML, I noted that the "Name" attribute was the failing factor. If we changed the Name property to something unique (hence, not a built-in field name), it seemed to work out nicely.

    Field xml for the SharePoint 2010 project (abbreviated; apart from Description and Group, the attribute values here are illustrative)

      <Field
        ID="{...}"
        Name="Name"
        Type="Text"
        Description="Short info on the tag"
        Group="My Awesome Fields" />

    Field xml for the 2013 project, after modification

      <Field
        ID="{...}"
        Name="TagInformation"
        Type="Text"
        Description="Short info on the tag"
        Group="My Awesome Fields" />

    What’s the difference?

    So if you look at the two basic samples above, you can see that the small difference is what’s in the "Name" property. If I changed the value to something unique, it started working immediately.

    But, doing this will of course bring up other questions that you need to take into consideration and think about.

    • Is there any code reliant on your field’s name property?
    • Will it break any functionality in your code or configured lists/views etc?
    • What happens to data that potentially will be migrated from the old environment into the new environment? Can you map the data to the correct fields properly?
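    For the data-migration question in particular, the core of a fix is a mapping table from the old internal field name to the new one; a hypothetical sketch (the field names are made up for illustration):

```python
# Hypothetical sketch: remap item data keyed on the old (reserved) internal
# field name to the new unique name during migration. The names here are
# made up -- use your own old/new internal field names.

FIELD_NAME_MAP = {"Name": "TagInformation"}

def remap_fields(item):
    return {FIELD_NAME_MAP.get(field, field): value
            for field, value in item.items()}

old_item = {"Name": "Short info on the tag", "Title": "My tag"}
print(remap_fields(old_item))
# {'TagInformation': 'Short info on the tag', 'Title': 'My tag'}
```

    The same table then doubles as the checklist for the first two bullets: any code or view referencing a key on the left-hand side needs updating too.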


    I thought I’d post this to save someone else from having to spend a few hours digging into the various bits and pieces of an enterprise project to find out where it breaks down after the upgrade. Should you encounter the error message in the title of this post immediately after upgrading your solutions, this may very well be the cause.

    Please also note that this solution is one solution to the problem. Perhaps there are more solutions available that we can use to fix these issues. Should you know of any, don’t hesitate to comment on the post and I’ll update it and give you the cred :-)

    SharePoint Server 2013 is an awesome product that is still uncharted territory for many organizations, but I’m seeing a huge increase in the adoption of 2013 locally and with that we’ll have plenty of time to dig into these fine bits of SharePoint magic :-)