Posts Tagged ‘SharePoint 2013’

Microsoft just released Service Pack 1 (SP1) for SharePoint Server 2013 and Office 2013 (and a bunch of other products that don’t quite relate to my focus).

As per request from a few clients and friends, here’s a quick link list with download details.

SharePoint Server 2013 SP1 improvements

In this spreadsheet you can get a full list of improvements in SP1. For SharePoint, these are the immediate changes:

Description of fixes

Metadata is lost when documents that use a custom content type with a "Description" field are opened for editing.

When an item is deleted, restored from recycle bin, and then deleted again, there is a primary key constraint error.

An error occurs when files are moved between document libraries and the web time zone is behind that of the server.

Metadata filtering at list level always lists all metadata terms.

The hyperlink popup window drops the selected word to be linked when there is a delay of more than one second in opening the window.

Multiple-column, SummaryLinkWebParts with a group heading style of "Separator" are rendered incorrectly.

A hash tag that contains a full width space does not get created successfully.

Search schema compression is now enabled by default to allow larger search schemas.

Hit highlighting is now enabled for FQL queries as well as KQL queries.

Opening a custom SharePoint list in datasheet view and applying multiple custom filters, where each filter has more than one condition, can result in an incomplete set of list items.

When the "Export to Excel" button is clicked in a SharePoint document library that has the Content Type field displayed, the Content Type field does not appear in the Excel workbook.

An error occurs after changing the "Manager" property in EditProfile.aspx page when the My Sites WebApp is not in the same farm as the UPA.

SharePoint REST API does not return a well-defined error response for a duplicate key exception.

Developers are unable to specify a Content Type ID when creating Content Types in the client object model.

On list views in SharePoint sites, the Connect to Outlook button in the ribbon may be erroneously disabled.

In some non-English languages of SharePoint, the text displayed in the callout UI for a document or list item, describing who last edited the item, may not be grammatically correct.

Copy and Paste in a datasheet does not work correctly with Internet Explorer 11.

Pages do not render in Safari for iPad when private browsing mode is used.

When editing rich text fields in SharePoint, if the editing session exceeds 30 minutes, the edits may not be saved.

An error that says "SCRIPT12004: An internal error occurred in the Microsoft Internet extensions" may occur intermittently when users visit their SkyDrive Pro or other pages on their personal site.

InfoPath may crash when a form that points to a SharePoint list, with a lookup to another SharePoint list, is opened.

An InfoPath form with extended characters in its name fails to open.

An error that says "Security Validation for the form has timed out" may occur when an InfoPath form is digitally signed and hosted in a SharePoint site collection that uses the SharePoint version 2010 user experience.

"Show document icon" remains unchecked and the document icon does not show in Edit Properties for a list item.

A "Failed tagging this page" error occurs when the "I like it" button is clicked.

The wrong term is removed when manually editing a multi-valued taxonomy field.

When tagging list items using a language that is different from the term store default language, suggestions for labels are offered in multiple languages. The suggestions appear confusing because both language suggestions are listed without any identification of the language.

An error that says "There was an error processing this request" may appear when editing the user profile.

Times are missing from Date/Time results in certain filtered list web service calls.

Minimal and no metadata are now enabled as supported JSON formats.

Actions4 schema workflow actions can’t be deployed to SharePoint.

When using the Client Object Model, Stream.Seek() does not seek to the proper offset.

Refreshing a workflow status page generates the following error: "System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary."

Setting custom, non-English outcomes in web pages on tasks in a workflow fails to set the value.

Configurations of SharePoint using Azure Hybrid mode and Workflow Manager together can cause workflow callbacks to fail.

Workflow task processes on wiki pages won’t start.

Workflows won’t wait for changes to content approval status fields.

E-mails generated by workflow cannot be disabled for approvals in SharePoint workflows.

Workflows may fail to send an e-mail or send too many e-mails.

Association variables do not update correctly for auto-start workflows.

A KeyNotFoundException error may occur in a workflow when the associated task list uses unique permissions.

Incomplete tasks are deleted when workflow task activities complete.

Task activity is suspended when the task is completed using app-only credentials.

An error that says "This task could not be updated at this time" occurs when trying to complete a workflow task using the "Open this task" button in Outlook.

A workflow doesn’t respond properly when waiting for changes in specific types of list columns, such as Boolean, Date Time, and User.

Details & Download

Slipstream install SharePoint Server 2013 with Service Pack 1

Yesterday (28th of February 2014), Microsoft released the slipstream installation media on MSDN. So if you’re aiming to install a new SharePoint 2013 environment, you can get the one with SP1 bundled into the installation.


Please note, as per Microsoft’s recommendations it is UNSUPPORTED to create your own slipstream installation. You should get the prepared slipstream installation as per above (MSDN, MPN or VLSC) instead. As quoted by Stefan Goßner:

A couple of people already tried to create their own slipstream version using RTM and SP1. Please avoid doing this as this is unsupported due to a change in the package layout which was introduced with March 2013 PU. Due to this change in the package layout it is not guaranteed that the manually created slipstream version will update all modules correctly. Ensure to download the official SharePoint 2013 with SP1 from one of the above listed sites.

Author: Tobias Zimmergren | @zimmergren


With a new year comes new technology, new business models and new adventures. Like previous years, I’ll post some predictions for what I believe are things that’ll happen during 2014, or at least will start to happen.

My previous predictions and year-in-review posts:

I usually write my predictions before the year begins, but hands down, I’ve been too busy helping my clients. That’s a convenient problem to have, of course. With that said, let’s quickly move on to the predictions for 2014.

Prediction for 2014

Here goes. My predictions for the year to come.

We will see more "cloud readiness"

In my "Year in review" for 2011, I wrote:

One thing that clearly sticks out in the year that passed is the preparation work being made for migrations to future versions of SharePoint and migrations into the cloud (Office 365 with SharePoint Online).

Looking back at 2012 and 2013, they truly were years of "cloud preparations" or "cloud readiness". Some of my clients have moved to the cloud entirely, some are deployed with hybrid solutions and a majority are still lingering on-premises. However, one thing that ties them all together is that they are all focused on aligning with the capabilities in the cloud. This means that the solutions being developed and the information structure are designed in such ways that they could potentially be migrated to the cloud one day, or align well with hybrid approaches.

I believe that 2014 will be a year where a wider range of organizations realize the value of going into the cloud, or the value of having their on-premises environment aligned with the capabilities in the cloud. Cloud doesn’t mean everything, but aligning your on-premises work with what happens in Office 365 is a very good idea.

Of course, in the end it all comes down to aligning your organization with the business, no matter which technology hosts your data.

Hybrid solutions with Office 365

I believe that 2014 will be a year where organizations embrace the fact that some things are easier to put in the cloud. Some things obviously need to stay in-house, but that doesn’t mean that everything has to.

My prediction in this space is that we will see more and more medium-sized businesses and even some enterprise-class businesses move to the cloud in hybrid models. Sensitive data stays in-house and services are offered from the cloud.

Growth of the SharePoint and Office App Marketplace

To give a more precise indication on this topic, we’d need Microsoft to disclose more information about today’s Marketplace stats. Even with the marketing buttons pushed really hard, it is probably safe to say that the Marketplace never got the traction Microsoft expected.

I do believe that in 2014 we’ll see some additions to the Marketplace both in terms of the underlying frameworks and app-publishing models but also in that more players will get onto the wagon and start publishing apps. I guess we’ll just have to wait and see in what direction it grows.

Some more attention for SkyDrive Pro

Microsoft is competing with a lot of other vendors such as Dropbox and Google Drive when it comes to consumer data storage in the cloud. But we are also seeing more competitors going after the enterprise customers, which means Microsoft will have to accelerate its business for SkyDrive Pro, which targets enterprise customers for storing data. (See Microsoft’s site for a comparison between SkyDrive (consumer) and SkyDrive Pro (enterprise), or watch this YouTube video).

Since Microsoft is pushing all-in on the cloud and their Office 365 services, I’m pretty confident that we’ll see some new features to the SkyDrive Pro offerings (and hopefully the normal SkyDrive too actually) during the year that comes.

Go social by integrating Yammer

Since Microsoft’s acquisition of Yammer, there’s been a ton of rumors about the future of social in SharePoint. My take is that Yammer will be more tightly integrated with SharePoint Online (Office 365), but we may have to wait longer for additional on-premises functionality.

My prediction for 2014 in this area is some key investments to enhance the cloud offerings first; additional features and integration points for on-premises deployments after that wouldn’t be unthinkable.

There is currently a published white paper on TechNet for "Integrating Yammer with on-premises SharePoint 2013 environments". In honest words, I’d say that this comes close to the worst white paper I have ever read on TechNet. It even has step-by-step instructions for how to change files in the SharePoint Root folder (15-hive), which is utterly forbidden, as any seasoned SharePoint expert well knows. It is safe to say that most on-premises environments I have come across aren’t integrating Yammer in an efficient way today.

Microsoft, please bring forth the awesomeness – and a clear roadmap for the foreseeable future.

Customization madness has to stop

Microsoft is pretty clear on this point. Don’t customize.

I will have to align with those words, and given the impression I have from fellow business partners and community followers I would assume that 2014 will be a year where a lot of organizations revise their customizations.

Just think about it for a second; Have you ever had problems upgrading just because of some nonsense customizations? Thought so. I know I have.

My prediction here is that people will start to realize the importance of a clean environment when the updates start to roll out more often. While aligning our on-premises deployments with the offering in the cloud, this is an extremely important consideration. This doesn’t mean that your SharePoint environment will look like any other SharePoint environment – you can still modify some look and feel, but the awareness of staying on the right side of the "recommended practices" is increasing.

Stop solving problems you don’t have (over-engineering)

Some folks call it over-engineering, some folks call it over-architecting and some folks just say "Don’t solve problems you don’t have".

With the guidelines to stay as close to the cloud offering as possible, it is also important to accept that you should no longer solve problems you don’t have. What I mean is that oftentimes organizations have some awesome members on their team, and some of these awesome members may be gurus with code and infrastructure – but if you don’t actually have a business case, don’t implement a solution for it. If 10 out of 10,000 users want a specific solution, I wouldn’t call it a business case. Spending hours, days or even weeks or months (been there, seen that) on developing a custom solution that wasn’t requested by the organization, but rather by a few single individuals, will cost you more than it will benefit you. Especially if the developers of these solutions like to over-architect their solutions with unnecessary complexity.

Keep it simple and revise the requirements before you decide to implement a solution. Revise the solution proposal before you decide to have it developed.

My prediction for 2014 is that more teams and organizations will get the hang of how this works, and will be more inclined to make efficient decisions that align their business with their technical implementation.

We will see more subscription-based solutions and services

Overall, I think we will see more subscription-based services and products offered not only by Microsoft but by partners and ISVs as well.

In the enterprise I believe we’ll see some changes for vendors and services – the option to subscribe instead of paying license fees up front is getting more common. Cheap per-user licenses that you can stop paying for if you cancel the service – that’s a winning deal, and I believe the enterprise is no different from the consumer market on this point. Who wouldn’t want to lower up-front costs and pay for usage instead?


Honestly, I’ve got tons more things I’d like to discuss for the future, but I think I’ve covered what I believe are the most interesting things to keep in mind and watch out for in 2014.


Author: Tobias Zimmergren | @zimmergren


Every cycle of SharePoint comes with challenges around upgrades and migrations. In one of my current projects I’ve been part of designing an iterative upgrade process – as I like to call it – which means we’ll be upgrading our farm (all content databases) from SharePoint 2010 to SharePoint 2013 every week. Yes, that’s right – we upgrade SharePoint 2010 to SharePoint 2013 every single week with the latest content from the production environments. This of course happens on a separate SharePoint 2013 farm set up specifically for this purpose.

In this article I’ll talk about my “Iterative Upgrade Process” and what it means in terms of benefits for the teams and people involved with your intranet, extranet, public site or whatever you are using your SharePoint farm for.

Please do note that this is not an article describing the steps to upgrade your SharePoint environment – this is a process for iterative development and iterative testing in environments that we tear down, build up and upgrade from SharePoint 2010 to SharePoint 2013 every week.

Background: Everyone is affected by an upgrade or migration

It’s not uncommon to bump into a lot of problems while upgrading your farms from one version to the next. Common problems include customizations, faulty configurations and general bad practices implemented in the original farm. But for most organizations an upgrade doesn’t “just happen” overnight with everything working flawlessly – on the contrary, there’s always a bunch of problems brought to light that need to be taken care of.

Working a lot on SharePoint intranets like I currently do, we often see problems before, during and after upgrades. Not only technical issues that we can handle, but issues with people in the organization not reaching the right information or not being able to perform their daily tasks. This can have serious impacts on the organization if the migration fails to run smoothly and everything isn’t up and running within the service windows you’ve specified.

The end users are affected in terms of downtime and possibly being hindered from performing their tasks, which in the end hurts the organization, since they are the ones who keep the organization running! The IT departments (or technical folks in your organization involved with SharePoint) may be affected if the migration or upgrade doesn’t go as planned. The business as a whole relies on the system functioning, and for every minute or hour that the systems aren’t fully available, the organization may lose both time and money.

So in order to minimize the pain of upgrading from one version of SharePoint to another, we need to consider the implications of a troublesome upgrade. With the iterative upgrade process we’ve got in place right now at one of my clients, you can test and verify all your changes, customizations and whatever else you want to assure the quality of – over and over again before the real deal.

Implementation: How am I doing this?

Boiling down the steps in our iterative upgrade process gives something similar to this:


In a nutshell, this is what the process looks like from a bird’s-eye perspective, even though some of the steps require an extensive amount of preparation work and time to get done. Below is an explanation of all these steps in more detail, to give you an understanding of what this really means.

Setup SP 2013 Farm

The very first step is to set up and configure the SharePoint 2013 farm where our upgraded content will eventually land. In our case we’ve set this up as a one-off configuration, re-using the 2013 farm on every new iteration. You could – as an alternative – argue that it would be beneficial to tear down the entire farm and set it up again. It would be theoretically possible, but in our specific case it simply doesn’t work that easily – too many dependencies rely on things outside of my and my team’s control.

Uninstall any custom solutions

This step is of course only necessary if you’ve already upgraded at least once in the new farm. By the time your next iterative upgrade is scheduled, you’ll need to uninstall any and all old solutions in order to clean up the farm a bit before we proceed.
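A minimal PowerShell sketch of this cleanup, assuming every old package in the farm should go (retraction runs as a timer job, hence the wait loop):

```powershell
# Retract and remove every farm solution before the next iteration.
Get-SPSolution | ForEach-Object {
    if ($_.Deployed) {
        Uninstall-SPSolution -Identity $_.Name `
            -AllWebApplications:$_.ContainsWebApplicationResource -Confirm:$false
        # Retraction is a timer job; wait for it to finish before removing the package.
        while ((Get-SPSolution -Identity $_.Name).JobExists) { Start-Sleep -Seconds 5 }
    }
    Remove-SPSolution -Identity $_.Name -Confirm:$false
}
```

If some solutions must survive between iterations, filter the Get-SPSolution output accordingly before the loop.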

Remove any content databases

Again, this step is only necessary if you’ve already upgraded at least once in the new farm. If you have, there’ll be upgraded content databases that you need to get rid of before we move on to the next step. We’re doing this with the PowerShell cmdlet Remove-SPContentDatabase.
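Sketched in PowerShell, with the web application URL as a placeholder – note that Remove-SPContentDatabase drops the database from SQL Server as well (use Dismount-SPContentDatabase if you only want to detach it from SharePoint):

```powershell
# Detach and delete every upgraded content database from the 2013 web
# application, making room for fresh clones of the 2010 databases.
Get-SPContentDatabase -WebApplication "http://sp2013.contoso.local" |
    Remove-SPContentDatabase -Confirm:$false
```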

Deploy SP 2010 Solutions

Deploy your old 2010 solutions. The reason we want to do this is that when we later perform the actual mount of the databases, it’s pretty nice if the mounting process can find the references to the features, web parts and any other resources within those solutions. This is a temporary deployment, and the 2010 packages are soon to be removed again.
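Roughly like this, with a placeholder path – the sketch assumes web-application-scoped packages (globally scoped ones are installed without -AllWebApplications):

```powershell
# Temporarily add and deploy the old 2010 WSPs so the upcoming mount can
# resolve references to their features, web parts and other resources.
# -CompatibilityLevel 14 targets the SharePoint 2010 (14) hive.
Get-ChildItem "C:\Deploy\SP2010\*.wsp" | ForEach-Object {
    Add-SPSolution -LiteralPath $_.FullName
    Install-SPSolution -Identity $_.Name -AllWebApplications `
        -GACDeployment -CompatibilityLevel 14
}
```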

Copy fresh databases

The next step is to ensure that the SQL Server in our new 2013 farm contains the latest content databases from the SharePoint 2010 farm. This is why we’re using FlexClone (described in more detail further down in this article). FlexClone makes virtual copies which are true clones, without demanding additional storage space. Pow! Totally awesome.

Attach databases

After the databases are copied over, we’ll have to attach them to SQL Server as you would normally do.

Mount databases

The next step is where we mount the actual databases to SharePoint. The databases are still in SharePoint 2010 mode, since the copies of our databases come from the SharePoint 2010 environment. This is why we need to have our 2010 WSP solutions in place before we perform the mount – otherwise reading the mount logs will be… well, not so fun ;)

We do this with the PowerShell cmdlet Mount-SPContentDatabase.
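For example (database names, server name and URL are placeholders for our environment):

```powershell
# Mount each copied 2010 content database to the 2013 web application.
# The databases stay in 2010 (14) mode until Upgrade-SPSite runs later.
"WSS_Content_Intranet_01", "WSS_Content_Intranet_02" | ForEach-Object {
    Mount-SPContentDatabase -Name $_ `
        -DatabaseServer "SQL01" `
        -WebApplication "http://sp2013.contoso.local"
}
```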

Uninstall SP 2010 Solutions

When the mounting is done, we’ll need to uninstall the 2010 version of the old solutions and move on to the next step.

Deploy upgraded SP 2013 Solutions

Yay, finally we’re at a much cooler step – deploying SharePoint 2013 solutions. So, to give a little background on what these solutions should be:

You should have already upgraded your SharePoint projects to SharePoint 2013 solutions; have them ready to use in this step.

Notes: This is probably the most time-consuming step if you have custom solutions. Anything you’ve built and customized in SharePoint 2010 needs to be upgraded to SharePoint 2013 and work there as well. The good thing is that with an iterative upgrade process, we can fine-tune this every day and just hit a button to re-upgrade the farm with the latest builds in our test and pre-production environments. Yay!

Upgrade-SPSite with all Site Collections

Once the new and freshly upgraded 2013 packages have been deployed, we’ll continue by upgrading the actual Site Collections from SharePoint 2010 mode to SharePoint 2013 mode.

We’ll be using the PowerShell cmdlet Upgrade-SPSite for every one of our Site Collections.
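A minimal sketch of that loop – -Unthrottled bypasses the site collection upgrade queue throttle, which is fine in a controlled test farm:

```powershell
# Upgrade every site collection still in 2010 (14) mode to 2013 (15) mode.
Get-SPSite -Limit All -CompatibilityLevel 14 | ForEach-Object {
    Upgrade-SPSite -Identity $_.Url -VersionUpgrade -Unthrottled
}
```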

Misc automated configuration scripts

We’ve got a lot of custom scripts that run after the upgrade, as part of the finalization of the actual upgrade. This includes custom re-branding scripts, re-creation of My Sites and moving content between old and new My Sites, custom scripts to disable and remove artifacts that aren’t used in SharePoint 2013 projects and solutions anymore, modifications to removed or altered content types, and so on. The list can be made long – if you’re reading this you’ve probably already understood that each scenario is unique, but this process can be applied to most scenarios with a tweak here and there.

Tools: What tools are we using to make this happen?

Obviously things don’t get done by themselves, so I’ve automated much of the process with various tools and techniques, described below.

Deployment Automation with Team City

There are tons of ways to automate things in any ALM cycle. Be it a development lifecycle or an infrastructure upgrade lifecycle like this one – something to automate the process will be your best bet. Since we’re doing this every week, and the process itself is pretty complex with plenty of steps that need to be done properly, I’ve chosen Team City for all of the automation work.

I’ve gotten the question why use Team City instead of TFS Build or Jenkins or any other available build automation tools. Simply put: Team City is free for up to 20 configurations, easy (very very easy) to configure, works with multiple data sources and repositories and it just works – every time. But that’s a discussion for another day.

Database copies with Flexclone

In order to easily get set up with the databases in our environments, we’ve been using NetApp’s FlexClone software very successfully over the last year. As quoted from their own website:

NetApp® FlexClone® technology instantly replicates data volumes and datasets as transparent, virtual copies—true clones—without compromising performance or demanding additional storage space.

So in essence, FlexClone allows us to replace all of the databases in our test and pre-production environments with (almost) a single click, and get real copies of the actual production environments in a matter of minutes. There’s no denying that this is awesomeness in its true form.

Iterative code upgrades with Visual Studio 2013

In order to maintain and upgrade the new codebase (upgraded from SharePoint 2010), we’re using Visual Studio 2013, like most professional Microsoft-related developers do today. You can use VS 2012 as well, should you like – but do try out 2013 if you can; it’s considerably faster than previous versions of Visual Studio.

I have pushed hard for implementing a real ALM process in the team, and we’ve finally got that in place and it’s working pretty nicely right now. We’re using TeamCity to automate builds with continuous integration, nightly builds and scheduled and on-demand deployments to our environments. I will cover code automation more thoroughly in another post in the future, as it would be too much info to cover in this single post.


So this is a process we follow every week. Once a week I tear down the entire SP 2013 test farm and rig up a new snapshot of the databases on the SQL environment. Then I re-run this upgrade process (Team City, PowerShell and PowerShell Remoting to the rescue). This means we can literally try what the real upgrade will be like once we get there. Every week. We also get a nice, agile, iterative way of handling bugs that appear in the environments.

Oh yeah, should we break something – we click a button or two and we’ve got a freshly upgraded environment with the latest builds from the SP 2013 dev rigs.

It simplifies the overall process:

  • When time comes for the real upgrade, everything including upgraded code base and automated upgrade scripts is in place!
  • Find and report errors early in the upgrade process of your project
  • Find compatibility errors in code and solutions
  • Find out what will upgrade, and what will not upgrade, before it’s too late
  • Be confident that once we reach the point of upgrade, we’ve done it so many times already that we know what might go wrong
  • The Product Owners, Project Managers, Testers and any other involved people have already verified the state of the system, so once we hit the button in the Production environments – we’re pretty much in an “accepted release” state already.

I hope you enjoyed this little read about my iterative upgrade process. It’s pretty darn good if you ask me – it requires some time to set up initially, but in larger projects it’s definitely worth it!


Author: Tobias Zimmergren | @zimmergren


So recently, while working with the (awesome!) Work Management Service Application in some of our environments, we ran into the common problem of not receiving any actual tasks on our My Sites. Instead, we see this message:

Last updated at 1/1/1901 12:00 AM

Now, throw out a Google query and you’ll find plenty of resources and fixes describing how to configure the permissions of your service applications in order to make this service work.

My Solution

Due to various policies, restrictions and IT-related constraints, we couldn’t just configure permissions any way we wanted. So we needed to figure out another way to fix this simple problem.

The solution is simple, for us:

  • Delete your Work Management Service Application
  • Re-create a new Work Management Service Application
    • Create a new Application Pool, but use the same account as for the Application Pool hosting your My Sites/Social or Portal.
  • Run a crawl
    • Incremental, continuous or full crawl should suffice.
  • Bingo.
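The steps above can be sketched in PowerShell roughly like this (service application names and the account are placeholders – the key point is re-using the My Sites application pool account):

```powershell
# Remove the broken Work Management Service Application (and its data).
Get-SPServiceApplication |
    Where-Object { $_.TypeName -like "Work Management*" } |
    Remove-SPServiceApplication -RemoveData -Confirm:$false

# Re-create it on an application pool running as the same account
# as the application pool that hosts the My Sites web application.
$pool = New-SPServiceApplicationPool -Name "WorkManagementAppPool" `
    -Account (Get-SPManagedAccount "CONTOSO\sp_mysites_pool")
$wmsa = New-SPWorkManagementServiceApplication -Name "Work Management Service" `
    -ApplicationPool $pool
New-SPWorkManagementServiceApplicationProxy -Name "Work Management Service Proxy" `
    -ServiceApplication $wmsa | Out-Null

# Kick off a full crawl so task aggregation picks up the My Sites content.
$ssa = Get-SPEnterpriseSearchServiceApplication
Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssa |
    ForEach-Object { $_.StartFullCrawl() }
```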

In some scenarios this may work; in others it may not. For our various farms (test, pre-production, production) it works great, and given that it works in three different environments (with different accounts et al), it’s pretty neat.

After the crawl did its job, I could start engaging the Tasks list on my My Site with collective tasks from throughout my entire farm:


Looks like it did the trick, and the tasks are now working like a charm including all data related to the task.


Other options

If this still doesn’t work, check out this TechNet article about configuring the service permissions. Doing the above and configuring the permissions should definitely do the trick.

And here’s another tip if you’re still having issues:


Instead of messing about with permissions (for various reasons), we’ve managed to get it started and working by simply configuring the same Application Pool account. Should that not suffice, a combination will most likely work.

Author: Tobias Zimmergren | @zimmergren


In one of the projects I’m currently involved in, we’re in the process of upgrading from SharePoint 2010 to SharePoint 2013. One of the problems we faced was the fact that we had some orphaned content databases in our production environments; the problem didn’t surface in SharePoint 2010, but came to light in 2013. So this short post describes how I fixed those issues, which was a bit of a pain to be honest.

In the environments we’re working in, we’ve set up a scheduled upgrade that takes place once every week. The reason for this is to re-iterate the upgrade process as many times as we can, with production data, before the actual upgrade, which will take place later down the road when all bugs, code tweaks/customizations and other random problems have been taken care of. One of the problems that surfaced recently was that we couldn’t create any new Site Collections, where the ULS spat out the unfortunate message:

Application error when access /_admin/createsite.aspx, Error=Object reference not set to an instance of an object.  
at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(IEnumerable`1 contentDatabases, Guid siteIdToAvoid, Guid webIdToAvoid, SPContentDatabase database, SPContentDatabase databaseTheSiteWillBeDeletedFrom)

While it took some time to boil this down to the nuts and bolts of what was going on, here are the details in case you end up with the same issues.

Cannot create new Site Collections

So the problem we faced – not being able to create new Site Collections – surfaced in the ULS logs, stating this message:

Application error when access /_admin/createsite.aspx, Error=Object reference not set to an instance of an object.  

at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(IEnumerable`1 contentDatabases, Guid siteIdToAvoid, Guid webIdToAvoid, SPContentDatabase database, SPContentDatabase databaseTheSiteWillBeDeletedFrom)    

at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(SPSiteCreationParameters siteCreationParameters, Guid siteIdToAvoid, Guid webIdToAvoid, SPContentDatabase database, SPContentDatabase databaseTheSiteWillBeDeletedFrom)    

at Microsoft.SharePoint.Administration.SPContentDatabaseCollection.FindBestContentDatabaseForSiteCreation(SPSiteCreationParameters siteCreationParameters)    

at Microsoft.SharePoint.Administration.SPSiteCollection.Add(SPContentDatabase database, SPSiteSubscription siteSubscription, String siteUrl, String title, String description, UInt32 nLCID, Int32 compatibilityLevel, String webTemplate, String ownerLogin, String ownerName, String ownerEmail, String secondaryContactLogin, String secondaryContactName, String secondaryContactEmail, String quotaTemplate, String sscRootWebUrl, Boolean useHostHeaderAsSiteName, Boolean overrideCompatibilityRestriction)    

at Microsoft.SharePoint.Administration.SPSiteCollection.Add(SPSiteSubscription siteSubscription, String siteUrl, String title, String description, UInt32 nLCID, Int32 compatibilityLevel, String webTemplate, String ownerLogin, String ownerName, String ownerEmail, String secondaryContactLogin, String secondaryContactName, String secondaryContactEmail, Boolean useHostHeaderAsSiteName)    

at Microsoft.SharePoint.ApplicationPages.CreateSitePage.BtnCreateSite_Click(Object sender, EventArgs e)    

at System.Web.UI.WebControls.Button.RaisePostBackEvent(String eventArgument)    

at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)

After some Reflector magic and investigation, I found that the specific method causing the problem was looking for the best Content Database to put the new Site Collection in. While doing this, it obviously wants to balance the Site Collections so that they are evenly distributed across the Content Databases.

The reason we got this error message is invalid references in our Config database, pointing to Content Databases that no longer exist, for whatever reason. As a result, the method tried to create the new Site Collection in a Content Database that didn't really exist, even though SharePoint thought it did.

Steps to find and kill the broken/invalid references to the non-existent content databases

After some SQL magic, finding the null references was rather easy. Following these steps allowed me to figure out the details of the broken databases:

Step 1: Get the Web Application ID

Either use SharePoint Manager or a quick PowerShell statement to figure out the GUID of the Web Application where the problem persists:

$wa = Get-SPWebApplication "http://your-webapp-url"   # Replace with the URL of your Web Application
$wa.Id

Obviously you should note/save this ID for reference in the next steps.

Step 2: Query your Config database for the appropriate information

With the ID saved, head over to your SQL Server and run this query (replace GUID with the ID of your Web Application):

USE SP13_Config
SELECT ID, CAST(Properties as XML) AS 'Properties'
FROM Objects
WHERE ID = 'GUID' -- GUID of the Web Application

As you can see, the CAST(Properties as XML) part of the query gives you a clickable link in the results window, giving you a nice overview of the XML it represents. Thanks to a SQL friend of mine for pointing that out; it saved the day :-)

Here's what the result looks like (1 row):


Step 3: Investigate the returned results (XML) and find your null-values

Click the XML link, find the section containing Microsoft.SharePoint.Administration.SPContentDatabaseCollection, and see if there is any place where the fld value is null, something like this:


As you can see, most of the databases in our environment have an sFld and a fld XML node where the GUID of the database is stored. However, in some cases (two places in our environment!) you may find that it says null instead. That is essentially your invalid reference, pointing to nothing at all, and SharePoint then tries to create the Site Collection in the Content Database with the null fld.

As with previous steps, make a note of the GUID from your broken database references.
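If you prefer scripting the inspection instead of eyeballing the XML by hand, the idea can be sketched as follows. Note that this is an illustration only: the sample XML below is a made-up, simplified shape, not the real serialized format SharePoint stores in the Config database.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified stand-in for the Properties XML: healthy
# database references carry a GUID in their "fld" node, while broken
# references carry no GUID at all.
properties_xml = """
<databases>
  <fld type="Guid">0dd9bd2a-0000-0000-0000-000000000001</fld>
  <fld type="null" />
  <fld type="Guid">0dd9bd2a-0000-0000-0000-000000000002</fld>
</databases>
"""

def find_broken_references(xml_text):
    """Return positions of fld nodes holding no GUID (broken references)."""
    root = ET.fromstring(xml_text)
    return [
        i for i, fld in enumerate(root.findall("fld"))
        if fld.get("type") == "null" or not (fld.text or "").strip()
    ]

print(find_broken_references(properties_xml))  # prints [1]: the second entry is broken
```

The real Properties blob is more deeply nested, so in practice you would adapt the element lookup, but the null-detection logic stays the same.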

Step 4: Delete the database(s) using PowerShell

The best way we found to delete these references was PowerShell. At first I didn't think it had actually worked, because PowerShell throws some errors on the screen, but after re-running the SQL query it was clear the command had removed the invalid reference. Under the hood it works the right magic, leaving us with an intact and working farm again.

So, make sure you've got the IDs of your broken databases and, first and foremost, make sure that you haven't copied the incorrect GUID (!). What I did was simply query my Web Application for the ID and Name of all Content Databases, so I could be sure I wasn't deleting an actual Content Database by mistake.

Verification command:

$wa.ContentDatabases | ft ID, Name

Running this command gave us a list of databases, so we could verify that the GUIDs we copied didn't represent any of our real, intact databases:


Great. Now that I'm sure the IDs I copied aren't the IDs of intact production databases, but represent my broken references, I can execute the delete command on those buggers!

In order to do that, I ran a PowerShell command along the lines of the following (reconstructed here, since the original screenshot is gone; the ContentDatabases collection exposes a Delete method that takes a database ID):

$wa.ContentDatabases.Delete("GUID-of-the-broken-database")


The results were as follows, causing a lot of nice error messages. However, the magic under the hood still worked:


Step 5: Verify by running the SQL query again

So PowerShell throws an error message stating "Object reference not set to an instance of an object." However, under the hood the magic has been applied properly, and in my Config database the invalid values are now deleted, as can be verified by re-running the SQL query:



Well, I’ve learnt a lot this week about the Config database and playing around with the GUIDs within. The scary part was that these errors didn’t surface in SharePoint 2010, but they did in 2013 once we upgraded. Another good reason to get a good iterative upgrade-routine in place before an actual upgrade is attempted.

Speaking of iterative upgrade processes, I might discuss that in a future post: how we run our upgrades every week without lifting a finger (almost) :-)


Author: Tobias Zimmergren | | @zimmergren


In one of my previous articles, we investigated some of the new and awesome delegate controls in SharePoint 2013. It walks you through some of the interesting DelegateControl additions I was playing around with. On top of that, I got a comment about how you could extend it further by adding the current site title to the Suite bar:


Sure enough, I took a dive into Seattle.master to look at how the title is rendered and found a control called SPTitleBreadcrumb. This is the control responsible for rendering the default out-of-the-box title on a normal SharePoint team site, like this:


So, to follow the question through and provide an answer to get you started, we'll take a quick look at how we can build on the old sample from the previous blog post and add the site title (including or excluding the breadcrumb functionality) to the Suite bar.

Investigating the out of the box Seattle.master

In the OOTB Seattle.master, the title is rendered using the following code snippet:

<h1 id="pageTitle" class="ms-core-pageTitle">
  <SharePoint:AjaxDelta id="DeltaPlaceHolderPageTitleInTitleArea" runat="server">
    <asp:ContentPlaceHolder id="PlaceHolderPageTitleInTitleArea" runat="server">
      <SharePoint:SPTitleBreadcrumb runat="server" RenderCurrentNodeAsLink="true" SiteMapProvider="SPContentMapProvider" CentralAdminSiteMapProvider="SPXmlAdminContentMapProvider">
        <PATHSEPARATORTEMPLATE>
          <SharePoint:ClusteredDirectionalSeparatorArrow runat="server" />
        </PATHSEPARATORTEMPLATE>
      </SharePoint:SPTitleBreadcrumb>
    </asp:ContentPlaceHolder>
  </SharePoint:AjaxDelta>
</h1>
<SharePoint:AjaxDelta BlockElement="true" id="DeltaPlaceHolderPageDescription" CssClass="ms-displayInlineBlock ms-normalWrap" runat="server">
  <a href="javascript:;" id="ms-pageDescriptionDiv" style="display:none;">
    <span id="ms-pageDescriptionImage">&#160;</span>
  </a>
  <span class="ms-accessible" id="ms-pageDescription">
    <asp:ContentPlaceHolder id="PlaceHolderPageDescription" runat="server" />
  </span>
  <SharePoint:ScriptBlock runat="server"><!-- ... --></SharePoint:ScriptBlock>
</SharePoint:AjaxDelta>

What we can see in this file is that there's a lot of action going on simply to render the title (or title + breadcrumb). You can play around with this in tons of ways, both server side and client side. In this article we'll look at how we can extend the Suite bar delegate control from my previous article in order to, using server-side code, modify the title and breadcrumb and move them around a bit.

Should you want to get the title using jQuery or client side object models, that works fine too. But we can save that for another post.

Adding the Title Breadcrumb to the Suite bar

I’m going to make this short and easy. The very first thing you should do is head on over to my previous article “Some new DelegateControl additions to the SharePoint 2013 master pages” and take a look at the “SuiteBarBrandingDelegate Delegate Control” section and make sure you’ve got that covered.

Once you've got that set up, here are some simple additional tweaks you can make to your Delegate Control in order for the breadcrumb to be displayed in the top row of SharePoint. Modify the content of the delegate control's code-behind file (in your case the file name may differ from my previous sample) to look something like this:

protected void Page_Load(object sender, EventArgs e)
{
    // Register any custom CSS we may need to inject, unless we've added it previously through the masterpage or another delegate control...
    Controls.Add(new CssRegistration { Name = "/_layouts/15/Zimmergren.DelegateControls/Styles.css", ID = "CssReg_SuiteBarBrandingDelegate", After = "corev5.css" });

    // Render the clickable logo, carried over from the previous article's sample.
    // Note: the logo path and alt text below are placeholders; adjust to your own.
    BrandingTextControl.Controls.Add(new Literal
    {
        Text = string.Format("<a href='{0}'><img src='{1}' alt='{2}' /></a>",
            SPContext.Current.Site.Url,
            "/_layouts/15/Zimmergren.DelegateControls/Logo.png",
            "Back to the root site")
    });

    // Create a new Title Breadcrumb Control
    SPTitleBreadcrumb titleBc = new SPTitleBreadcrumb();
    titleBc.RenderCurrentNodeAsLink = true;
    titleBc.SiteMapProvider = "SPContentMapProvider";
    titleBc.CentralAdminSiteMapProvider = "SPXmlAdminContentMapProvider";
    titleBc.CssClass = "suitebar-titlebreadcrumb";
    titleBc.DefaultParentLevelsDisplayed = 5;

    // Add the Title Breadcrumb Control to the branding container
    BrandingTextControl.Controls.Add(titleBc);
}

As an indication, the end-result might look something like this when you’re done. What we’ve done is simply copied the logic from the Seattle.master into the code behind file of our delegate control and set the “DefaultParentLevelsDisplayed” to a higher number than 0 so it’ll render the actual breadcrumb. By setting this value to 0, only the title will be displayed.


Then, if you want to hide the default title, you can do so with this small CSS snippet (the default title lives in the element with id "pageTitle", as seen in the master page markup earlier):

#pageTitle {
    display: none;
}

And it’s gone:


From there you should be able to take it onwards and upwards in terms of styling. I haven't really put any effort into making it pretty here :-)


With these small additions and changes to my original code samples you can make the title bar, including or excluding the breadcrumb, appear in the top bar instead of in the default title-area.

Additional important comments:

You may need to ensure that the site map data source is available on every page, including system pages. If it isn't, or you land on a page that doesn't render your breadcrumb/title, the navigation may not render as you would expect. That's something to look into further from this point.

For example, by default the “Site Content” link will not render the full breadcrumb properly, but rather just say “Home”. In order to fix smaller issues like that, we can further extend the code logic a few lines and take care of those bits.

My recommendation:

Always consider the approach you take for any type of customization and development. In this specific case we had already used code-behind to display our logo in the top-left corner, so we just built some additional sample code on top of that to render the breadcrumbs. However, if rendering the breadcrumb were the only goal, I would most likely suggest doing it with jQuery/CSOM instead, to stay "Office 365 compliant" and keep your customizations to a minimum.

Hope you enjoyed this small tweak. And keep in mind: Recommendations going forward (however hard it’ll be to conform to them) are to keep customizations to a minimum!

Cheers, Tob.

Author: Tobias Zimmergren | | @zimmergren


SharePoint 2013 comes with tons of enhancements and modifications compared to previous versions of the product. One of the cool features I've played around with lately is the Geolocation field. Back in 2010 I wrote a custom-coded solution for displaying location information in our SharePoint lists, integrating some fancy-pants Google Maps. In SharePoint 2013, a similar field exists out of the box.

In this article I'll cover what this field does, walk through a sample of creating and using the field, and show how to get and set the Bing Maps key to make sure our maps work and display properly.

Update 2013-09-14: As pointed out by Leon Zandman in the comments, there are some updated prerequisites required in order to view the Geolocation field value or data in a list. Information from Microsoft:

An MSI package named SQLSysClrTypes.msi must be installed on every SharePoint front-end web server to view the geolocation field value or data in a list. This package installs components that implement the new geometry, geography, and hierarchy ID types in SQL Server 2008. By default, this file is installed for SharePoint Online. However, it is not for an on-premises deployment of SharePoint Server 2013. You must be a member of the Farm Administrators group to perform this operation. To download SQLSysClrTypes.msi, see Microsoft SQL Server 2008 R2 SP1 Feature Pack for SQL Server 2008, or Microsoft SQL Server 2012 Feature Pack for SQL Server 2012 in the Microsoft Download Center.

Introduction to the Geolocation Field

I’ll showcase what the Geolocation field can do for us in a SharePoint list. In the sample below I’ve used a list called “Scandinavian Microsoft Offices” which contains a few office names (Sweden, Denmark, Finland and Norway). What I want to do in my list is to display the location visually to my users, not only the name and address of the location. With the new Geolocation field you can display an actual map, as I’ll show you through right now – skip down to the “Adding a Geolocation Field to your list” section if you want to know how to get the same results yourself.

A plain SharePoint list before I’ve added my Geolocation field


As you can see, no modifications or extra awesomeness exist in this list view – it’s a vanilla SharePoint 2013 list view.

The same list, with the Geolocation field added to it

When we’ve added the Geolocation field to support our Bing Maps, you can see that a new column is displayed in the list view and you can interact with it. In my sample here I’ve filled in the coordinates for the four Microsoft offices I’ve listed in my list.


Pressing the small globe icon will bring up a nice hover card kind of dialog with the actual map, with options to view the entire map on Bing Maps as well (which is essentially just a link that’ll take you onwards to the actual bing map):


Viewing an actual list item looks like this, with the map fully integrated by default into the display form:


And should you want to add or edit a list item with the Geolocation field, you can click either "Specify location" or "Use my location". If your browser supports the usage and tracking of your location, you can use the latter to have SharePoint automagically fill in your coordinates for you. Compare it with checking in on Facebook, which recognizes your current location and can put a pin on the map for you.


In my current setup I don’t have support for “Use my location” so I’ll have to go with the “Specify location” option – giving me this pretty dull dialog:


As you can see, there's no option to search for your office on Bing Maps and then select a search result to have the correct Lat/Long coordinates inserted automatically. But that's where developers come in handy.

Create a new Map View

Let’s not forget about this awesome feature – you can create a new View in your list now, called a “Map View”, which will give you a pretty nice map layout of your tagged locations with pins on the map. Check these steps out:

1) Head on up to “List” -> “Create View” in your List Ribbon Menu:


2) Select the new “Map View”


3) Enter a name, choose your fields and hit “Ok”


4) Enjoy your newly created out of the box view in SharePoint. AWESOME


Adding a Geolocation Field to your list

Right, let's move on to the fun part of actually adding the field to our list. I'm not sure whether it's possible to add the field through the UI in SharePoint, but you can definitely add it using code and scripts, which is my preferred way to add stuff anyway.

Adding a Geolocation field using PowerShell

With the following PowerShell snippet you can easily add a new Geolocation field to your list:

Add-PSSnapin Microsoft.SharePoint.PowerShell

$web = Get-SPWeb "http://tozit-sp:2015"
$list = $web.Lists["Scandinavian Microsoft Offices"]
$list.Fields.AddFieldAsXml(
    "<Field Type='Geolocation' DisplayName='Office Location'/>",
    $true,
    [Microsoft.SharePoint.SPAddFieldOptions]::AddFieldToDefaultView)

Adding a Geolocation field using the .NET Client Object Model

With the following code snippet for the CSOM you can add a new Geolocation field to your list:

// Hardcoded sample, you may want to use a different approach if you're planning to use this code :-)
var webUrl = "http://tozit-sp:2015";

ClientContext ctx = new ClientContext(webUrl);
List officeLocationList = ctx.Web.Lists.GetByTitle("Scandinavian Microsoft Offices");
officeLocationList.Fields.AddFieldAsXml(
    "<Field Type='Geolocation' DisplayName='Office Location'/>",
    true,
    AddFieldOptions.AddFieldToDefaultView);
ctx.ExecuteQuery();

Adding a Geolocation field using the Javascript Client Object Model

With the following code snippet for the JS Client Object Model you can add a new Geolocation field to your list:

function AddGeolocationFieldSample() {
    var clientContext = new SP.ClientContext();
    var targetList = clientContext.get_web().get_lists().getByTitle('Scandinavian Microsoft Offices');
    var fields = targetList.get_fields();
    fields.addFieldAsXml(
        "<Field Type='Geolocation' DisplayName='Office Location'/>",
        true,
        SP.AddFieldOptions.addFieldToDefaultView);
    clientContext.load(fields);
    clientContext.executeQueryAsync(Function.createDelegate(this, this.onContextQuerySuccess), Function.createDelegate(this, this.onContextQueryFailure));
}

Adding a Geolocation field using the Server Side Object Model

With the following code snippet of server-side code you can add a new Geolocation field to your list:

// Assumes you've got an SPSite object called 'site'
SPWeb web = site.RootWeb;
SPList list = web.Lists.TryGetList("Scandinavian Microsoft Offices");
if (list != null)
{
    list.Fields.AddFieldAsXml(
        "<Field Type='Geolocation' DisplayName='Office Location'/>",
        true,
        SPAddFieldOptions.AddFieldToDefaultView);
    list.Update();
}
Be amazed, it's that easy!

Bing Maps – getting and setting the credentials in SharePoint

Okay, now I've added the field to my lists and everything seems to be working out well, except for one little thing... The Bing Map tells me "The specified credentials are invalid. You can sign up for a free developer account at ...", which could look like this:


Get your Bing Maps keys

If you don't have any credentials for Bing Maps, you can easily fetch them by going to the URL specified in that message and following these few simple steps.

1) First off (after you’ve signed up or signed in), you’ll need to click on the “Create or view keys” link in the left navigation:


2) Secondly, you will have to enter some information to create a new key and then click ‘Submit’:


After you’ve clicked ‘Submit’ you’ll be presented with a list of your keys, looking something like this:


Great, you’ve got your Bing Maps keys/credentials. Now we need to let SharePoint know about this as well!

Telling SharePoint 2013 what credentials you want to use for the Bing Maps

Okay – so by this time we’ve created a Geolocation field and set up a credential for our key with Bing Maps. But how does SharePoint know what key to use?

Well, that's pretty straightforward: the SPWeb object has a Property Bag, and a property called "BING_MAPS_KEY" allows us to configure our key.

Since setting a property bag is so straight forward I’ll only use one code snippet sample to explain it – it should be easily translated over to the other object models, should you have the need for it.

Setting the BING MAPS KEY using PowerShell on the Farm

If you instead want to configure one key for your entire farm, you can use the Set-SPBingMapsKey PowerShell Cmdlet.

Set-SPBingMapsKey -BingKey "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG"

Setting the BING MAPS KEY using PowerShell on a specific Web

Add-PSSnapin Microsoft.SharePoint.PowerShell

$web = Get-SPWeb "http://tozit-sp:2015"
$web.AllProperties["BING_MAPS_KEY"] = "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG"
$web.Update()

Update 2013-03-31: More examples of setting the property bag

I got a comment on the blog asking for more examples of the various approaches (like CSOM/JS, not only PowerShell). Sure enough, here are some simple samples for that.

Setting the BING MAPS KEY using JavaScript Client Object Model on a specific Web

var ctx = SP.ClientContext.get_current();
var web = ctx.get_site().get_rootWeb();
var webProperties = web.get_allProperties();

webProperties.set_item("BING_MAPS_KEY", "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG");
web.update();

// Shoot'em queries away captain!
ctx.executeQueryAsync(function () {
    alert("Success!");
}, function () {
    alert("Fail.. Doh!");
});
Setting the BING MAPS KEY using .NET Client Object Model on a specific Web

// Set the Url to the site, or get the current context. Choose your own approach here..
var ctx = new ClientContext("http://tozit-sp:2015/");

var siteCollection = ctx.Site;
var web = siteCollection.RootWeb;

// Set the Bing Maps Key property
web.AllProperties["BING_MAPS_KEY"] = "FFDDuWzmanbiqeF7Ftke68y4K8vtU1vDYFEWg1J5J4o2x4LEKqJzjDajZ0XQKpFG";
web.Update();
ctx.ExecuteQuery();

// Load the properties back if you want to verify that the key was set
ctx.Load(web, w => w.AllProperties);
ctx.ExecuteQuery();

So that’s pretty straight forward. Once you’ve set the Bing Maps Key, you can see that the text in your maps has disappeared and you can now start utilizing the full potential of the Geolocation field.


The Geolocation field is pretty slick to play around with. It leaves a few holes in terms of functionality that we'll have to fill ourselves, depending on our business requirements. One example: I rarely want to enter the coordinates into the Geolocation field myself. I might just want to search for and select a location, which is then added with the coordinates populated into the field automatically, or use the built-in "Use my location" functionality. Good thing we've got developers to fine-tune these bits and pieces :-)


Author: Tobias Zimmergren | | @zimmergren


Recently someone asked me how to attack the major pain of upgrading their custom-coded projects and solutions from SharePoint 2010 to SharePoint 2013. Given that question and my experiences thus far, I'll try to pinpoint the most important things to consider when upgrading. There are TONS of things to consider, but we'll touch on the most fundamental ones just to get up and running. After that I'm sure you'll bump into a few more issues, and then you're on your way ;-)

Keep your developer tools updated

Visual Studio 2012 Update 1

The first step is to make sure that you’re running the latest version of Visual Studio 2012. As of this writing that means you should be running Visual Studio 2012 and then apply the Visual Studio 2012 Update 1 pack (vsupdate_KB2707250.exe) if it isn’t installed on your system already.

Download Visual Studio 2012 Update 1 here:

Visual Studio 2012 Office Developer Tools

The second step is to make sure you’ve got the latest developer tools for SharePoint installed. The package comes as an update in the Web Platform Installer which I urge you to have installed on your dev-box if you for some reason don’t already have it installed.

So, launch the Web Platform Installer and make a quick search for “SharePoint” and you should see the new developer tools there (note that the release date is 2013-02-26, which is the release date for the RTM tools):


Select the “Microsoft Office Developer Tools for Visual Studio 2012” and click “Add“. It will ask you to install a bunch of prerequisites which you should accept if you want to continue:


Let the tools be installed and the components updated. This could take anywhere from a few seconds to a few Microsoft minutes. It took about 5 minutes on my current development environment, so that wasn’t too bad.


Once the tools are installed, you are ready to get going with your upgrade.

Open your projects/solutions after upgrading Visual Studio 2012 with the latest tools

When the tools have been successfully installed and you open your solution the new tools will be in effect. If you’re opening a SharePoint 2010 project that you wish to upgrade to SharePoint 2013, you’ll get a dialog saying “Do you want to upgrade <project name> to a SharePoint 2013 solution? Once the upgrade is complete, the solution can’t be deployed to SharePoint 2010. Do you want to continue?”


Hit Yes if you get this dialog and want to upgrade your project to SharePoint 2013.

Once the project is loaded and the tooling has made all the necessary changes to the project files (which it now does automatically, unlike the beta/preview tools where we had to do some manual tweaks), you should get an upgrade report telling you how things went. Hopefully there will be no errors, only warnings and messages.


If you check out the assembly references in your project that are pointing to any SharePoint assemblies, note that they have automatically been updated to the correct version of the SharePoint 2013 assembly:


Additional notes

If you upgraded without the latest version of the developer tools, you could only launch your projects in 2013 mode by manually going into the .csproj file to modify (or add, if one of them was missing) the following two lines:


This was true when the developer tools were in Preview/Beta. Now that they're released as RTM, you shouldn't be doing those manual hacks anymore. Trust the tools!

Tip: Some general code updates that may be required

When you deploy artifacts to the SharePoint root folder in SharePoint 2013, they are now deployed to the /15 folder instead of the older /14 folder. SharePoint 2013 has much better support for upgrade scenarios than previous versions (2010), which is why we've got the double hives. So, if you want to properly upgrade your solution, you should also make sure to update all such paths in your project:

Path to the Images folder

From the images folder:

/_layouts/images/

To the new 15-hive equivalent:

/_layouts/15/images/

Path to the layouts folder

Make sure not to forget the general layouts path:

/_layouts/

To the new 15-hive equivalent:

/_layouts/15/

Path to the ControlTemplates folder

Also make sure to replace the ControlTemplates path:

/_controltemplates/

To the new 15-hive equivalent:

/_controltemplates/15/

Well, you get the general idea. Should you find paths pointing to your old 14 hive instead of the new 15 folder, make sure to change the path/URL.
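To illustrate the idea, here's a small sketch (in Python, purely for illustration; this hypothetical helper is not part of any SharePoint tooling) of the kind of search-and-replace you'd run across your project files:

```python
import re

# Rewrite pre-2013 virtual paths so they point at the /15 hive.
# The negative lookahead leaves paths that already contain /15/ alone.
HIVE_RULES = [
    (re.compile(r"/_layouts/(?!15/)"), "/_layouts/15/"),
    (re.compile(r"/_controltemplates/(?!15/)"), "/_controltemplates/15/"),
]

def upgrade_hive_paths(text):
    """Upgrade 14-hive style URLs in a blob of project text to 15-hive URLs."""
    for pattern, replacement in HIVE_RULES:
        text = pattern.sub(replacement, text)
    return text

print(upgrade_hive_paths("/_layouts/images/MyProject/logo.png"))
# -> /_layouts/15/images/MyProject/logo.png
print(upgrade_hive_paths("/_controltemplates/15/MyControl.ascx"))
# -> /_controltemplates/15/MyControl.ascx (already correct, untouched)
```

In practice you'd run something like this over your modules, ASCX registrations, and CAML files, then review the diff before committing.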


As always, you will not be an efficient developer without the proper tools at hand to make the daily tasks easier.

If you enjoyed using CKS Dev for SharePoint 2010 development, you'll still be able to enjoy some of that awesomeness by simply installing the CKS Dev tools for SharePoint 2010 on your Visual Studio 2012/SP2013 box. They seem to work fine in Visual Studio 2012 as well, so until there's a proper update of the tools you'll be able to knacker some of your code with the old ones.

Do note that certain features of CKS Dev don't work fully, so should you encounter issues with the tool in various scenarios, that's most likely because it isn't engineered for Visual Studio 2012 (yet).


After you’ve done enough tinkering you’ll be ready to rock this baby up on SharePoint 2013.


Author: Tobias Zimmergren | | @zimmergren


Okay so this will be a pretty short one, but hopefully help some folks that are upgrading their solutions from SharePoint 2010 to SharePoint 2013.

While developing fields, content types and the like in SharePoint 2010, there are always a few good rules and practices to follow. A good rule of thumb I tend to stick to is to never use a reserved or system name for the fields I create. In this quick post I'll talk about how to fix the "A duplicate field name was found" error that appears after you upgrade from SharePoint 2010 to SharePoint 2013 and try to deploy and activate your feature(s).

In one of the projects I'm involved in, I was tasked with upgrading the existing SharePoint 2010 solutions to SharePoint 2013, and this is where these problems hit us, hard.

A duplicate field name "Name" was found

If you have upgraded your solution from SharePoint 2010 to SharePoint 2013 and you deploy your code, only to find out that you are getting the notorious error message saying "A duplicate field name 'fieldname' was found", you might think you did something wrong in the deployment steps or otherwise failed to upgrade your solution successfully. What actually might have happened is a case of the "don't use any reserved or system names, please" fever.

After some digging around our 30 projects, I found the features, and eventually the fields, that SharePoint was complaining about. While investigating the XML, I noted that the "Name" attribute was the failing factor. If we changed the Name property to something unique (hence, not a built-in field name), it worked out nicely.

Field XML for the SharePoint 2010 project (simplified; the surrounding attributes here are reconstructed placeholders, but note the Name property):

<Field
  Type="Text"
  Name="Name"
  DisplayName="Tag Information"
  Description="Short info on the tag"
  Group="My Awesome Fields" />

Field XML for the modified 2013 project, after changing the Name to something unique (again simplified with placeholder attributes):

<Field
  Type="Text"
  Name="TagInformation"
  DisplayName="Tag Information"
  Description="Short info on the tag"
  Group="My Awesome Fields" />

What’s the difference?

So if you look at the two basic samples above, you can see that the small difference lies in the "Name" property. When I changed the value to something unique, it started working immediately.

But, doing this will of course bring up other questions that you need to take into consideration and think about.

  • Is there any code reliant on your field’s name property?
  • Will it break any functionality in your code or configured lists/views etc?
  • What happens to data that potentially will be migrated from the old environment into the new environment? Can you map the data to the correct fields properly?
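As a rough illustration of the first question, you could scan your solution for internal field names that clash with built-in names before deploying. Here's a tiny sketch; note that the reserved-name set below is just a small sample I've picked for illustration, not the full catalogue of built-in SharePoint field names:

```python
# Small sample of built-in SharePoint internal field names; the real
# list is much longer, so treat this as an illustration of the check,
# not a complete reserved-name catalogue.
RESERVED_FIELD_NAMES = {"Name", "Title", "ID", "Author", "Editor", "Modified", "Created"}

def find_reserved_clashes(field_names):
    """Return the candidate internal names that clash with built-in fields."""
    return [name for name in field_names if name in RESERVED_FIELD_NAMES]

clashes = find_reserved_clashes(["Name", "TagInformation"])
print(clashes)  # ['Name'] is the kind of name that triggers the duplicate field error
```

Running a check like this over the Name attributes in your field element XML during the upgrade would have flagged our problem before deployment.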


I thought I'd post this to save someone else from spending a few hours digging into the various bits and pieces of an enterprise project to find out where it breaks down after the upgrade. Should you encounter the error message in the title of this post immediately after upgrading your solutions, this may very well be the cause.

Please also note that this is one solution to the problem. Perhaps there are other solutions we can use to fix these issues. Should you know of any, don't hesitate to comment on the post and I'll update it and give you the cred :-)

SharePoint Server 2013 is an awesome product that is still uncharted territory for many organizations, but I'm seeing a huge increase in 2013 adoption locally, and with that we'll have plenty of time to dig into these fine bits of SharePoint magic :-)

Author: Tobias Zimmergren | | @zimmergren


In my previous post I talked about how you could use the new delegate controls in the master page (seattle.master) to modify a few things in the SharePoint UI, including the text in the top-left corner saying "SharePoint". If your goal is simply to change that text, or hardcode a link without the need for any code-behind, you can do it even more easily with PowerShell.

Changing the SharePoint text to something else using PowerShell





PowerShell Snippet

$webApp = Get-SPWebApplication "http://tozit-sp:2015"
$webApp.SuiteBarBrandingElementHtml = "Awesome Text Goes Here"
$webApp.Update()



Author: Tobias Zimmergren | | @zimmergren


In this post we'll take a quick look at some of the new DelegateControls I've discovered in SharePoint 2013, and at how you can replace or add information to your new master pages using these controls, without modifying the master pages themselves. This is done exactly the same way as in 2010 (and 2007) projects; the only addition in this case is a few new controls, which we'll investigate.

New DelegateControls

Searching through the main master page, Seattle.master, I’ve found these three new DelegateControls:

  • PromotedActions
  • SuiteBarBrandingDelegate
  • SuiteLinksDelegate

So let’s take a look at where these controls are placed on the Master page and how we can replace them.

PromotedActions Delegate Control

The PromotedActions delegate control allows you to add your own content to the following area on a SharePoint site in the top-right section of the page:


An example of adding an additional link may look like this:


So what do the files look like for these parts of the project?


<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- DelegateControl reference to the PromotedActions Delegate Control -->
  <Control ControlId="PromotedActions"
           ControlSrc="/_controltemplates/15/Zimmergren.DelegateControls/PromotedAction.ascx"
           Sequence="1" />
</Elements>

PromotedActions.ascx (User Control)

<!-- Note: I've removed the actual Facebook-logic from this snippet for easier overview of the structure. -->
<a title="Share on Facebook" class="ms-promotedActionButton" style="display: inline-block;" href="#">
    <span class="s4-clust ms-promotedActionButton-icon" style="width: 16px; height: 16px; overflow: hidden; display: inline-block; position: relative;">
        <img style="top: 0px; position: absolute;" alt="Share" src="/_layouts/15/images/Zimmergren.DelegateControls/facebookshare.png"/>
    </span>
    <span class="ms-promotedActionButton-text">Post on Facebook</span>
</a>

SuiteBarBrandingDelegate Delegate Control

This DelegateControl will allow you to override the content that is displayed in the top-left corner of every site. Normally, there’s a text reading "SharePoint" like this:


If we override this control we can easily replace the content here. For example, most people would probably like to add either a logo or at least make the link clickable so you can return to your Site Collection root web. Let’s take a look at what it can look like if we’ve customized it (this is also a clickable logo):


So what do the files look like for this project?


<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- SuiteBarBrandingDelegate (the top-left "SharePoint" text on a page) -->
  <Control ControlId="SuiteBarBrandingDelegate"
           ControlSrc="/_controltemplates/15/Zimmergren.DelegateControls/SuiteBarBrandingDelegate.ascx"
           Sequence="1" />
</Elements>

SuiteBarBrandingDelegate.ascx (User Control)

This is the only content in my User Control markup:

<div class="ms-core-brandingText" id="BrandingTextControl" runat="server" />

SuiteBarBrandingDelegate.ascx.cs (User Control Code Behind)

protected void Page_Load(object sender, EventArgs e)
{
    // Render a clickable logo linking back to the site collection root.
    // The image path and alt text below are sample values.
    BrandingTextControl.Controls.Add(new Literal
    {
        Text = string.Format("<a href='{0}'><img src='{1}' alt='{2}' /></a>",
            SPContext.Current.Site.Url,
            "/_layouts/15/images/Zimmergren.DelegateControls/Logo.png",
            "Zimmergren Sample Logo")
    });
}

SuiteLinksDelegate Delegate Control

The SuiteLinksDelegate control allows us to modify the default links, and to add our own links, in the "suite links" section:


By adding a custom link to the collection of controls, it can perhaps look like this:


What do the project files look like for modifying the SuiteLinksDelegate? Well, here’s an example:


<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- DelegateControl reference to the SuiteLinksDelegate Delegate Control -->
  <Control ControlId="SuiteLinksDelegate"
           ControlSrc="/_controltemplates/15/Zimmergren.DelegateControls/SuiteLinksDelegate.ascx"
           Sequence="1" />
</Elements>


SuiteLinksDelegate.ascx.cs (User Control Code Behind)

public partial class SuiteLinksDelegate : MySuiteLinksUserControl
{
    protected override void Render(HtmlTextWriter writer)
    {
        writer.Write("<style type=\"text/css\">.ms-core-suiteLinkList {display: inline-block;}</style>");
        writer.AddAttribute(HtmlTextWriterAttribute.Class, "ms-core-suiteLinkList");
        writer.RenderBeginTag(HtmlTextWriterTag.Ul);
        // The true/false parameter controls whether the link should render as the active one.
        // Since I'm shooting this off to an external URL, it will never be active.
        RenderSuiteLink(writer, "", "Time Report", "ReportYourTimeAwesomeness", false);
        writer.RenderEndTag();
        base.Render(writer);
    }
}


Solution overview

For reference: I’ve structured the project in a way where I’ve put all the changes into one single Elements.xml file and they’re activated through a Site Scoped feature called DelegateControls. The solution is a Farm solution and all artifacts required are deployed through this package.



In this post we’ve looked at how we can customize some of the areas in a SharePoint site without using master page customizations. We’ve used the good-old approach of hooking up a few Delegate Control overrides to our site collection. Given the approach of Delegate Controls, we can easily just de-activate the feature and all our changes are gone. Simple as that.

In SharePoint 2013 we can still do Delegate Control overrides just as we did in 2007 and 2010 projects, and it’s still pretty slick. I haven’t investigated any master pages other than Seattle.master yet – perhaps there are more new delegate controls elsewhere. Let’s find out..


Author: Tobias Zimmergren | | @zimmergren


In one of my previous posts I talked about "Using the SPField.JSLink property to change the way your field is rendered in SharePoint 2013". That article talks about how we can set the JSLink property in our SharePoint solutions and how we easily can change the rendering of a field in any list view.

It’s pretty slick and I’ve already had one client make use of the functionality in their pilot SharePoint 2013 farm. However, I got a comment from John Lui asking what the performance would be like when executing the same iterative code over thousands of items. Since I hadn’t done any real tests with this myself, I thought it’d be a good time to try to pinpoint and measure whether the client-side load times differ when using the JSLink property.


The tests will be performed WITH the JSLink property set and then all tests will be executed WITHOUT the JSLink property set.

I’ve set up my scenario with various views on the same list. The total item count of the list is 5000 items, but we’ll base our tests on the limit of our views:

  • Test 1: View limit of 100 items
  • Test 2: View limit of 500 items

The results will be output with a difference between using the JSLink property and not using it. Should be fun.


I’be been using these tools for measuring the performance on the client:

Fiddler4, YSlow, IE Developer Tools

The code that will be executed will be the same JavaScript code as I had in my previous article.

SharePoint 2013 browser compatibility

If you intend to measure performance on various browsers, make sure you’ve pinned down what browsers are actually supported by SharePoint 2013. The following versions of browsers are supported:

  • IE 10, IE 9, IE 8
  • Chrome
  • Safari
  • Firefox

Let’s start measuring

Let’s start by taking a look at how I’ve done my tests and what the results were.

Testing methods

Each test was executed 100 times, and the loading-time result I’ve used for each test is the average of the 100 attempts. These tests were executed in a virtual environment, so obviously the exact timings will differ from your environments – what’s mostly interesting here is the relative difference between having JSLink set and not set when rendering a list view, so that’s what we’ll focus on.
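The number-crunching behind those averages is trivial, but for completeness, here’s a sketch of it in JavaScript (the function names are mine, not part of any test harness):

```javascript
// Average a set of load-time samples (in milliseconds).
function averageMs(samples) {
    var sum = 0;
    for (var i = 0; i < samples.length; i++) {
        sum += samples[i];
    }
    return sum / samples.length;
}

// The "difference" reported in the results below: average load time
// with JSLink set, minus the average baseline load time without it.
function jsLinkOverheadMs(withJsLinkSamples, baselineSamples) {
    return averageMs(withJsLinkSamples) - averageMs(baselineSamples);
}
```

For example, `jsLinkOverheadMs([4000, 4200], [3100, 3300])` gives an overhead of 900 ms.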

I’ve performed these tests on IE 10 and the latest version of Firefox. Since older browsers may handle scripts in a less efficient way than these two versions of the browsers, you may experience different results when using for example IE 8.

Results overview

SharePoint 2013 is pretty darn fantastic in the way it renders contents and pages. The measurements I’ve done here are based on the entire page and all its content loading. The chrome of the page (navigation, headers and so on) loads instantly – literally in less than 25 ms – but the entire page takes longer, since rendering the content of the list view takes considerably more time. Here’s the output…

Using 100 Item Limit in the View


Difference: 969 milliseconds


There’s not really much to argue about with the default 100-item limit. There’s a difference of almost one second, which is pretty bad to be honest. I would definitely revise these scripts and optimize the performance if I wanted quicker load times. However, if I changed the scripts and removed the rendering of images and used plain text instead, there was very little difference. So I guess it comes down to what you actually put into those scripts and how you optimize your JavaScript.

Using 500 Item Limit in the View


Difference: 529 milliseconds


The load times are naturally longer when returning 500 items, but the difference was smaller on the larger result set. I also performed the same tests using a 1000-item limit in the view, and the difference per page load was between 500 ms and 1000 ms – essentially the same as in these two tests. If your page takes 7-8 seconds to load without the use of JS Link, like these samples did in the virtual environment, I’d probably focus on fixing that before being too concerned about the impact of the JS Link rendering process. However, be advised that if you put more advanced logic into the scripts, it may very well be worth your while to draft up some tests for it.

Things to take into consideration

  • The sample script here only replaces some strings based on the context object and replaces with an image. No heavy operations.
  • Replacing strings with images took a considerably longer time to render than just replacing text and render. Consider the code you put in your script and make sure you’ve optimized it for performance and scope your variables properly and so on.
  • Take your time to learn proper JavaScript practices. It’ll be worth it in the end if you’re going to do a lot of client side rendering stuff down the road.
  • If you’ve got access to Scot Hillier’s sessions from SPC12, review them!
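If you want to draft up such tests for your own rendering logic, a crude client-side timing harness could look like the sketch below. This is only an illustration – the browser’s profiler, or tools like Fiddler and YSlow mentioned above, will give you far more reliable numbers:

```javascript
// Run a render callback `iterations` times and return the total
// elapsed time in milliseconds. Useful for rough A/B comparisons of
// two variants of a JSLink rendering function.
function timeIt(fn, iterations) {
    var start = Date.now();
    for (var i = 0; i < iterations; i++) {
        fn();
    }
    return Date.now() - start;
}
```

For example, `timeIt(myFieldRenderer, 10000)` run against an image-based variant and a text-only variant will quickly show which one is the expensive part.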


It’s not very often I’ve seen anyone use 1000 items as the item limit per view in an ordinary List View Web Part. Most of my existing configurations use 100 or fewer (most likely around 30) items per page for optimal performance – however, should you have larger views, you should of course consider the impact the rendering will have if you decide to hook up your own custom client-side rendering awesomeness.

You’ll notice the biggest relative difference in page load times if you’ve got a smaller item row limit in your view, simply because using the custom JS Link property appears to add between 500 and 1000 milliseconds whether I’m returning 100 items, 500 items or 2500 items in my view. Worth considering.

With that said – it’s a pretty cool feature, and I’ve already seen a lot more use cases where my clients could utilize these types of customizations. It’s a SUPER-AWESOME way to customize how your list renders data, instead of converting your List View Web Parts (or Xslt List View Web Parts and so on) into Data View Web Parts like some people did with SharePoint Destroyer.. Err.. SharePoint Designer. For me as a developer/IT/farm admin guy this will (hopefully) make upgrades easier as well, since the list itself will be less customized and only loads an external script to make the customizations appear. Obviously I’m hoping for all scripts to end up in your code repositories with revision history, fully documented and so on – but then again I do like to dream :-)


SP 2013: Searching in SharePoint 2013 using the new REST APIs

December 26th, 2012 by Tobias Zimmergren

Author: Tobias Zimmergren | | @zimmergren


Search has always been a great way to create custom solutions that aggregate and find information in SharePoint. With SharePoint 2013, the search capabilities have been heavily invested in, and we see a lot of new ways to perform our searches. In this post I’ll show a simple example of how you can utilize the REST Search APIs in SharePoint 2013 to perform a search.

REST Search in SharePoint 2013

So in order to get started, we’ll need an ASPX page containing some simple markup, and a JavaScript file where we’ll put the functions to be executed. The approach mentioned in this post is also compatible with SharePoint Apps, should you decide to develop an App that relies on Search. In my example I’ve created a custom page that loads my jQuery and JavaScript files, and I’m deploying those files to the SiteAssets library using a Module.

Preview of the simple solution


Creating a Search Application using REST Search Api’s

Let’s examine how you can construct your search queries using REST formatting and by simply changing the URL to take in the proper query strings.

Formatting the Url

By formatting the Url properly you can retrieve search results pretty easily using REST.

Query with querytext

If you simply want to return all results from a search, with no limits or filters, request a URL formatted like this (substituting your own server name):

http://yourserver/_api/search/query?querytext='Awesome'

This will yield the following result:


Essentially we’re getting a bunch of XML returned from the query which we then have to parse and handle somehow. We can then use the result of this query in our application in whatever fashion we want. Let’s see what a very simple application could look like, utilizing the SharePoint 2013 rest search api’s.
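Before we look at the application itself, it can help to see how such a query URL is assembled. The querytext, rowlimit and selectproperties parameter names are part of the SharePoint 2013 search REST API; the helper function itself is just my own illustration:

```javascript
// Build a SharePoint 2013 search REST URL from a web URL and a query.
// Optional rowLimit / selectProperties map to the REST API's
// rowlimit and selectproperties query string parameters.
function buildSearchUrl(webUrl, query, options) {
    options = options || {};
    // Single quotes inside the query text are doubled,
    // per OData string-literal escaping rules.
    var url = webUrl + "/_api/search/query?querytext='" +
        encodeURIComponent(query.replace(/'/g, "''")) + "'";
    if (options.rowLimit) {
        url += "&rowlimit=" + options.rowLimit;
    }
    if (options.selectProperties) {
        url += "&selectproperties='" + options.selectProperties.join(",") + "'";
    }
    return url;
}
```

For example, `buildSearchUrl("http://tozit-sp:2015", "Awesome", { rowLimit: 10 })` produces `http://tozit-sp:2015/_api/search/query?querytext='Awesome'&rowlimit=10`.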

ASP.NET Markup Snippet

As we can see in the following code snippet, we’re simply loading a few jQuery and JavaScript files that we require and we define a Search Box and a Button for controlling our little Search application.

<!-- Load jQuery 1.8.3 -->
<script type="text/javascript" src="/SiteAssets/ScriptArtifacts/jquery-1.8.3.min.js"></script>

<!-- Load our custom Rest Search script file -->
<script type="text/javascript" src="/SiteAssets/ScriptArtifacts/RestSearch.js"></script>

<!-- Add a Text Box to use as a search box -->
<input type="text" value="Awesome" id="searchBox" />

<!-- Add a button that will execute the search -->
<input type="button" value="Search" onclick="executeSearch()" />

<div id="searchResults"></div>

JavaScript logic (RestSearch.js)

I’ve added a file to the project called RestSearch.js, which is a custom javascript file containing the following code which will perform an ajax request to SharePoint using the Search API’s:

// In reality we should put this inside our own namespace, but this is just a sample.
var SPSearchResults;

// Called from the ASPX Page
function executeSearch() {
    var query = $("#searchBox").val();

    SPSearchResults = {
        element: '',
        url: '',

        init: function (element) {
            SPSearchResults.element = element;
            SPSearchResults.url = _spPageContextInfo.webAbsoluteUrl + "/_api/search/query?querytext='" + query + "'";
        },

        load: function () {
            $.ajax({
                url: SPSearchResults.url,
                method: "GET",
                headers: {
                    "accept": "application/json;odata=verbose"
                },
                success: SPSearchResults.onSuccess,
                error: SPSearchResults.onError
            });
        },

        onSuccess: function (data) {
            var results = data.d.query.PrimaryQueryResult.RelevantResults.Table.Rows.results;

            var html = "<div class='results'>";
            for (var i = 0; i < results.length; i++) {
                var d = new Date(results[i].Cells.results[8].Value);
                var currentDate = d.getFullYear() + "-" + d.getMonth() + "-" + d.getDate() + " " + d.getHours() + ":" + d.getMinutes();

                html += "<div class='result-row' style='padding-bottom:5px; border-bottom: 1px solid #c0c0c0;'>";
                var clickableLink = "<a href='" + results[i].Cells.results[6].Value + "'>" + results[i].Cells.results[3].Value + "</a><br/><span>Type: " + results[i].Cells.results[17].Value + "</span><br/><span>Modified: " + currentDate + "</span>";
                html += clickableLink;
                html += "</div>";
            }
            html += "</div>";

            $(SPSearchResults.element).html(html);
        },

        onError: function (err) {
            $("#searchResults").html("<h3>An error occurred</h3><br/>" + JSON.stringify(err));
        }
    };

    // Call our Init-function
    SPSearchResults.init("#searchResults");

    // Call our Load-function which will post the actual query
    SPSearchResults.load();
}
The aforementioned script shows some simple javascript that will call the Search REST API (the "_api/…" part of the query) and then return the results in our html markup. Simple as that.
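One caveat in the date handling above: JavaScript’s getMonth() is zero-based, and none of the parts are zero-padded, so the rendered "Modified" date can look off by a month. A small formatter (my own helper, not part of any SharePoint API) fixes both issues:

```javascript
// Format a Date as "YYYY-MM-DD HH:mm". getMonth() is zero-based,
// so we add 1; single-digit parts are zero-padded.
function formatModified(d) {
    function pad(n) {
        return n < 10 ? "0" + n : "" + n;
    }
    return d.getFullYear() + "-" + pad(d.getMonth() + 1) + "-" + pad(d.getDate()) +
        " " + pad(d.getHours()) + ":" + pad(d.getMinutes());
}
```

You could then use `formatModified(new Date(results[i].Cells.results[8].Value))` in place of the ad-hoc concatenation in the snippet above.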


By utilizing the REST Search API we can very quickly and easily create an application that searches in SharePoint 2013.

This can be implemented in SharePoint Apps, Sandboxed Solutions or Farm Solutions. Whatever your preferences and requirements, the Search APIs should be easy enough to play around with.


Author: Tobias Zimmergren | | @zimmergren


So, just a simple tip in case anyone bumps into the same issue I did a while back. Going from Beta to RTM, some things changed in the way you retrieve values using REST in the SharePoint 2013 client object model.

Description & Solution

In the older versions of the object model, you could simply use something like this in your REST call:

$.ajax({
    url: SPSearchResults.url,
    method: "GET",
    headers: {
        "accept": "application/json"
    },
    success: SPSearchResults.onSuccess,
    error: SPSearchResults.onError
});

As you can see, the "headers" setting specifies "application/json".

The fix is simply to swap that statement into this:

$.ajax({
    url: SPSearchResults.url,
    method: "GET",
    headers: {
        "accept": "application/json;odata=verbose"
    },
    success: SPSearchResults.onSuccess,
    error: SPSearchResults.onError
});

And that’s a wrap.


I hope this can save someone a few minutes (or more) of debugging when using old example code or reviving older projects. I’ve found that a lot of examples online, based on the beta of SharePoint 2013, use the older version of the headers statement, which inevitably leads to this problem. So with that said, enjoy.

Author: Tobias Zimmergren | | @zimmergren


Recently I’ve had the pleasure (and adventure..) of upgrading a few SharePoint 2010 solutions to SharePoint 2013. One of the things that comes up in literally every project I’m involved in is the ability to quickly and easily change how a list is rendered – more specifically, how the fields in that list should be displayed.

Luckily, in SharePoint 2013 Microsoft has extended SPField with a new property called JSLink, which is a JavaScript Link property. There’s a JSLink property on the SPField class, as well as a "JS Link" property on, for example, List View Web Parts. If we specify this property and point it to a custom JavaScript file, we can have SharePoint render our fields in a certain way. We can also tell, for example, our List View Web Parts to point to a specific JavaScript file, since they’re also extended with the JS Link property.

In this blog post I’ll briefly explain what the "JS Link" for a List View Web Part can look like and how you can set the property using PowerShell, C# and in the UI. I’ll also mention ways to set the SPField JSLink property, should you want to use that.

Final results

If you follow along with this article you should be able to render a similar result to this view in a normal Task List in SharePoint 2013:


You can see that the only modification I’ve made right now is to display an icon indicating the importance of the task: red icons for high priority, and blue and yellow for low and medium.

Since it’s all based on JavaScript and we’re fully in control of the rendering, we could also change the rendering to look something like this, should we want to:


As you may have noticed, I haven’t put a lot of effort into styling these elements – but you could easily apply some nicer styling through the JavaScript, either by hooking up a CSS file or by using inline/embedded styles.

Configuring the JSLink properties

Okay all of what just happened sounds cool and all. But where do I actually configure this property?

Set the JS Link property on a List View Web Part

If you just want to modify an existing list with a custom rendering template, you can specify the JSLink property of any existing list by modifying its Web Part properties and configuring the "JS Link" property, like this:


If you configure the aforementioned property on the List View Web Part your list will automatically load your custom JavaScript file upon rendering.

Set the SPField.JSLink property in the Field XML definition

If you are creating your own field, you can modify the Field XML and have the property set through the definition like this:

<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <!-- The ID below is a placeholder; generate your own GUID. Type="Text" is just an example. -->
  <Field ID="{11111111-2222-3333-4444-555555555555}"
         Type="Text"
         Name="MyAwesomeSampleField"
         DisplayName="My Awesome Sample Field"
         Group="Blog Sample Columns"
         JSLink="/_layouts/15/Zimmergren.JSLinkSample/Awesomeness.js" />
</Elements>

Set the SPField.JSLink property using the Server-Side Object Model

Simply set the SPField.JSLink property like this. Please note that this code was executed from a Console Application, hence the instantiation of a new SPSite object:

using (SPSite site = new SPSite("http://tozit-sp:2015"))
{
    SPWeb web = site.RootWeb;
    SPField taskPriorityField = web.Fields["Priority"];

    taskPriorityField.JSLink = "/_layouts/15/Zimmergren.JSLinkSample/AwesomeFile.js";

    // Persist the change
    taskPriorityField.Update();
}

Set the SPField.JSLink property using PowerShell

If you’re the PowerShell kind of guy or gal (and you should be, if you’re working with SharePoint…), you may find the following simple snippets interesting, as they should come in handy soon enough.

PowerShell: Configure the JSLink property of an SPField

$web = Get-SPWeb http://tozit-sp:2015
$field = $web.Fields["Priority"]
$field.JSLink = "/_layouts/15/Zimmergren.JSLinkSample/Awesomeness.js"
# Persist the change
$field.Update()

PowerShell: Configure the JSLink property of a Web Part

Note that this is what I’ve been doing in the sample code later on – I’m not setting a custom JSLink for the actual SPField, I’m setting it for the List View Web Part.

$web = Get-SPWeb http://tozit-sp:2015

$webPartPage = "/Lists/Sample%20Tasks/AllItems.aspx"
$file = $web.GetFile($webPartPage)
$file.CheckOut()

$webPartManager = $web.GetLimitedWebPartManager($webPartPage, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)

$webpart = $webPartManager.WebParts[0]
$webpart.JSLink = "/_layouts/15/Zimmergren.JSLinkSample/Awesomeness.js"

# Save the Web Part changes and check the page back in
$webPartManager.SaveChanges($webpart)
$file.CheckIn("Awesomeness has been delivered")

As you can see in the PowerShell snippet above, we pick out a specific page (a Web Part page in a list, in my case), grab the first Web Part (since I know I only have one Web Part on my page) and set the JS Link property there. This results in the Web Part getting the proper link set, and it can now utilize the code in your custom JavaScript file to render the results.

So what does the JS Link JavaScript logic look like?

Okay, so we’ve now seen a few ways to modify the JS Link property of a list or a field. But we still haven’t looked at how the actual JavaScript works or what it can look like. So let’s take a quick look at what it could look like for a List View Web Part rendering our items:

// Create a namespace for our functions so we don't collide with anything else
var zimmergrenSample = zimmergrenSample || {};

// Create a function for customizing the Field Rendering of our fields
zimmergrenSample.CustomizeFieldRendering = function () {
    var fieldJsLinkOverride = {};
    fieldJsLinkOverride.Templates = {};

    fieldJsLinkOverride.Templates.Fields = {
        // Make sure the Priority field view gets hooked up to the GetPriorityFieldIcon method defined below
        'Priority': { 'View': zimmergrenSample.GetPriorityFieldIcon }
    };

    // Register the rendering template
    SPClientTemplates.TemplateManager.RegisterTemplateOverrides(fieldJsLinkOverride);
};

// Create a function for getting the Priority Field Icon value (called from the first method)
zimmergrenSample.GetPriorityFieldIcon = function (ctx) {
    var priority = ctx.CurrentItem.Priority;

    // In the following section we simply determine what the rendered html output should be. In my case I'm setting an icon.

    if (priority.indexOf("(1) High") != -1) {
        //return "<div style='background-color: #FFB5B5; width: 100%; display: block; border: 2px solid #DE0000; padding-left: 2px;'>" + priority + "</div>";
        return "<img src='/_layouts/15/images/Zimmergren.JSLinkSample/HighPrioritySmall.png' />&nbsp;" + priority;
    }

    if (priority.indexOf("(2) Normal") != -1) {
        //return "<div style='background-color: #FFFFB5; width: 100%; display: block; border: 2px solid #DEDE00; padding-left: 2px;'>" + priority + "</div>";
        return "<img src='/_layouts/15/images/Zimmergren.JSLinkSample/MediumPrioritySmall.png' />&nbsp;" + priority;
    }

    if (priority.indexOf("(3) Low") != -1) {
        //return "<div style='background-color: #B5BBFF; width: 100%; display: block; border: 2px solid #2500DE; padding-left: 2px;'>" + priority + "</div>";
        return "<img src='/_layouts/15/images/Zimmergren.JSLinkSample/LowPrioritySmall.png' />&nbsp;" + priority;
    }

    return ctx.CurrentItem.Priority;
};

// Call the function.
// We could've used a self-executing function as well but I think this simplifies the example
zimmergrenSample.CustomizeFieldRendering();

With the above script, we’ve simply told our field (Priority) that when it’s rendered, it should format the output HTML according to the logic in my methods. In this case we make a very simple replacement of text with an image to visually indicate the importance of the task.
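If you end up with more priority values, the chain of indexOf checks can be replaced with a small lookup table. This is just a refactoring idea on top of the sample above, reusing the same image paths:

```javascript
// Map the leading "(n)" marker of the Priority value to an icon file.
var priorityIcons = {
    "(1)": "HighPrioritySmall.png",
    "(2)": "MediumPrioritySmall.png",
    "(3)": "LowPrioritySmall.png"
};

// Table-driven variant of the rendering logic: return an icon plus
// the priority text, or fall back to plain text for unknown values.
function getPriorityHtml(priority) {
    var icon = priorityIcons[priority.substring(0, 3)];
    if (!icon) {
        return priority;
    }
    return "<img src='/_layouts/15/images/Zimmergren.JSLinkSample/" + icon + "' />&nbsp;" + priority;
}
```

The 'View' override could then simply be `function (ctx) { return getPriorityHtml(ctx.CurrentItem.Priority); }`, and adding a new priority level becomes a one-line change to the table.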

For examples of how you can construct your SPField.JSLink JavaScript, head on over to Dave Mann’s blog and check it out. Great info!


With a few simple steps (essentially just a JavaScript file and a property on the LVWP or Field) we’ve changed how a list is rendering its data. I’d say that the sky is the limit and I’ve already had one of my clients implement a solution using a custom JS Link to format a set of specific lists they have. What’s even better is that it’s so simple to do, we don’t even have to do a deployment of a new package if we don’t want to.

The reason I’ve chosen to do a Farm Solution (hence the /_layouts paths you see in the URLs) is that most of my clients still run Farm Solutions – and will be running them for a long time to come. It also wraps everything up in a nice package for us to deploy globally in the farm; then a quick PowerShell script changes the properties of the LVWPs we want to modify, and that’ll be that. Easy as 1-2-3.