Posts Tagged ‘SharePoint 2010’

Author: Tobias Zimmergren
www.zimmergren.net | www.tozit.com | www.sharepointdiscussions.com | @zimmergren

Introduction

Every cycle of SharePoint comes with challenges around upgrades and migrations. In one of my current projects I’ve been part of designing an iterative upgrade process – as I like to call it – which means we’ll be upgrading our Farm (all content databases) from SharePoint 2010 to SharePoint 2013 every week. Yes, that’s right – we upgrade SharePoint 2010 to SharePoint 2013 every single week with the latest content from the production environments. This of course happens on a separate SharePoint 2013 farm set up specifically for this purpose.

In this article I’ll talk about the benefits of my “Iterative Upgrade Process” and what it means in terms of benefits for the teams and people involved with your Intranet, Extranet, Public site or whatever you are using your SharePoint farm for.

Please do note that this is not an article describing the steps to upgrade your SharePoint environment – this is a process for iterative development and iterative testing in environments that we tear down, build up and upgrade from SharePoint 2010 to SharePoint 2013 every week.

Background: Everyone is affected by an upgrade or migration

It’s not uncommon to bump into a lot of problems while upgrading your farms from one version to the other. Common problems include customizations, faulty configurations and general bad practices implemented in the original farm. But for most organizations an upgrade doesn’t “just happen” overnight with everything working flawlessly – on the contrary, there’s always a bunch of problems brought to light that need to be taken care of.

Working a lot on SharePoint Intranets like I currently do, we often see problems before, during and after upgrades. Not only technical issues that we can handle, but issues with people in the organization not reaching the right information or not being able to perform their daily tasks. This can have very complicated impacts on the organization if the migration fails to run smoothly and everything isn’t up and running after the service windows you’ve specified.

The end-user is affected in terms of downtime and possibly being hindered from performing their tasks, which in the end hurts the organization, since these are the people who keep the organization running! The IT departments (or technical folks in your organization involved with SharePoint) may be affected if the migration or upgrade doesn’t go as planned. The business as a whole relies on the system functioning, and for every minute or hour that the systems aren’t fully available the organization may lose both time and money.

So in order to minimize any pain in upgrading from one version of SharePoint to another, we need to consider the implications of a troublesome upgrade. With the iterative upgrade process we’ve got in place right now at one of my clients you can test and verify all your changes and customizations and whatever you want to assure the quality of – over and over again before the real deal.

Implementation: How am I doing this?

Boiling down the steps included in our iterative upgrade process gives something similar to this:

image

In a nutshell, this is what the process looks like from a bird’s-eye perspective, even though some of the steps require an extensive amount of preparation work and time to get done. Below is an explanation of all these steps in more detail, to give you an understanding of what this really means.

Setup SP 2013 Farm

The very first step is to set up and configure the SharePoint 2013 Farm where our upgraded content will eventually land. In our case we’ve set this up as a one-off configuration, re-using the 2013 farm on every new iteration. You could – as an alternative – argue that it would be beneficial to tear down the entire farm and set it up again every time. It would be theoretically possible, but in our specific case it simply doesn’t work that easily – too many dependencies rely on things outside of my and my team’s control.

Uninstall any custom solutions

This step is of course only necessary if you’ve already upgraded at least once in the new farm. By the time you’ve scheduled your next iterative upgrade, you’ll need to uninstall any and all old solutions in order to clean up the farm a bit before we proceed.
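A minimal PowerShell sketch of what this cleanup can look like (the snap-in load and the retraction loop are my own scaffolding, not lifted from our actual scripts):

```powershell
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Retract and delete every deployed farm solution before the next iteration.
foreach ($solution in Get-SPSolution) {
    if ($solution.Deployed) {
        if ($solution.ContainsWebApplicationResource) {
            Uninstall-SPSolution -Identity $solution -AllWebApplications -Confirm:$false
        } else {
            Uninstall-SPSolution -Identity $solution -Confirm:$false
        }
        # Wait for the retraction timer job to finish before removing the package
        while ($solution.JobExists()) { Start-Sleep -Seconds 5 }
    }
    Remove-SPSolution -Identity $solution -Confirm:$false
}
```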

Remove any content databases

Again, this step is only necessary if you’ve already upgraded at least once in the new farm. If you have, there’ll be upgraded content databases that you need to get rid of before we move on to the next step. We’re doing this with the PowerShell cmdlet Remove-SPContentDatabase.
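A sketch of that step (the web application URL is a placeholder). Note that Remove-SPContentDatabase also drops the database from SQL Server, which is fine here since FlexClone delivers fresh copies anyway:

```powershell
# Detach and delete every previously upgraded content database
Get-SPContentDatabase -WebApplication "http://intranet.contoso.com" |
    Remove-SPContentDatabase -Confirm:$false
```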

Deploy SP 2010 Solutions

Deploy your old 2010 solutions. The reason we want to do this is that when we later perform the actual mount of the databases, it’s pretty nice if the mounting process can find the references to the features, web parts and any other resources within those solutions. This is a temporary deployment and the 2010 packages will soon be removed again.
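As a rough sketch, assuming the 2010 WSPs sit in a local folder (the path is a placeholder, and switches like -GACDeployment and -AllWebApplications depend on what each package actually contains):

```powershell
# Add and deploy the old 2010 packages so the upcoming mount can resolve
# feature and web part references. -CompatibilityLevel 14 targets the 2010 (14) hive.
Get-ChildItem "D:\Packages\SP2010\*.wsp" | ForEach-Object {
    $solution = Add-SPSolution -LiteralPath $_.FullName
    Install-SPSolution -Identity $solution -GACDeployment `
        -CompatibilityLevel 14 -AllWebApplications
}
```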

Copy fresh databases

The next step is to ensure that the SQL Server in our new 2013 farm contains the latest content databases from the SharePoint 2010 farm. This is where FlexClone comes in (described in more detail further down in this article). FlexClone makes virtual copies which are true clones, without demanding additional storage space. Pow! Totally awesome.

Attach databases

After the databases are copied to the SQL Server, we’ll have to attach them to SQL Server as you would normally do.
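If you script the attach step as well, it might look something like this (a sketch using Invoke-Sqlcmd from the SQL Server PowerShell tools; server, database and file names are placeholders):

```powershell
Invoke-Sqlcmd -ServerInstance "SP2013SQL" -Query @"
CREATE DATABASE [WSS_Content_Intranet]
ON (FILENAME = N'E:\Clones\WSS_Content_Intranet.mdf'),
   (FILENAME = N'E:\Clones\WSS_Content_Intranet_log.ldf')
FOR ATTACH;
"@
```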

Mount databases

The next step is where we mount the actual databases to SharePoint. The databases are still in SharePoint 2010 mode, since the copies of our databases come from the SharePoint 2010 environment. This is why we need to have our 2010 WSP solutions in place before we perform the mount – otherwise reading the mount logs will be… well, not so fun ;)

We do this with the PowerShell cmdlet Mount-SPContentDatabase.
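For example (database, server and URL names are placeholders):

```powershell
# Mounting attaches the 2010-mode database to the 2013 web application and
# runs a schema upgrade on it; review the upgrade log it produces afterwards.
Mount-SPContentDatabase -Name "WSS_Content_Intranet" `
    -DatabaseServer "SP2013SQL" `
    -WebApplication "http://intranet.contoso.com"
```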

Uninstall SP 2010 Solutions

When the mounting is done, we’ll need to uninstall the 2010 version of the old solutions and move on to the next step.

Deploy upgraded SP 2013 Solutions

Yay, finally we’re at a much cooler step – deploying SharePoint 2013 solutions. So, to give a little background on what these solutions should be:

You should’ve already upgraded your SharePoint projects to SharePoint 2013 solutions, have them ready to go and use in this step.

Notes:  This is probably the most time-consuming step if you have custom solutions. Anything you’ve built in SharePoint 2010 and customized there needs to be upgraded to SharePoint 2013 and work there as well. Good thing we’ve got an iterative upgrade process so we can fine-tune this iteratively every day and just hit a button to re-upgrade the farm with the latest builds in our test- and pre-production environments. Yay!

Upgrade-SPSite with all Site Collections

Once the new and freshly upgraded 2013 packages have been deployed, we’ll continue by upgrading the actual Site Collections from SharePoint 2010 mode to SharePoint 2013 mode.

We’ll be using the PowerShell cmdlet Upgrade-SPSite for every one of our Site Collections.
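A minimal sketch of that loop (the -Unthrottled switch skips the site upgrade queue throttling, which suits a controlled test farm):

```powershell
# Upgrade every site collection still in 2010 (14) mode to 2013 (15) mode
Get-SPSite -Limit All -CompatibilityLevel 14 | ForEach-Object {
    Upgrade-SPSite $_ -VersionUpgrade -Unthrottled
}
```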

Misc automated configuration scripts

We’ve got a lot of custom scripts that run after the upgrade, as part of the finalization of the actual upgrade. This includes custom re-branding scripts, re-creation of My Sites and moving content between old and new My Sites, custom scripts to disable and remove artifacts that aren’t used in SharePoint 2013 projects and solutions anymore, modifications to removed or altered Content Types, and so on. The list can be made long – if you’re reading this you’ve probably already understood that each scenario is unique, but this process can be applied to most scenarios with a tweak here and there.

Tools: What tools are we using to make this happen?

Obviously things don’t get done by themselves, so I’ve automated much of the process with various tools and techniques, described below.

Deployment Automation with Team City

There are tons of ways to automate things in any ALM cycle. Be it a development lifecycle or an infrastructural upgrade lifecycle like this one – something to automate the process will be your best bet. Since we’re doing this every week, and the process in itself is pretty complex with plenty of steps that need to be done properly, I’ve chosen to go with Team City for all of the automation work.

I’ve gotten the question why use Team City instead of TFS Build or Jenkins or any other available build automation tools. Simply put: Team City is free for up to 20 configurations, easy (very very easy) to configure, works with multiple data sources and repositories and it just works – every time. But that’s a discussion for another day.

Database copies with Flexclone

In order to easily get set up with the databases in our environments, we’ve been using Netapp’s Flexclone software very successfully the last year. As quoted from their own website:

NetApp® FlexClone® technology instantly replicates data volumes and datasets as transparent, virtual copies—true clones—without compromising performance or demanding additional storage space.

So in essence, FlexClone allows us to replace all of the databases in our test and pre-production environments with (almost) a single click, and get real copies of the actual production environments in a matter of minutes. There’s no denying that this is awesomeness in its true form.

Iterative code upgrades with Visual Studio 2013

In order to maintain and upgrade the new codebase (upgraded from SharePoint 2010), we’re using Visual Studio 2013 like most professional Microsoft-related developers do today. You can use VS 2012 as well, should you like – but do try out 2013 if you can, it’s multiple times faster than previous versions of Visual Studio.

I have pushed hard for implementing a real ALM process in the team, and we’ve finally got that in place and it’s working pretty nicely right now. We’re using TeamCity to automate builds with continuous integration, nightly builds and scheduled and on-demand deployments to our environments. I will cover code automation more thoroughly in another post in the future, as it would be too much info to cover in this single post.

Summary

So this is a process we follow every week. Once a week I tear down the entire SP 2013 test farm and rig up a new snapshot of the databases on the SQL environment. Then I re-iterate this upgrade process (Team City, PowerShell and PowerShell Remoting to the rescue). This means we can literally try what the real upgrade will be like once we get there. Every week. Also we can have a nice agile iterative way of handling bugs that appear in the environments.

Oh yeah, should we break something – we click a button or two and we’ve got a freshly upgraded environment with the latest builds from the SP 2013 dev rigs.

It simplifies the overall process:

  • When time comes for the real upgrade, everything including upgraded code base and automated upgrade scripts is in place!
  • Find and report errors early in the upgrade process of your project
  • Find compatibility errors in code and solutions
  • Find out what will upgrade, and what will not, before it’s too late
  • Be confident that once we reach the point of upgrade, we’ve done it so many times already that we know what might go wrong
  • The Product Owners, Project Managers, Testers and any other involved people have already verified the state of the system, so once we hit the button in the Production environments – we’re pretty much in an “accepted release” state already.

I hope you enjoyed this little read about my iterative upgrade process. It’s pretty darn good if you ask me – it requires some time to set up initially, but in larger projects it’s definitely worth it!

Enjoy.

Sweden SharePoint User Group (SSUG) – Updates and information

September 21st, 2012 by Tobias Zimmergren

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

Since we kicked off the Sweden SharePoint User Group (or SSUG) a couple of years back, we’ve managed to keep a great flow and balance in keeping free meetings and sessions throughout the Swedish SharePoint community. Recently I launched (actually re-launched) the SSUG group down in the southern parts of Sweden – and with great success. It’s awesome to see what a huge interest there is for SharePoint in general and our user group in particular.

Popular: Coming events

I’ve re-mapped the domain www.ssug.se to point to our EventBrite page, which is where you’ll find any and all upcoming events with the SSUG group in Sweden.

Stockholm – September 24th, 2012

There are only 50 seats. PLEASE – if you cannot attend and have registered, please unregister. It’s not fair to our host or the people on the waitlist if you don’t show up!

This event is SOLD OUT. 50 out of 50 attendees are signed up, and unfortunately we don’t have more room at this event. You can sign up on the waiting list at the event booking form.

  • 17:30 Registration – you need to register here on Eventbrite to get a seat
  • 18:00 Wictor Wilén introduces SSUG and kicks off this season
  • 18:05 Steria welcomes everyone
  • 18:10 Matthias Einig (Steria) will present Application Life Cycle Management in SharePoint
  • 19:00 Beer and something to chew on
  • 19:15 Wictor Wilén (Connecta) will give you an introduction to all the new and shiny stuff in SharePoint 2013
  • 20:00 All good things must eventually end, but we’ll wrap up with a quick Q&A session about SharePoint 2013

Malmö – October 4th, 2012

There are only 50 seats. PLEASE – if you cannot attend and have registered, please unregister. It’s not fair to our host or the people on the waitlist if you don’t show up!

This event is SOLD OUT. 50 out of 50 attendees are signed up, and unfortunately we don’t have more room at this event. You can sign up on the waiting list at the event booking form.

  • 17:30 – 18:00: Mingle and meetup
  • 18:00 : Tobias Zimmergren (TOZIT AB) introduces SSUG and kicks the meeting off
  • 18:05 : Carl-Johan Tiréus (Connecta) welcomes us to Connecta
  • 18:15 : One year with SharePoint – Alfa Laval presents their experiences and lessons learned from introducing SharePoint in their organization
  • 19:00 : Food & Beer!
  • 19:15 : Wictor Wilén (Connecta) talks about the most important news in SharePoint 2013
  • 20:00 : The End. Who’s up for a SharePint?

Where can you find us?

Upcoming events: www.ssug.se

Facebook Group: www.facebook.com/SharePointSweden

Introduction


During the last few years, we’ve been enabling our clients with enhanced discussion forum solutions for SharePoint 2007 and SharePoint 2010 in their intranets, extranets and public-facing web sites. Given the great success of the last few years’ implementations of discussion solutions with our clients, we have now dedicated an entirely new initiative to managing the discussion solution suite.

Head on over to www.sharepointdiscussions.com to take a closer look!

We’re live!

We’ve successfully launched a new site called SharePointDiscussions.com which will now host the content of our products and services related to our discussion forum solutions and software. All new features and updates to the solutions will be accessible from this location and any requests related to these products and solutions can be relayed to the support team at support {at} sharepointdiscussions.com

Highlights

Some of the things I’d love to highlight are:

- Competitive pricing!
    – We can offer a single server license for only $399 per server
    – You purchase it once, and then you’re done. No annual fees!
- Language support!
    – We support multiple languages, including these:
    – English – Swedish – Danish – German – Greek – Farsi – Norwegian – Dutch
    – If you need your language localized, get in touch
- Features
    – Mark posts as answers
    – Mark posts as helpful
    – Mark posts as abusive
    – Collect user statistics
    – Earn points for posts, helpful posts and answers
    – Categorize discussions in different forums
    – Multiple threads in each forum
    – Create posts and threads easily
    – Delete entire threads easily, or single posts
    – RSS: Subscribe to new threads in a forum
    – RSS: Subscribe to new posts in a thread
    – Additional free Web Part: Recent Posts
    – Additional free Web Part: Top Viewer
        – Display users with most total points
        – Display users with most total posts
        – Display users with most answered posts
        – Display users with most helpful posts
    – Additional free Web Part: Forum Search

… and much more.

Summary

It’s about time to continue writing on our SharePoint 2013 versions of our Apps, products and solutions – so with that, I thank you for your time to read this!

Enjoy!

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

A while back an announcement was made that TFSPreview.com had been made available for general testing. Various bloggers at Microsoft put an invitation token in their MSDN blogs so everyone can have a go at it.

In this article series we’ll take a very quick look at what the hosted TFS solution by Microsoft looks like.

Articles currently in the series:

Steps to hook up a project to your new build server

This article obviously assumes that you’ve already followed along with the previous articles and hooked up a build configuration for your TFSpreview account. Now we’ll take a look at how we can get our projects hooked up in a CI/automated build scenario with Team Foundation Services.

The steps from this point onwards are basically the same as if it would be an on-premise TFS server in your own domain. Your build server is configured, your code is hosted in TFS and all you’ll need to do is connect your project to the actual TFS and then create a new build definition.

Create a new project (or connect an existing one) and connect to TFS

We’ll start from the beginning and create a new Visual Studio 2010 project (in my case it’ll be an Empty SharePoint Project), and remember to tick the Checkbox "Add to source control":
image

Make sure that the project is connected to your TFS server, check in the source and we can get started:
image

Create a new build definition

At this point (if you’ve followed the articles in this article series) you should have a TFS server, a connection from Team Explorer to your TFS server and also a new project hooked up in your repository. Now it’s time to create our first build definition so we can automate the builds and deployments.

Start by navigating to your Team Explorer and right-click on Builds and then click the "New Build Definition…":
image

This will give you the following new dialog where you can specify details for your build:
image

Move on to the "Trigger" tab. In my case I want to enable CI (Continous Integration) for my project:
image

Move on to the "Workspace" tab. In my case I’ll leave the Source Control Folder as the default root as seen below. You can choose to specify a specific TFS project if you don’t want to include all.
image

Move on to the "Build Defaults" tab. You’ll need to specify a build controller (you should have one here since we created one in the previous article). You will also need to specify a drop folder, where your binaries are going to be delivered upon the build: 
image

Move on to the "Process" tab. This is where things get really interesting. You can from this dialog specify a variety of different variables for your project when it builds. I’m not going to dig into details here because my good mate Chris O’Brien have covered all of that in his article series about automated builds.
image

Save the build definition and validate that it appears in the Team Explorer:
image

Test the build configuration

In order to validate that our setup now works with TFSpreview.com and our own build server and to validate our newly created build definition, simply make some changes to your project and check it in and have it automatically schedule a new build (We chose Continuous Integration, which will build on each check in). You can see that the build is now scheduled and currently running:
image

And after a while you can validate that it is Completed:
image

The final validation is of course to see the drop folder that we specified and make sure that it now contains our newly built sources:
image

Voila. Build seems to be working.

Summary

This post was intended to give you an overview over the simplicity of creating a simple build definition and getting started with automated builds in TFSpreview (hosted Microsoft TFS). Pretty neat and it seems to be working just the way we want.

Naturally, there are some obvious questions, like:

  • What if I want to output my .wsp files as well?
  • What if I want to execute a specific script upon the execution of the build so I can automate test-deployments?
  • Etc. etc.

My first recommendation is to visit Chris O’Brien and read all the posts in his CI/automation series which is simply amazing.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

A while back an announcement was made that TFSPreview.com had been made available for general testing. Various bloggers at Microsoft put an invitation token in their MSDN blogs so everyone can have a go at it.

In this article series we’ll take a very quick look at what the hosted TFS solution by Microsoft looks like.

Articles currently in the series:


Connect Visual Studio 2010 to your new hosted team project

In order to be able to connect to the hosted TFSPreview team project, you’ll need to comply with the prerequisites I’m listing here.

Prerequisites

Hook up Visual Studio to your new repository/project

Alright, if you’ve downloaded and installed KB2581206 (which means you’re running VS 2010 SP1 already) you are ready to connect. The procedure for connecting to the hosted TFS service is basically the same as connecting to any other TFS repository, which is easy and awesome.

In Visual Studio 2010 SP1, simply make these smooth ninja moves and you’re done:
image

Make sure to fetch the URL of your account (As seen in your dashboard, like depicted below):
image

Enter this URL in the Visual Studio 2010 dialogs and we’re ready to kick off:
image

It’ll ask you for your credentials which you need to use to verify your account details:
image

You should now be authenticated and your repository should be available:
image

You’ll go ahead as you normally do and choose the projects that interests you and then you’re basically done:
image

Your Team Explorer should contain your TFS project and you should be able to work with it as you normally would from Visual Studio 2010:
image

This means you’ve got all of your standard tasks and operations available straight from VS 2010 (So you don’t have to go to the website to make changes …):
image

Summary

Easy enough. As soon as you’ve downloaded the required tooling to get connected, you can hook up your new cloud-hosted team project in Visual Studio 2010 without any problems. Give it a spin, it flows quite nicely!

Enjoy.

Author: Tobias Zimmergren | www.tozit.com | @zimmergren

Introduction

Sometimes when you’re in a development project you can feel the pain of debugging. If there’s a lot of code floating around, it may be hard to sort out the method calls and how they depend on each other in a very complex solution. To ease the task of debugging there’s a great VS 2010 plugin called Debugger Canvas, which will help you sort out a lot of the hassle while debugging.

In this article we’ll just take a quick look at what Debugger Canvas is and how it can assist us in our daily debugging adventures.

Getting Started with Debugger Canvas

First, you obviously need to download the extension for Visual Studio 2010, available here.

Please note: The Debugger Canvas Extensions are only available for VS 2010 Ultimate

Debugger Canvas in Action

When you’ve installed the extension, there are a few new opportunities when debugging. Your new “F5” experience will be based on the new Debugger Canvas UI instead of the traditional debugging experience, which means you’ll be able to follow the calls within your code more easily, like this:

image

When you step into the code deeper, you’ll see how the calls were made quite easily:

image

Summary

You should definitely take a look at Debugger Canvas if you haven’t already as it’ll be most helpful for you in your development adventures.

Get a better overview here and watch the introductory video!

Enjoy.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

In most of my recent projects I’ve been required to hook up some custom functionality and add custom forms, pages and Web Parts. Some of the forms and pages I designed needed to be launched from the Ribbon menu, which of course is contextual. This basically means that when you visit a specific list which inherits from a specific content type, we can choose to display our custom Ribbon controls. One of the most common requirements I bumped into was having some kind of conditional check whether to enable or disable the button based on a set of conditions.

In your Ribbon XML for the CommandUIHandler there’s an attribute called “EnabledScript”, which lets you supply a validation script that determines whether or not the ribbon button should be enabled. In my case I need to enable the custom Ribbon controls when exactly one item is selected, and otherwise disable them.

Use the following snippet, which relies on the Selection.getSelectedItems method from the SP.ListOperation namespace:

<CommandUIHandler
  Command="Ribbon.Awesome.NavButton_CMD"
  CommandAction="javascript:alert('My Awesome Button Was Clicked');"
  EnabledScript="javascript:SP.ListOperation.Selection.getSelectedItems().length == 1;" />

It’s really only the last line that is of interest here since that’s where the script magic happens to determine if the control should be enabled or not.

MSDN has some nice samples in one of its articles over here.

Results

If you select one (and only one) item in the list, your custom command will be enabled:

image

If you select no items, or more than one, the command will be disabled as such:

imageimage

Summary

I know many people have been struggling with the Ribbon and making it behave. In this article I simply wanted to highlight one of the very common tasks I’ve seen developers looking for and trying to achieve in some of the last few projects I’ve been involved in.

Since my awesome mate Wictor covered a bunch of awesome posts about the Ribbon, I’m not going to dive into any more details than so :-)

Enjoy.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

SharePoint 2010 developing for performance article series:
In this series of articles I will briefly introduce you to some key concepts when it comes to developing for performance in our SharePoint 2010 applications.

Related articles in this article series

Part 8 (this article):
As most of you know, in any ASP.NET project (SharePoint included) there’s something known as ViewState. With the ViewState we can persist the state of properties in the page controls and objects across the postbacks that happen in our solutions. If we are not careful and don’t plan the usage of ViewState properly, we can end up with a performance hit that degrades the page rendering process.

In this article we will take a look at what impact the ViewState can have if we “forget about it”, and of course how we can prevent our pages from being unnecessarily big in page size.

ViewState in our SharePoint projects

If you’ve been developing SharePoint projects of any type, you’ve most certainly been doing some kind of ASP.NET UI development as well (Application Pages, Web Parts, User Controls and so on) that inherits the ASP.NET capabilities and hence the ViewState.

What you should know as an ASP.NET developer: be sure to know when you should, and shouldn’t, use the ViewState. You can disable the ViewState for certain components, or for the entire page.
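For example, disabling it per control or per page looks like this (a minimal sketch; the GridView is just a stand-in for any heavy data-bound control):

```aspx
<%-- Per control: --%>
<asp:GridView ID="SampleGrid" runat="server" EnableViewState="false" />

<%-- Per page, via the @Page directive: --%>
<%@ Page Language="C#" EnableViewState="false" %>
```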

Performance considerations

With an increased ViewState you get an increased page size, which in turn means that the page will take longer to render.

We’ll take a quick look at how the performance can differ when we’re using ViewState and when we’re disabling the ViewState in a SharePoint project.

Taking a look: ViewState

In order for us to really understand what impact the ViewState can have on the page rendering process, we’ll dig into the details of what the ViewState looks like. To do this we can check out the source of the rendered page, and of course we’ll rely on our faithful squire: Fiddler2.

Before we start digging with Fiddler2, we can simply check the page source of any given page being rendered, and we’ll most likely find a heap of ViewState madness going on in there.

In my sample application – which consists only of a simple GridView control and a few rows of data – the ViewState is turned into this:

 <input type= "hidden "

 

  name= "__VIEWSTATE "

 

  id= "__VIEWSTATE "

 

  value= "/wEPDwULLTE5NjIxMzI1MDIPZBYCZg9kFgICAQ9kFgQCAQ9kFgICDw9kFgJmD2QWAgIBD

 

  w8WAh4HVmlzaWJsZWdkZAIDD2QWBgIRD2QWBGYPZBYEAgEPZBYCZg9kFgQCAg9kFgYCAQ8WAh8Aa

 

  GQCAw8WCB4TQ2xpZW50T25DbGlja1NjcmlwdAVdamF2YVNjcmlwdDpDb3JlSW52b2tlKCdUYWtlT

 

  2ZmbGluZVRvQ2xpZW50UmVhbCcsMSwgMSwgJ2h0dHA6XHUwMDJmXHUwMDJmc3BmJywgLTEsIC0xL

 

  CAnJywgJycpHhhDbGllbnRPbkNsaWNrTmF2aWdhdGVVcmxkHihDbGllbnRPbkNsaWNrU2NyaXB0Q
  29udGFpbmluZ1ByZWZpeGVkVXJsZB4MSGlkZGVuU2NyaXB0BSFUYWtlT2ZmbGluZURpc2FibGVkK
  DEsIDEsIC0xLCAtMSlkAhUPFgIfAGhkAgMPDxYKHglBY2Nlc3NLZXkFAS8eD0Fycm93SW1hZ2VXa
  WR0aAIFHhBBcnJvd0ltYWdlSGVpZ2h0AgMeEUFycm93SW1hZ2VPZmZzZXRYZh4RQXJyb3dJbWFnZ
  U9mZnNldFkC6wNkZAIDD2QWAgIBD2QWAgIDD2QWAgIBDzwrAAUBAA8WAh4PU2l0ZU1hcFByb3ZpZ
  GVyBRFTUFNpdGVNYXBQcm92aWRlcmRkAgEPZBYCAgUPZBYCAgEPEBYCHwBoZBQrAQBkAjMPZBYCA
  gcPZBYCAgEPDxYCHwBoZBYCAgMPZBYCAgMPZBYCAgEPPCsACQEADxYCHg1OZXZlckV4cGFuZGVkZ
  2RkAkcPZBYCAgEPZBYCAgEPPCsADQIADxYEHgtfIURhdGFCb3VuZGceC18hSXRlbUNvdW50AukHZ
  AEQFgNmAgECAhYDPCsABQEAFgQeCkhlYWRlclRleHQFAklEHglEYXRhRmllbGQFAklEPCsABQEAF
  gQfDgUFVGl0bGUfDwUFVGl0bGU8KwAFAQAWBB8OBQtEZXNjcmlwdGlvbh8PBQtEZXNjcmlwdGlvb
  hYDZmZmFgJmD2QWNAIBD2QWBmYPDxYCHgRUZXh0BQEwZGQCAQ8PFgIfEAUNU2FtcGxlIEl0ZW0gM
  GRkAgIPDxYCHxAFIVppbW1lcmdyZW4ncyBQZXJmb3JtYW5jZSBTYW1wbGUgMGRkAgIPZBYGZg8PF
  gIfEAUBMWRkAgEPDxYCHxAFDVNhbXBsZSBJdGVtIDFkZAICDw8WAh8QBSFaaW1tZXJncmVuJ3MgU
  GVyZm9ybWFuY2UgU2FtcGxlIDFkZAIDD2QWBmYPDxYCHxAFATJkZAIBDw8WAh8QBQ1TYW1wbGUgS
  XRlbSAyZGQCAg8PFgIfEAUhWmltbWVyZ3JlbidzIFBlcmZvcm1hbmNlIFNhbXBsZSAyZGQCBA9kF
  gZmDw8WAh8QBQEzZGQCAQ8PFgIfEAUNU2FtcGxlIEl0ZW0gM2RkAgIPDxYCHxAFIVppbW1lcmdyZ
  W4ncyBQZXJmb3JtYW5jZSBTYW1wbGUgM2RkAgUPZBYGZg8PFgIfEAUBNGRkAgEPDxYCHxAFDVNhb
  XBsZSBJdGVtIDRkZAICDw8WAh8QBSFaaW1tZXJncmVuJ3MgUGVyZm9ybWFuY2UgU2FtcGxlIDRkZ
  AIGD2QWBmYPDxYCHxAFATVkZAIBDw8WAh8QBQ1TYW1wbGUgSXRlbSA1ZGQCAg8PFgIfEAUhWmltb
  WVyZ3JlbidzIFBlcmZvcm1hbmNlIFNhbXBsZSA1ZGQCBw9kFgZmDw8WAh8QBQE2ZGQCAQ8PFgIfE
  AUNU2FtcGxlIEl0ZW0gNmRkAgIPDxYCHxAFIVppbW1lcmdyZW4ncyBQZXJmb3JtYW5jZSBTYW1wb
  GUgNmRkAggPZBYGZg8PFgIfEAUBN2RkAgEPDxYCHxAFDVNhbXBsZSBJdGVtIDdkZAICDw8WAh8QB
  SFaaW1tZXJncmVuJ3MgUGVyZm9ybWFuY2UgU2FtcGxlIDdkZAIJD2QWBmYPDxYCHxAFAThkZAIBD
  w8WAh8QBQ1TYW1wbGUgSXRlbSA4ZGQCAg8PFgIfEAUhWmltbWVyZ3JlbidzIFBlcmZvcm1hbmNlI
  FNhbXBsZSA4ZGQCCg9kFgZmDw8WAh8QBQE5ZGQCAQ8PFgIfEAUNU2FtcGxlIEl0ZW0gOWRkAgIPD
  xYCHxAFIVppbW1lcmdyZW4ncyBQZXJmb3JtYW5jZSBTYW1wbGUgOWRkAgsPZBYGZg8PFgIfEAUCM
  TBkZAIBDw8WAh8QBQ5TYW1wbGUgSXRlbSAxMGRkAgIPDxYCHxAFIlppbW1lcmdyZW4ncyBQZXJmb
  3JtYW5jZSBTYW1wbGUgMTBkZAIMD2QWBmYPDxYCHxAFAjExZGQCAQ8PFgIfEAUOU2FtcGxlIEl0Z
  W0gMTFkZAICDw8WAh8QBSJaaW1tZXJncmVuJ3MgUGVyZm9ybWFuY2UgU2FtcGxlIDExZGQCDQ9kF
  gZmDw8WAh8QBQIxMmRkAgEPDxYCHxAFDlNhbXBsZSBJdGVtIDEyZGQCAg8PFgIfEAUiWmltbWVyZ
  3JlbidzIFBlcmZvcm1hbmNlIFNhbXBsZSAxMmRkAg4PZBYGZg8PFgIfEAUCMTNkZAIBDw8WAh8QB
  Q5TYW1wbGUgSXRlbSAxM2RkAgIPDxYCHxAFIlppbW1lcmdyZW4ncyBQZXJmb3JtYW5jZSBTYW1wb
  GUgMTNkZAIPD2QWBmYPDxYCHxAFAjE0ZGQCAQ8PFgIfEAUOU2FtcGxlIEl0ZW0gMTRkZAICDw8WA
  h8QBSJaaW1tZXJncmVuJ3MgUGVyZm9ybWFuY2UgU2FtcGxlIDE0ZGQCEA9kFgZmDw8WAh8QBQIxN
  WRkAgEPDxYCHxAFDlNhbXBsZSBJdGVtIDE1ZGQCAg8PFgIfEAUiWmltbWVyZ3JlbidzIFBlcmZvc
  m1hbmNlIFNhbXBsZSAxNWRkAhEPZBYGZg8PFgIfEAUCMTZkZAIBDw8WAh8QBQ5TYW1wbGUgSXRlb
  SAxNmRkAgIPDxYCHxAFIlppbW1lcmdyZW4ncyBQZXJmb3JtYW5jZSBTYW1wbGUgMTZkZAISD2QWB
  mYPDxYCHxAFAjE3ZGQCAQ8PFgIfEAUOU2FtcGxlIEl0ZW0gMTdkZAICDw8WAh8QBSJaaW1tZXJnc
  mVuJ3MgUGVyZm9ybWFuY2UgU2FtcGxlIDE3ZGQCEw9kFgZmDw8WAh8QBQIxOGRkAgEPDxYCHxAFD
  lNhbXBsZSBJdGVtIDE4ZGQCAg8PFgIfEAUiWmltbWVyZ3JlbidzIFBlcmZvcm1hbmNlIFNhbXBsZ
  SAxOGRkAhQPZBYGZg8PFgIfEAUCMTlkZAIBDw8WAh8QBQ5TYW1wbGUgSXRlbSAxOWRkAgIPDxYCH
  xAFIlppbW1lcmdyZW4ncyBQZXJmb3JtYW5jZSBTYW1wbGUgMTlkZAIVD2QWBmYPDxYCHxAFAjIwZ
  GQCAQ8PFgIfEAUOU2FtcGxlIEl0ZW0gMjBkZAICDw8WAh8QBSJaaW1tZXJncmVuJ3MgUGVyZm9yb
  WFuY2UgU2FtcGxlIDIwZGQCFg9kFgZmDw8WAh8QBQIyMWRkAgEPDxYCHxAFDlNhbXBsZSBJdGVtI
  DIxZGQCAg8PFgIfEAUiWmltbWVyZ3JlbidzIFBlcmZvcm1hbmNlIFNhbXBsZSAyMWRkAhcPZBYGZ
  g8PFgIfEAUCMjJkZAIBDw8WAh8QBQ5TYW1wbGUgSXRlbSAyMmRkAgIPDxYCHxAFIlppbW1lcmdyZ
  W4ncyBQZXJmb3JtYW5jZSBTYW1wbGUgMjJkZAIYD2QWBmYPDxYCHxAFAjIzZGQCAQ8PFgIfEAUOU
  2FtcGxlIEl0ZW0gMjNkZAICDw8WAh8QBSJaaW1tZXJncmVuJ3MgUGVyZm9ybWFuY2UgU2FtcGxlI
  DIzZGQCGQ9kFgZmDw8WAh8QBQIyNGRkAgEPDxYCHxAFDlNhbXBsZSBJdGVtIDI0ZGQCAg8PFgIfE
  AUiWmltbWVyZ3JlbidzIFBlcmZvcm1hbmNlIFNhbXBsZSAyNGRkAhoPDxYCHwBoZGQYAgUfY3RsM
  DAkUGxhY2VIb2xkZXJNYWluJEdyaWRWaWV3MQ88KwAKAQgCKWQFR2N0bDAwJFBsYWNlSG9sZGVyV
  G9wTmF2QmFyJFBsYWNlSG9sZGVySG9yaXpvbnRhbE5hdiRUb3BOYXZpZ2F0aW9uTWVudVY0Dw9kB
  QRIb21lZGnihW5zRhNmmnQef2E5KXJlKgIU" />

 

If you compare the aforementioned ViewState with the very same page but with the ViewState disabled, it would look like this:

 <input type="hidden"
   name="__VIEWSTATE"
   id="__VIEWSTATE"
   value="/wEPDwULLTE5NjIxMzI1MDJkGAIFH2N0bDAwJFBsYWNlSG9sZGVy
   TWFpbiRHcmlkVmlldzEPPCsACgEIAilkBUdjdGwwMCRQbGFjZUhvbGRlclR
   vcE5hdkJhciRQbGFjZUhvbGRlckhvcml6b250YWxOYXYkVG9wTmF2aWdhdG
   lvbk1lbnVWNA8PZAUESG9tZWTEsK7AlAZmIZYt/bke1dmkbPKxhg=="/>

 

What impact can these few lines of markup have on the page rendering process anyway, you say? Well, in order to find out – let’s summon our good friend Fiddler2 and do a quick comparison.

                      ViewState Enabled     ViewState Disabled
Body size (bytes)     14 534 bytes          12 883 bytes
Load time (seconds)   0.3765430 seconds     0.2031263 seconds

A visual comparison of the same page with versus without ViewState enabled:

Body Size comparison (bytes)

Load Time comparison (seconds)

image image

So what can I do to tune and tweak the ViewState?

There are generally two good initial tips for tuning the ViewState:

  • Disable ViewState for the entire page
  • Disable ViewState for selected components

The first option is good if you don’t need to use ViewState in any of the components on your page. You can then simply disable the ViewState by setting the EnableViewState attribute to false in the page directive:

  <%@ Page Language="C#"
     AutoEventWireup="true"
     CodeBehind="ViewStateSample.aspx.cs"
     Inherits="Zimmergren.Samples.ViewState.ViewStateSample"
     DynamicMasterPageFile="~masterurl/default.master"
     EnableViewState="false" %>

 

The second option is good if you need the ViewState for certain components, but you want to disable it for others. You can disable the ViewState for specific components like this:

     <asp:GridView ID="GridView1"
         runat="server"
         AutoGenerateColumns="False"
         AllowSorting="true"
         AllowPaging="true"
         PageSize="25"
         EnableViewState="false"
         />

 

Additional Tip: Take a look at HTTP compression

In addition to being aware of the ViewState in any ASP.NET project, you should take a look at HTTP compression, which you can enable in IIS.

Read more on HTTP Compression
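To give you an idea of what that looks like, on IIS 7 and later HTTP compression can typically be switched on through the system.webServer section of web.config. Treat the following as a minimal sketch only – the exact settings depend on your IIS version, and dynamic compression requires the Dynamic Content Compression module to be installed:

```xml
<!-- Minimal sketch: enable static and dynamic compression in IIS 7+.
     Verify which compression modules are installed before relying on this. -->
<configuration>
  <system.webServer>
    <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  </system.webServer>
</configuration>
```

Static compression is usually cheap since IIS caches the compressed files; dynamic compression trades CPU for bandwidth, so measure before enabling it on a heavily loaded farm.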

Summary

Alright – the summer is officially over (at least if you take a look at the recent weather) and I’m back in the saddle. In this article I’ve been talking a bit about how the ViewState can impact performance in any ASP.NET project (and hence any SharePoint project). The reason for talking about it is that I’ve seen quite a few projects lately that don’t think about the impact a huge ViewState can have on the time it takes to download and render a page on the client.

An important lesson that I’ve learned throughout all our projects creating Internet-facing sites based on SharePoint (or plain old ASP.NET) is to measure performance in different scenarios, including some stress testing and load tests. In most of these projects we’ve seen improved performance when we take the time to select which components or pages should have ViewState disabled. It might be worth considering.

Note that this is a minimal sample with minimal impact. Now consider a huge page with 10+ Web Parts, all using the ViewState even though they don’t really need to – can you imagine the page load times increasing? The answer is probably yes :-)

Enjoy.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

After many-a-requests I’ve decided to do an article on how you can work with an Azure-hosted SQL Server and consume that data in SharePoint 2010.

Related articles for working with external data in SharePoint 2010:

A few introductory details about this article…

In this article I will discuss and guide you through how you can utilize the power, scalability, flexibility and awesomeness that comes with the cloud. By following this article you will get an introduction to how you can work with SharePoint together with Windows Azure to store business data.

This article will be an introduction to developing SharePoint solutions to work with Windows Azure, and in later articles I will discuss other approaches where Windows Azure may be a good solution to incorporate in the plans for your organization together with Office 365 and SharePoint Online.

Please note that this article is NOT intended to be an introduction to setting up Windows Azure. It’s an introduction to setting up the connection from SharePoint to SQL Azure. More in-depth articles are coming later.

Prerequisites

In order to follow along with this article and repro these steps yourself, you will need to have the following things in place already:

  • A Windows Azure developer account
  • An SQL Azure database and a table in that database
  • Visual Studio 2010
  • SharePoint Designer 2010
  • A few sprinkles of awesomeness in your pocket would be nice, just for fun

Please note that in SQL Azure you need to add the IP address of the machine running this code or service to the firewall rules in order to enable connectivity with the SQL Azure database. You’ll see more about that in your SQL Azure portal.

Connect to SQL Azure using Business Connectivity Services in SharePoint 2010

In this section I will talk about how we can create a connection to our SQL Azure database from SharePoint by utilizing BCS. I will for the ease of demo use SharePoint Designer to set it up – and to prove that it works!

1. Make sure you’ve got existing data in one of your SQL Azure databases

In my setup, I’ve got a database called ZimmergrenDemo and a table called ProductSales. I can access the database either from the Windows Azure Platform portal or directly from the SQL Server Management Studio:

image

image

I’ve got some sample data that I’ve popped into the SQL Azure Database:

image

2. Setting up a Secure Store Service configuration for your SQL Azure connection

In order for the BCS runtime to easily be able to authenticate to the SQL Azure database (which is using different credentials than your Windows Server/Domain), you can create a Secure Store application and use that for authentication.

1. Create a new Secure Store Application

Go to Central Admin > Manage Service Applications > Secure Store Service > New

Create a new Secure Store application, looking something like this:
image

2. Configure the Secure Store application fields

I add one field for User Name and one for Password, something like this:

image

3. Add the administrator account(s) needed

image

Voila! Your Secure Store application is set up; now let’s move on to working with the data in our SQL Azure database.

3. Working with the data through Business Connectivity Services

Now that the SQL Azure database is available and your Secure Store application is configured, it’s time to get the BCS up and running with SharePoint Designer.

The first and foremost option to get up and running quickly is of course to hook up an External List and be able to see your data straight through the standard SharePoint UI.

For a detailed step-by-step instruction for the whole routine to set up a new BCS connection, please refer to my previous articles.

1. Configure the BCS connection using SharePoint Designer

Launch SharePoint Designer and create a new External Content Type and select the SQL option for the data source. Enter the information to your SQL Azure database and the application ID for your Secure Store application.

Connecting to your SQL Azure database through BCS via SPD:
image

Since you need to enter the credentials for your impersonated custom identity (the SQL Azure database credentials) – you’ll get this dialog:

Enter the credentials to your SQL Azure database:
image

Once that is taken care of, you will be able to follow the normal routines for configuring your BCS connection.

My SQL Azure database, right inside of SPD:
image

2. Create an external list and navigate to it in your browser

In whatever way you prefer, create an external list for this External Content Type and navigate to it. You will probably see a link saying “Click here to authenticate“.

Click the link, and you will be provided with this interface:

image

I probably don’t have to explain that this is where you’ll enter your SQL Azure User Name and Password to make sure your BCS connection authenticates to your SQL Azure database properly.

Okay, when the external list is created and you’ve configured the authentication – you’ll see your data flying in directly from SQL Azure into your SharePoint external list for consumption!

image

And as always, the coolest thing here is that it’s read and write enabled straight away – you can work with the items in the list much like normal items in any list. Sweet.

Consume the data programmatically from SQL Azure instead

If you don’t want to go with the BCS approach and would rather write code directly instead, then all you need to do is put on the developer hat and start hacking away a few simple lines of code.

Working with SQL Azure is like working with any other data source, so there’s really no hunky dory magic going on behind the scenes – it’s all just pretty basic.

Here’s a sample Web Part I created to collect the data from SQL Azure and display in SharePoint 2010.

image

Here’s most of what the code could look like:

     // Requires: using System.Data; using System.Data.SqlClient; using System.Web.UI;
     public partial class VisualProductSalesUserControl : UserControl
     {
         private const string connectionString = "Server=tcp:YOURSERVER.database.windows.net;Database=ZimmergrenDemo;User ID=Username@YOURSERVER;Password=myAwesomePassword++;Trusted_Connection=False;Encrypt=True;";
         private string selectCommand = "select * from ZimmergrenDemo.dbo.ProductSales;";
         private DataTable productSalesData = new DataTable("ProductSales");

         protected void FetchAndFill(string connectionString, string selectCommand)
         {
             using (var connection = new SqlConnection(connectionString))
             {
                 var adapter = new SqlDataAdapter
                 {
                     SelectCommand = new SqlCommand(selectCommand, connection)
                 };

                 // Fill the DataTable and bind it to the grid on the user control.
                 adapter.Fill(productSalesData);
                 salesGrid.DataSource = productSalesData;
                 salesGrid.DataBind();
             }
         }

         protected void Button1_Click(object sender, System.EventArgs e)
         {
             FetchAndFill(connectionString, selectCommand);
         }
     }

 

Summary

In this article I talked briefly about how you can connect to your SQL Azure database using BCS and then utilize that information from SharePoint – or create a custom solution to access the data.

The reason for this article is to show you that working with Azure isn’t a big and scary task to take upon you – it’s actually all very straight forward!

Enjoy.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

SharePoint 2010 developing for performance article series:
In this series of articles I will briefly introduce you to some key concepts when it comes to developing for performance in our SharePoint 2010 applications.

Related articles in this article series

Part 1 – SP 2010: Developing for performance Part 1 – Developer Dashboard
Part 2 – SP 2010: Developing for performance Part 2 – SPMonitoredScope
Part 3 – SP 2010: Developing for performance Part 3 – Caching in SharePoint 2010
Part 4 – SP 2010: Developing for performance Part 4 – Logging
Part 5 – SP 2010: Developing for performance Part 5 – Disposal patterns and tools
Part 6 – SP 2010: Developing for performance Part 6 – CSS Sprites
Part 7 – SP 2010: Developing for performance Part 7 – Crunching those scripts
Part 8 – SP 2010: Developing for performance Part 8 – Control that ViewState

Part 7 (this article):
This article is a bit shorter than the others and will only cover the concept of crunching your script files in your projects. The reasoning behind a crunched file is to save on transfer-bytes between the client and server.

JavaScript crunching

The technique called script crunching (or JavaScript crunching, often called minification) refers to eliminating unnecessary characters from script files so that they load faster. By stripping unneeded whitespace and line breaks, and putting semicolons in the right places, you end up with a file that is smaller than the original.

The reasoning behind crunching the script files is that you save on the bytes transferred between client and server for each HTTP request – which is one step in the right direction towards minimizing the page load and render time.

In short: do consider the technique if you’ve got large scripts that are taking a bit too long to load.
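To make the idea concrete, here’s a tiny, made-up sample function before and after crunching. (A real cruncher would keep the original function name; it’s renamed here only so that both versions can live side by side.)

```javascript
// Readable version: comments, whitespace and descriptive identifiers.
function addNumbers(firstValue, secondValue) {
    // Add the two values together and return the result.
    var result = firstValue + secondValue;
    return result;
}

// The same function after crunching: comments and line breaks stripped,
// local names shortened. The behavior is identical, only smaller.
function addNumbersCrunched(a,b){return a+b;}
```

The crunched form is a fraction of the size of the readable one – multiply that over a few hundred kilobytes of script and the savings become noticeable.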

SharePoint 2010 is using crunched scripts

In SharePoint 2010 we already have a few examples of where JavaScript crunching is working in action. One example is the SP.js file which is essentially a crunched JavaScript library in SharePoint 2010. You do however also have the ability to use the SP.debug.js which contains the same content, but without being crunched.

When you look at those two files in an editor, you’ll quickly see the difference between them:

SP.js – 381 KB – Crunched JavaScript file in SharePoint 2010
image

SP.debug.js – 561 KB – The same file, but without the crunch
image

You can see that the main difference between these two files is the file size. This means that if you’re using the crunched version of the JavaScript file, your application will load slightly faster.

How to: Crunch your script files

There are tons of tools on the market for crunching your scripts, including several free online crunchers.

(or just search for JavaScript crunch, and you’ll find all the tools you’ll ever need)

What is the difference when using crunched scripts?

As a quick summary, I did a test with an application that loads a somewhat large script file – first without any crunching, and then the same application loading the file after it has been minimized with a crunch. These are the results in my SharePoint 2010 application.

                        Without crunching        After crunching
JavaScript file size    445871 bytes (435 KB)    331798 bytes (324 KB)

Crunching saves around 25.5% in file size.

image

Summary

A brief summary of the result is that if you’re crunching your script files, you’ll get a slightly smaller footprint when loading the page and making the HTTP requests. The reason for bringing this to your attention is of course that it’s a technique that’s been around for quite some time, but people tend to miss out on it because they’ve not seen the results of it. So, here you go – a visual chart telling you how it differs to use the exact same script, with and without crunching.

Enjoy.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

The week that just passed was filled with quite a few cool happenings over at the BPCUK Conference in London. If you follow the #BPCUK tag on Twitter you’ll find all the juicy information that you missed out on – there’s no real need for me to repeat it here :-)

Downloads

As promised, my session deck on Silverlight and SharePoint 2010 can be downloaded here: Download!

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

It probably doesn’t surprise anyone that the BPC conference in London is one of the main events to look forward to in the realm of SharePoint conferences.

I will of course be attending this conference, and I’ll be delivering an introductory session about Silverlight and SharePoint.

My session: Developing with Silverlight + SharePoint 2010 = Awesome

I’ll be talking about how you can utilize Silverlight in SharePoint 2010 to create some cool RIA applications hosted inside (or outside!) of SharePoint.

In this session you’ll get acquainted with what Silverlight is and how it plays along with SharePoint 2010 in various ways.

We’ll of course be looking at how to create our very own Silverlight Web Parts and applications and host them inside of SharePoint 2010.

The preliminary agenda of the session looks something like this;

  • Silverlight 101
  • Integrate with SharePoint 2010
  • Preferred deployment methods and developer guidance
  • Developer patterns
  • OOB Experience (Out of Browser) – Bring your Silverlight app to your desktop
  • And much more

Last year’s conference (Evolutions conference)

As you already know, a volcano found it suitable to erupt just in time for last year’s conference, which threw a spanner in the works for a lot of the attendees and speakers.

My flights were cancelled at the last minute, so I had to find another way of making my way to the conference last year. One day before the conference I basically gave up hope and just posted a tweet on Twitter saying "Hey, I need to go from Sweden to England, any takers?".

A few minutes later a good friend of mine calls me up saying "Hey, I’m going to The Netherlands to pick up my girl since the flights are cancelled and boats and trains are full. I’m leaving in a few hours by car toward NL. Care to join?"

Alright, why not, I thought… We drove from Sweden through Denmark and Germany and finally reached Rotterdam in Holland after a long journey. This is where he picked up his girl, and they were bound to head back to Sweden again.

I was dropped off there, in the middle of nowhere, without any means of getting back to Sweden or over to England. All the boats and trains were full, remember?
This is where I used Twitter again, posting a tweet asking if anyone was in Rotterdam and would be driving to England in the near future.

Marianne tweets me back saying "Hey, we’ve got a spare seat in the car, we’ll pick you up tomorrow at 08.00 outside your chosen hotel". Sweet! Now we’re talking.

We drove from Holland through Belgium and France to finally take the Eurotunnel and reach England.

Even though that trip was a real blast and can’t really be depicted in words here, I really hope that the flights are leaving as scheduled this time!

Steve explaining the long journey

Summary

So if you’re attending BPC UK this year and care to join us in our adventures in the SharePoint jungle and the pubs – ping me.

This year I’m counting on the flights and I’m hoping to see as many of you as possible for the conference in London next week.

Until then, be awesome.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

Some exciting news is about to be published and I’m glad to announce that some fantastic plans we’re working on are about to take shape. As stated in the headline of this post, TOZIT AB is now hiring staff in Sweden. We’re looking for sales representatives, consultants and helpdesk colleagues.

TOZIT AB is located in Sweden with its main office in Malmö, and operates a lot in Stockholm. The people we’re looking for initially should be able to work either in Malmö/Öresundsregionen or Stockholm.

At TOZIT we are always looking to deliver high quality projects and resources, but not without having a great time doing so. The first and foremost policy we’ve got is to enjoy every day you work with us.

We invest in our employees, making sure they are top of the line and always up to date with what’s new in the SharePoint world. By joining TOZIT AB, you can count on getting the latest and greatest training to stay on top of the latest technology.

#1: SharePoint specialist

If you’re aspiring to be a solid SharePoint professional and want to work together with some of the brightest minds when it comes to .NET and SharePoint, this is an opportunity you don’t want to miss out on.

Job Requirements

  • Basic understanding of SharePoint infrastructure, design and development.
  • Additional merits include:
    • Real world SharePoint project experience
    • .NET development experience
    • .NET and SharePoint Certifications (MCP, MCPD, MCITP, MCTS, MCT, …)

Job Location

This job opening applies to people able to work in one of the following locations:

  • Stockholm
  • Malmö/Köpenhamn/Öresundsregionen

Application Details

Please send your application to jobs@tozit.com if you’re interested in learning more about this position and what we can offer you. Please attach your updated resume (CV) along with any additional data (LinkedIn profile for example).

#2: Senior SharePoint specialist

If you’re a senior SharePoint professional looking for new challenges, perhaps we could fit you in our team.

In this role we will expect you to have a vast experience from the field as a consultant or freelancer and have been working with a lot of real world projects that you can showcase or talk about.

Job Requirements

  • 4+ years of experience with professional SharePoint projects
  • Additional merits include:
    • .NET development
    • Certifications (MCP, MCPD, MCITP, MCTS, MCT, …)

Job Location

This job opening applies to people able to work in one of the following locations:

  • Stockholm
  • Malmö/Köpenhamn/Öresundsregionen

Application Details

Please send your application to jobs@tozit.com if you’re interested in learning more about this position and what we can offer you. Please attach your updated resume (CV) along with any additional data (LinkedIn profile for example).

#3: Technical helpdesk

We’re currently also looking for someone to work with support and helpdesk, where your main responsibility will be to make sure the internal systems work properly and assist the different departments in their daily routines.

Job Requirements

  • You should most definitely be service minded
  • A technical background with Microsoft technologies is preferred

Job Location

This job opening applies to people able to work in central Stockholm only.

Application Details

Please send your application to jobs@tozit.com if you’re interested in learning more about this position and what we can offer you. Please attach your updated resume (CV) along with any additional data (LinkedIn profile for example).

#4: SharePoint sales representative

We’re looking to extend our sales team with one or more additional resources to help out and assist in the sales and finding new leads on projects, and help close new deals with new and existing customers.

Job Requirements

You’re a sales representative with a focus on quality instead of quantity, who knows how to listen to the client’s requirements and can match them with what we can offer. You should have a network to utilize for finding new projects and clients, and be able to find new leads.

Job Location

  • Malmö (with the possibility to work in Stockholm sometimes)

Application Details

Please send your application to jobs@tozit.com if you’re interested in learning more about this position and what we can offer you. Please attach your updated resume (CV) along with any additional data (LinkedIn profile for example).

Summary

We’re looking to extend our team and the level of services we provide by employing some new colleagues to work with us in our awesome team. If you’d like to work with me and my team and want to have fun while doing so, please don’t hesitate to send your application to us :-)

SP 2010: Developing for performance Part 4 – Logging

January 17th, 2011 by Tobias Zimmergren

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

SharePoint 2010 developing for performance article series:
In this series of articles I will briefly introduce you to some key concepts when it comes to developing for performance in our SharePoint 2010 applications.

Related articles in this article series

Part 1 – SP 2010: Developing for performance Part 1 – Developer Dashboard
Part 2 – SP 2010: Developing for performance Part 2 – SPMonitoredScope
Part 3 – SP 2010: Developing for performance Part 3 – Caching in SharePoint 2010
Part 4 – SP 2010: Developing for performance Part 4 – Logging
Part 5 – SP 2010: Developing for performance Part 5 – Disposal patterns and tools
Part 6 – SP 2010: Developing for performance Part 6 – CSS Sprites
Part 7 – SP 2010: Developing for performance Part 7 – Crunching those scripts
Part 8 – SP 2010: Developing for performance Part 8 – Control that ViewState

Part 4 (this article):
In SharePoint 2010 (well, 2007 too for that matter) you need to think about proper logging in your applications to ensure that any problems, issues or other events are lifted to the ULS logs (Unified Logging Service) – that way the administrators can easily view the logs and track down problems with your applications. In this article I will talk a bit about how you can utilize the logging capabilities in SharePoint 2010.

ULS Logs

The ULS logs are the main place for SharePoint to output its diagnostics information. We will take a quick look at how we can read the logs, and obviously how we can write custom logging messages to them.

Reading the ULS Logs in SharePoint 2010

In order to read the ULS logs you’ll need access to the SharePoint Root (14\LOGS) folder. But to make life even easier for us, Microsoft released a tool called the ULS Viewer, which you can download here: http://code.msdn.microsoft.com/ULSViewer

With this tool you can quite easily read through the logs in SharePoint 2010 without any issues.

There’s plenty of resources on the web about how to use the ULS Viewer, so go take a look at any one of them for some usage-instructions.

Download (docx): ULS Viewer documentation

Writing to the ULS Logs from your SharePoint 2010 application

The other side of the logs is, of course, writing to them. This is not a very hard task in SharePoint 2010, and I’ll outline the basic steps to do so here.

Normally I create a new class or at least a method to take care of the logging, and it can look like this:

public static void WriteTrace(Exception ex)
{
    SPDiagnosticsService diagSvc = SPDiagnosticsService.Local;
    diagSvc.WriteTrace(0,
        new SPDiagnosticsCategory("TOZIT Exception",
            TraceSeverity.Monitorable,
            EventSeverity.Error),
        TraceSeverity.Monitorable,
        "An exception occurred: {0}",
        new object[] {ex});
}

You can use the aforementioned code by calling the method like so:

try
{
    throw new Exception("Oh no, application malfunctioned! UnAwesomeness!!!");
}
catch(Exception ex)
{
    WriteTrace(ex);
}

It’s not very hard at all – once you’ve done that, you’re all set to kick off your custom applications and just call your custom logging method. Obviously you should create a proper logging class to take care of all your logging in your applications.
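As a sketch of what such a class could look like (the class name and category are just examples – adapt them to your own project), you can wrap the SPDiagnosticsService calls from the snippets above behind one reusable entry point:

```csharp
// Minimal sketch of a reusable logging helper; names are examples only.
// Requires: using Microsoft.SharePoint.Administration;
public static class TozitLogger
{
    private const string CategoryName = "TOZIT Exception";

    // Writes an exception to the ULS trace logs under our custom category.
    public static void WriteTrace(Exception ex)
    {
        SPDiagnosticsService.Local.WriteTrace(0,
            new SPDiagnosticsCategory(CategoryName,
                TraceSeverity.Monitorable,
                EventSeverity.Error),
            TraceSeverity.Monitorable,
            "An exception occurred: {0}",
            new object[] { ex });
    }
}
```

Calling TozitLogger.WriteTrace(ex) from your catch blocks then keeps the category names and severities in one single place.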

Results

Using the ULS Viewer you can easily track down the error messages by filtering on your category (in my case it’s TOZIT Exception)

image

image

Event Logs

Even though the ULS logs take care of most of the diagnostics logging today, it might sometimes be reasonable to output information into the Event Logs as well.

Writing to the Event Logs from your SharePoint 2010 application

Just as when you’re doing ULS Logging, you can do logging to the Event Logs. It’s just as simple, but you replace the method "WriteTrace" with "WriteEvent" like this:

public static void WriteEvent(Exception ex)
{
    SPDiagnosticsService diagSvc = SPDiagnosticsService.Local;
    diagSvc.WriteEvent(0,
        new SPDiagnosticsCategory("TOZIT Exception",
            TraceSeverity.Monitorable,
            EventSeverity.Warning),
        EventSeverity.Error,
        "Exception occured {0}", new object[] {ex});
}

Results

You can view the logs in the Event Logs on your SharePoint server as you would read any other logs.

image

Can I do more?

There’s plenty of cool things to do with the logging mechanisms in SharePoint 2010, so you should definitely get your hands dirty playing around with them.

If you for example want to tag the log entries with your company name, client’s name or project name – you can easily change that behavior as well. Take a look at my friend Waldek’s post about it here: http://blog.mastykarz.nl/logging-uls-sharepoint-2010/

Related resources

Summary

As a part of the article series focusing on some general guidelines for performance in your applications, logging is a major player. If you master your logs in terms of monitoring and custom application logging you will quickly come to realize how valuable it is.

This is intended as a starting point for you to get familiar with the logging-capabilities in SharePoint 2010.

Author: Tobias Zimmergren
http://www.zimmergren.net | http://www.tozit.com | @zimmergren

Introduction

SharePoint 2010 developing for performance article series:
In this series of articles I will briefly introduce you to some key concepts when it comes to developing for performance in our SharePoint 2010 applications.

Related articles in this article series

Part 1 – SP 2010: Developing for performance Part 1 – Developer Dashboard
Part 2 – SP 2010: Developing for performance Part 2 – SPMonitoredScope
Part 3 – SP 2010: Developing for performance Part 3 – Caching in SharePoint 2010
Part 4 – SP 2010: Developing for performance Part 4 – Logging
Part 5 – SP 2010: Developing for performance Part 5 – Disposal patterns and tools
Part 6 – SP 2010: Developing for performance Part 6 – CSS Sprites
Part 7 – SP 2010: Developing for performance Part 7 – Crunching those scripts
Part 8 – SP 2010: Developing for performance Part 8 – Control that ViewState

Part 3 (this article):
In this article I will talk about some different techniques for managing caching in your applications. One of the most important things for most projects today is to keep performance at a maximum under heavy load. Caching helps out in achieving parts of that goal.

Some caching techniques in SharePoint 2010

In SharePoint 2010, there are a few different ways to cache data. In this section you can read about a few of those approaches. Most of the caching techniques I use regularly as a developer derive from, or directly use, the capabilities of the .NET Framework.

Output Caching (Configurable)

The concept of Output Caching is something that natively comes with SharePoint 2010, as it builds on and relies on ASP.NET caching techniques. This means that you can configure your SharePoint 2010 site to cache the pages it outputs. The reasoning behind caching a page is obviously that it takes time to generate the content on any page, and on a heavily accessed site re-generating every page on every request is a real performance cost – that’s where Output Caching comes in handy.

There are a few goodies about Output Caching, as well as a few gotchas that you should be aware of before taking the journey of configuring your sites for Output Caching:

  • Positive: Quicker response time for your cached pages, faster to render to the client.
  • Positive: Saves on CPU, since it doesn’t have to re-do all the calculations every time.
  • Positive: You can customize Output Caching by using Cache Profiles.
  • Positive: The caching mechanism can store different versions of the resources cached depending on the permissions for the requesting user.
  • Negative: Caching obviously eats more memory as it needs to cache the pages.
  • Negative: Possibility for inconsistencies between WFEs (Web Front End servers) in a multi-server farm.
  • Negative: Output Caching is a SharePoint Server 2010 capability – it is not available in SharePoint Foundation 2010.

You can create custom caching profiles for your SharePoint 2010 site which allows you to modify and configure the way things are cached – and what should be cached.

Read more: How to configure Output Caching
Read more: Create a custom cache profile using VaryByCustom event handler
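To give you an idea of what the VaryByCustom approach looks like in practice, here’s a minimal sketch. It assumes you’ve configured a cache profile to use a “Vary by Custom Parameter” value of “browser-type” – the parameter name and the browser-based variation are just examples; you override GetVaryByCustomString in your web application’s Global.asax:

```csharp
// Global.asax.cs – hedged sketch, not a drop-in implementation.
// The custom string "browser-type" must match what you entered in
// the cache profile's "Vary by Custom Parameter" setting.
public override string GetVaryByCustomString(HttpContext context, string custom)
{
    if (custom == "browser-type")
    {
        // Keep one cached version of the page per browser family,
        // so browser-specific rendering doesn't leak between clients.
        return context.Request.Browser.Browser;
    }

    return base.GetVaryByCustomString(context, custom);
}
```

Whatever string you return becomes part of the cache key – return the same string, get the same cached page; return a different string, and a separate version is cached.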

Object Caching (Configurable)

In SharePoint Server 2010 you’ve got the option to use Object Caching as well, which is a mechanism to cache specific page items. This is especially useful if you’re playing around with Cross-List data queries and need to cache the results of such a query.

Read more: Object Caching
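As a hedged sketch of the cross-list scenario, the publishing API exposes the CrossListQueryCache class (Microsoft.SharePoint.Publishing), which runs a CAML query across lists and serves repeated calls from the object cache. The site URL, query and row limit below are made-up example values:

```csharp
// Sketch: cached cross-list query via the publishing infrastructure.
// Requires a SharePoint Server 2010 site with publishing enabled.
using (SPSite site = new SPSite("http://intranet")) // example URL
{
    CrossListQueryInfo queryInfo = new CrossListQueryInfo
    {
        // Example CAML: all items of a hypothetical "News Article" content type
        Query = "<Where><Eq><FieldRef Name='ContentType' />" +
                "<Value Type='Text'>News Article</Value></Eq></Where>",
        Webs = "<Webs Scope='Recursive' />",
        RowLimit = 50,
        UseCache = true // serve results from the object cache when possible
    };

    CrossListQueryCache cache = new CrossListQueryCache(queryInfo);
    DataTable results = cache.GetSiteData(site);
}
```

With UseCache set to true, subsequent identical queries are answered from the object cache instead of hitting the content database again.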

BLOB Cache (Configurable)

In SharePoint 2010 you also have something called the BLOB cache, which is a disk-based caching mechanism that caches resources on the disk of each web server. Normally these are static files served with a page – such as images, video clips, CSS and script files – hence the name Binary Large OBjects (BLOBs).

In web.config you’ve got a line similar to the following, which tells you which file types are cached. You can of course add or remove file types here:

<BlobCache
    location="D:\BLOB"
    path="\.(gif|jpg|jpeg|jpe|jfif|bmp|dib|tif|tiff|ico|png|wdp|hdp|css|js|asf|avi|flv|m4v|mov|mp3|mp4|mpeg|mpg|rm|rmvb|wma|wmv)$"
    maxSize="10"
    enabled="false" />

Read more: Configure cache settings for a Web Application (SharePoint Server 2010)
Read more: Configure the BLOB Cache

Caching in code (Programmable)

While we know that there’s pre-configurable caching available in SharePoint 2010 (like the Output Cache and BLOB Cache), there’s obviously still a need to create custom caching routines in your applications.

In order for your custom applications to run efficiently and save on server load, you need to consider the importance of using proper caching in your applications.

For instance, if you’ve created an application that fetches information from a database or a SharePoint list (SPList) on every request – do you really need that data fetched directly from the source, or could you live with having it cached for a few minutes? If the data doesn’t need to be up to date on every single request, please consider building some caching mechanisms into your applications.

There are a few different approaches to caching in your custom applications, the most common being the ASP.NET Caching (Read about how you can cache a Web Part).
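As a minimal sketch of that pattern, the following method caches the result of a list query for five minutes using the ASP.NET cache (HttpRuntime.Cache). The cache key, list name and expiration window are arbitrary example values:

```csharp
// Sketch: cache expensive SPList data in the ASP.NET cache.
// "News" is a hypothetical list name; pick a key unique to your app.
private DataTable GetNewsItems(SPWeb web)
{
    const string cacheKey = "MyApp_NewsItems";

    DataTable items = HttpRuntime.Cache[cacheKey] as DataTable;
    if (items == null)
    {
        // Cache miss – hit the content database once...
        SPQuery query = new SPQuery { RowLimit = 100 };
        items = web.Lists["News"].GetItems(query).GetDataTable();

        // ...and keep the result for five minutes. Stale data is
        // acceptable here; tune the window to your own requirements.
        HttpRuntime.Cache.Insert(cacheKey, items, null,
            DateTime.UtcNow.AddMinutes(5), Cache.NoSlidingExpiration);
    }

    return items;
}
```

Note that this is a per-server in-memory cache – in a multi-WFE farm each server keeps its own copy, which is the same consistency trade-off mentioned for Output Caching above.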

To be honest, there’s no need to write about all the different code-samples you could use for caching right here, as the guys at Microsoft did an excellent job talking about it in this Best Practice article:

Read more: Common coding issues when working with the SharePoint Object Model

Note that the aforementioned article is for SharePoint 2007 (WSS 3.0) originally, but these techniques still apply.

Summary

In this article you’ve read about a few approaches to make your applications perform better in SharePoint 2010. These techniques are a fundamental part of development and configuration when it comes to playing around with SharePoint and making it behave properly in terms of responsiveness and performance.

I’ve been getting a few requests for talking about some various caching techniques in SharePoint, so there you have it. Go have a read at those links and you should be all set to start the adventure that is caching!

Read more: Planning for caching and performance

Enjoy!