SharePoint Development Patterns: Getting Started

Most teams nowadays use a virtual machine model for developing SharePoint solutions, simply because it makes the most sense to do so.  From the teams and individuals I’ve worked with, it also seems that most work directly inside of the VM environment.

I’ve never found this palatable, as there is a noticeable decrease in responsiveness and performance when working with Visual Studio inside of the VM.  Besides, I write lots of non-SharePoint code as well, and it’s a pain to maintain two development environments.

In reality, for the most part, any development that you can do inside of the VM, you can do outside of it.  The benefits are increased responsiveness, better Visual Studio performance, and less hassle configuring the server development environment.  After I fire up my VM in the morning, I want to interact with it and think about it as little as possible.

Note that your mileage may vary; No Silver Bullet resonates with me.  Whether this development pattern works for you and your team will depend on the type of solution you’re building, the type of hardware your company provides (in many cases, developer hardware is woefully underpowered so the only option is to work in the VM hosted on much more powerful hardware), and your own tastes.

Setting the Stage

The first step is to set up a common “staging” area for files.  Pick a path on your server and simply share the folder, granting Everyone read/write (full control) access.  I’d recommend C:\_staging or C:\_deploy.

Step two is to share the SharePoint hive directory as well (we’ll see why in a bit).  Once you’ve shared it, hit up the ISAPI directory (C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\ISAPI) and copy the SharePoint .dll files to your local development environment.
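Both steps can be scripted from an elevated prompt; here’s a rough sketch (the share names, the server name scooby, and the local lib path are placeholders of my choosing):

    REM On the VM: share the staging folder and the hive.
    net share _staging=C:\_staging /GRANT:Everyone,FULL
    net share "14=C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14" /GRANT:Everyone,FULL

    REM On the host: pull the SharePoint assemblies down for local referencing.
    REM (C:\lib\SharePoint14 is just an example destination.)
    xcopy "\\scooby\14\ISAPI\Microsoft.SharePoint*.dll" "C:\lib\SharePoint14\" /Y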

Project Structure

[Image: Typical solution layout]

My personal preference is to have a very clear delineation of responsibilities in my project structure.  Specifically, to make the solution more approachable, I break out all front-end web artifacts (.aspx, .ascx, .asmx, images, script files, etc.) into a separate web project from the get-go.  This aids in approachability by allowing .NET developers to stay in their domain.  With the proper level of engineering, we can ensure that the SharePoint namespaces never leak across project boundaries.

I will typically break out the solution into at least 4 projects:

Framework – contains the framework-level code.  This is code that is domain agnostic; you could bundle it up and reuse it in another project in an entirely different domain.  Things like model builders, exception factories, base model classes, base gateway classes, and so on.  This project will reference the SharePoint libraries.  In some cases, it even makes sense to make this a separate solution altogether and only bring the binaries over.  This helps prevent “cheating” (modifying framework-level code to meet one specific use case) by encouraging developers to build solutions that the framework supports.

Core – contains the domain-specific logic.  The goal is to build your domain model in this project in such a way that it almost entirely abstracts away the existence of SharePoint from the web project.  The benefit is that you can prevent bad practices at the UI level (no more SPSite and SPWeb references in your code-behind) by simply preventing SharePoint from “leaking” into your web project.  This requires diligence to ensure that your domain model is sufficient to support your web projects.

[Image: No SharePoint reference!]

Web – contains all of the web artifacts that will be deployed to SharePoint: controls, pages, scripts, images, CSS files, web service endpoints, and so on.  You’ll note that the web project has no reference to SharePoint whatsoever.

The reasoning is that this prevents bad practices.  It promotes a separation of concerns between the developers who are working on your front-end and the ones who are working on your business logic.

Even in teams where these are the same developers, it promotes good programming practices by keeping your SharePoint-specific artifacts from leaking into your web projects.  Common examples are string literals for list names and CAML queries; you don’t want to see those in your web project!

Package – contains all of the packaging information (your .wsp solution).  For the most part, the structure of this project is strongly dictated by WSPBuilder, and I don’t mind; it’s simply the most logical way to lay out your package.  Simply copy the files from your web project to the package project exactly as they would be deployed in your SharePoint environment.  One special note is the GAC directory: any binaries copied to this directory will be automatically packaged for deployment to the GAC on the server.  I typically just use XCOPY in the post-build events of all of my projects to copy the binaries over to the GAC directory of my package project.

Here’s a sketch of what that post-build event looks like in my web project (the Package project name follows the layout above; $(TargetDir), $(TargetName), and $(SolutionDir) are standard Visual Studio build macros):
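    REM Post-build event (web project): push binaries into the package
    REM project's GAC folder.  ("Package" is a placeholder project name.)
    xcopy "$(TargetDir)$(TargetName).dll" "$(SolutionDir)Package\GAC\" /Y
    xcopy "$(TargetDir)$(TargetName).pdb" "$(SolutionDir)Package\GAC\" /Y
    REM Also push a copy of the output up to the solution-level pkg directory.
    xcopy "$(TargetDir)$(TargetName).dll" "$(SolutionDir)pkg\bin\" /Y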

Depending on how you want to group your features, you may wish to create more packages.  If I’m also deploying a large set of data into lists, I’ll typically create a separate data package, since redeploying the data takes longer than redeploying just the code artifacts and features.

Centralizing Your Output

You’ll notice in my post-build event that I’m actually pushing the output up to a directory called “pkg” as well.  This is my package directory.  I do this for convenience, as it serves as a single location that I can zip and send to someone.  It includes all of my setup batch scripts or PowerShell scripts, binaries, dependent assemblies, and the .wsp file.

[Image: Folder structure in the file system]

It’s simply a matter of calling batch scripts on the post-build event to copy out the relevant files.  This also makes deployment to our staging environment easier by greatly simplifying the XCOPY call that does the copy.  The post-build event of my package project looks something like this sketch (the subfolder names are placeholders):
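    REM Post-build event (package project): collect everything into the
    REM solution-level pkg directory.  (Subfolder names are placeholders.)
    xcopy "$(ProjectDir)*.wsp" "$(SolutionDir)pkg\" /Y
    xcopy "$(ProjectDir)GAC\*.dll" "$(SolutionDir)pkg\bin\" /Y
    xcopy "$(ProjectDir)GAC\*.pdb" "$(SolutionDir)pkg\bin\" /Y
    xcopy "$(ProjectDir)Scripts\*.*" "$(SolutionDir)pkg\" /Y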

Pushing to the Staging Location

[Image: Batch file located at the root]

Once all of your content is centralized in the “pkg” directory, it’s a trivial exercise to move it to the staging location (remember that shared folder we set up earlier?).  I typically put this in another script at the root of my solution.

The reason is that I don’t always want it to run when I build, as I may need to build just to check for compile-time errors.

If you’ve set up your post-build events correctly, it should be a pretty trivial exercise at this point to copy your contents over to the staging directory.  If not, you can use this file as a centralized batch script for performing your copies instead.

Here’s a sketch of such a script from another solution (the \\scooby\_staging share is the one we created earlier; the other paths are placeholders):
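    @echo off
    REM copy-wsp.bat (sketch) -- lives at the root of the solution.
    REM Pick up the freshest artifacts, then push the pkg directory to staging.
    xcopy "Package\*.wsp" "pkg\" /Y
    xcopy "pkg\*.*" "\\scooby\_staging\" /Y /S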

The last line performs the copy of all of the contents of the “pkg” directory over to the staging location.

In case you’re wondering, the /Y argument for XCOPY is used to suppress the prompt for overwrite (it will overwrite by default) and the /S argument is used to copy directories and sub-directories.

Executing Remote Commands

Finally, we need to figure out how to execute commands remotely to deploy our artifacts.

A note here: I typically do not automate installation of the package; I like to do this manually so that I can monitor and check for errors.  What I do automate is the redeployment of .dll and .pdb files.  In my development cycles, I typically only deploy the package once or twice a week (changes to content types, event receiver definitions, new .aspx pages, etc.); however, I will redeploy the binaries dozens of times as I modify code-behind, event receivers, and so on.

To enable this, we need to perform three actions on the VM first to prepare the environment:

  1. Update the PATH environment variable to include the location where gacutil.exe is (on my VM, it’s at: C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\x64)
  2. Download and unzip the SysInternals Suite (preferably to C:\SysInternals\ for simplicity).
  3. Update the PATH environment variable to include the location C:\SysInternals\.
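If you’d rather script steps 1 and 3, something like this from an elevated prompt on the VM does the trick (a sketch; setx /M writes the machine-level PATH, and %PATH% expands to the current combined path when you run it):

    REM Append the gacutil and SysInternals locations to the machine PATH.
    setx PATH "%PATH%;C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\x64;C:\SysInternals" /M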

On the host side, you will also need to download and unzip PsExec.  You can either put its location into your PATH or copy it to a directory under your solution (I like to do the latter and add it to source control so that everyone who checks out the source will automatically be able to build against the VM).

The following line needs to be added to your copy-wsp.bat file (or to a separate file altogether); the credentials shown here are placeholders:
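    REM Run the deployment script remotely on the VM (placeholder credentials).
    psexec \\scooby -u scooby\Administrator -p YourPassword -e -d -w C:\_staging cmd /c \\scooby\_staging\_gac-install.bat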

Here’s a breakdown of the arguments:

  • \\scooby : the remote server name
  • -u : the user to log in as
  • -p : the password of the user
  • -e : does not load the specified account’s user profile
  • -d : doesn’t wait for the process to terminate (non-interactive)
  • -w C:\_staging : sets the working directory on the remote environment
  • cmd /c \\scooby\_staging\_gac-install.bat : executes the batch file.  The /c argument tells cmd to terminate after execution.

The batch script can contain anything you want.  Typically, I have a few calls to gacutil.exe and a few XCOPY commands to copy the .pdb files for remote debugging, followed by a stop and restart of the web server.  A sketch, with placeholder assembly names and paths:
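    @echo off
    REM _gac-install.bat (sketch) -- runs on the VM out of C:\_staging.
    REM Assembly names and the symbols path below are placeholders.
    gacutil /if MyCompany.Framework.dll
    gacutil /if MyCompany.Core.dll
    REM Copy symbols somewhere the remote debugger can find them.
    xcopy *.pdb C:\Symbols\ /Y
    REM Stop and restart the web server.
    iisreset /stop
    iisreset /start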

You can get more creative and make it as complex as you need it to be.  You can even deploy and call your own custom command line tools if the complexity justifies it.  Keep in mind that this only redeploys your binaries.

When it comes to .aspx, .js, and .ascx files, I generally do the work right on the server using EditPlus (or your text editor of choice).  Remember that we shared the SharePoint hive directory?  Yup, you can just open it from your host machine now and work with it as if it were a local file.  Just don’t forget to synchronize it back!

I like to use FreeCommander on the host for this purpose.  If you prefer to edit in Visual Studio, you can do that, too, and just use FreeCommander to synchronize your edits to the TEMPLATE directory.  Because SharePoint is just a big ol’ ASP.NET application, you can develop for it just like any other ASP.NET application; there’s no need to redeploy your entire .wsp for changes to content files.  This speeds up development cycles considerably by taking a lot of thumb-twiddling time out of your daily workflow.
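If you’d rather script that synchronization instead, a single XCOPY against the hive share works just as well (a sketch; the hive share is the one from earlier, and the project folder names are placeholders):

    REM Push edited content files from the host to the hive on the VM.
    xcopy "Web\TEMPLATE\LAYOUTS\MyProject\*.*" "\\scooby\14\TEMPLATE\LAYOUTS\MyProject\" /Y /S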

Executing Batch Scripts from Visual Studio

One hitch in all of this is that you’ll soon find that having to leave Visual Studio to execute batch scripts sucks.  Ideally, we’d be able to execute batch scripts right from Visual Studio.  You can set this up pretty easily as an external tool:

[Image: Configuring an external tool to execute batch files]

Note that it takes the item path ($(ItemPath)) as the command.
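Roughly, the dialog fields end up like this ($(ItemPath) and $(ItemDir) are Visual Studio’s built-in external tool variables; the title is whatever you like):

    Title:             Execute Batch File
    Command:           $(ItemPath)
    Initial directory: $(ItemDir)
    [x] Use Output window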

To make it even easier, we can hook it up to a hotkey.

[Image: Wiring the keyboard shortcut]

Now when you select the file and hit Ctrl+Shift+X, the batch file is automatically executed.

Remote Debugging

This is, quite honestly, the most complex part to get right.  The general tip is to ensure that the Visual Studio remote debugging service (msvsmon.exe) is running under an account with the same username and password as your own.  You don’t have to be on the same domain (in fact, my VM typically runs its own domain server as well).

[Image: Attaching to a remote machine]

Debugging the remote process is as simple as pointing Visual Studio at the server and picking the right process: select Debug > Attach to Process, enter the remote machine name, and select the process (see the image above).

Don’t take my lead on this; I didn’t follow best practices in setting up my development VM :-D; I set up all of my services to use the Administrator account for simplicity.  However, it causes confusion, as I can’t tell which w3wp.exe instance is running my front-end.

In a properly configured environment, it should be a lot easier to determine which w3wp.exe instance to attach to based on the user names (one of those is the service application, one is the central admin, and one is the actual web application).

Wrap Up

I’ve heard and read many other suggestions for building solutions for SharePoint including building custom Visual Studio add-ins, Visual Studio macros, or MSBuild tasks.  In general, I don’t like these approaches for day-to-day development because: 1) they’re hard to manage, 2) they’re hard for the general developer to understand, debug, extend, and fix, and 3) they require too much effort!

Batch files are easy.  XCOPY is easy.  Why make it harder than it has to be?

The goal is to minimize the amount of time it takes to realize and deploy changes in the easiest possible way.
