<CharlieDigital/> Programming, Politics, and uhh…pineapples

29 Oct 2010

Cancelling an Event In SharePoint 2010

Posted by Charles Chen

Small note on the API change: the Cancel property has been deprecated in favor of the new Status property.
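
For reference, here's a minimal sketch of what this looks like in an item event receiver (the receiver class and error message below are made up):

using Microsoft.SharePoint;

public class SampleCancellingReceiver : SPItemEventReceiver
{
    public override void ItemAdding(SPItemEventProperties properties)
    {
        // Pre-2010 style (now deprecated):
        // properties.Cancel = true;

        // SharePoint 2010 style: set a status and, optionally, an error message.
        properties.Status = SPEventReceiverStatus.CancelWithError;
        properties.ErrorMessage = "Items cannot be added to this library.";
    }
}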

Hope this saves someone some time.

Filed under: .Net, SharePoint
27 Oct 2010

Updating SharePoint Taxonomy Fields

Posted by Charles Chen

I tripped up a bit recently on SharePoint taxonomy fields (also known as managed metadata fields) when I found that I wasn't able to programmatically push values down from a SharePoint list item to a Word document.

I was able to set the property in SharePoint and the list item would reflect the property in string form like so:

Japan|c2a154e3-8bb0-4b56-b083-297526964fd9

But the problem with this is that when the user opens the document, the Document Information Panel (DIP) will not have the value pre-selected.  All of the other values are set properly and show up in the DIP.  I found that if I saved the item manually in SharePoint through the UI, it would push the changes down to the Word document, but it took quite a while to figure out how to do the same thing programmatically.

In the Word document itself, I could see that the document properties were clearly different.  The first code block shows what a functioning document properties XML looks like:

<RegionTaxHTField0 xmlns="3a290427-a0ba-4a8f-bf50-36a9ad2bef07">
    <Terms xmlns="http://schemas.microsoft.com/office/infopath/2007/PartnerControls">
        <TermInfo xmlns="http://schemas.microsoft.com/office/infopath/2007/PartnerControls">
            <TermName>Japan</TermName>
            <TermId>c2a154e3-8bb0-4b56-b083-297526964fd9</TermId>
        </TermInfo>
    </Terms>
</RegionTaxHTField0>

And here is what it looks like if you set the property using the string value only:

<RegionTaxHTField0 xmlns="3a290427-a0ba-4a8f-bf50-36a9ad2bef07">
    <Terms xmlns="http://schemas.microsoft.com/office/infopath/2007/PartnerControls"></Terms>
</RegionTaxHTField0>

I chased this around for hours trying to figure out how to set the field values in SharePoint so that they'd be pushed down into the Word document.  It turns out there's a new API for handling taxonomy fields.  The following sample assumes that you've iterated through the fields and collected all of the taxonomy fields:

/// <summary>
///   Updates the taxonomy fields.
/// </summary>
/// <param name="target">The target file.</param>
/// <param name="taxonomyFields">
///   The taxonomy fields which have been collected already.
/// </param>
private static void UpdateTaxonomyFields(SPFile target,
    List<TaxonomyField> taxonomyFields)
{
    SPListItem item = target.Item;
    SPSite site = item.Web.Site;

    TaxonomySession session = new TaxonomySession(site, false);

    foreach(TaxonomyField t in taxonomyFields)
    {
        TermStore termStore = session.TermStores[t.SspId];

        TermSet termSet = termStore.GetTermSet(t.TermSetId);

        // The raw value is stored as "Label|TermGuid" (e.g. "Japan|c2a154e3-...");
        // split on the pipe and keep the GUID portion.
        string value = Convert.ToString(item[t.Id]);

        string[] parts = value.Split('|');

        value = parts[parts.Length - 1];

        Term term = termSet.GetTerm(new Guid(value));

        if(term == null)
        {
            Log.Write("Term could not be found for value \"{0}\"", value);
            continue;
        }

        t.SetFieldValue(item, term);
    }
}

Using this approach, you can set the property in SharePoint and push it down to the document as well.
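
The sample above assumes that the taxonomy fields have already been collected; a minimal sketch of that step (assuming you're working from the item's parent list) might look like this:

/// <summary>
///   Collects the taxonomy (managed metadata) fields from a list.
/// </summary>
private static List<TaxonomyField> GetTaxonomyFields(SPList list)
{
    List<TaxonomyField> taxonomyFields = new List<TaxonomyField>();

    foreach (SPField field in list.Fields)
    {
        // Managed metadata columns are instances of TaxonomyField.
        TaxonomyField taxonomyField = field as TaxonomyField;

        if (taxonomyField != null)
        {
            taxonomyFields.Add(taxonomyField);
        }
    }

    return taxonomyFields;
}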

Filed under: .Net, SharePoint
21 Oct 2010

Programmatically Submitting Files to the SharePoint Content Organizer

Posted by Charles Chen

The content organizer in SharePoint 2010 is an interesting feature, particularly from an application development perspective, as it gives you a fairly competent rules builder (why does it post back when you select a content type?  Like seriously, Microsoft?) and routing engine for free.  We built something similar in FirstPoint for SharePoint 2007, but it was quite a bit of work.  You can do all sorts of interesting things with this, such as moving documents around based on metadata changes or externalizing the rules that determine where server-generated documents should be routed.

The content organizer can be accessed from the Official Files web service, but it can also be accessed directly when building server applications such as event receivers or custom web services.  One common scenario, for example, is using an event receiver to move a document to a different folder or library when a user changes a metadata field.  If I have a list of contacts organized into folders by last name (A-C, D-F, etc.), I'd want the system to automatically move a contact into the correct folder when his or her last name changes.  Without the content organizer and externalized rules, you'd either have to hard-code the routing rules or design and code a system to externalize them; both are sub-optimal solutions compared to the content organizer.

Behind the scenes, the content organizer uses a class called OfficialFileCore; you'll need to add a reference to Microsoft.Office.Policy.dll (found in the ISAPI directory) to your project.  The following code should give you an idea of how to call the OfficialFileCore.SubmitFile() method:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Utilities;
using Microsoft.Office.RecordsManagement.RecordsRepository;
using System.Collections;

// Namespace and class omitted for formatting purposes...

static void Main(string[] args)
{
    Microsoft.SharePoint.OfficialFileResult result = Microsoft.SharePoint.OfficialFileResult.UnknownError;
    string destination = null; // New location of the file is assigned to this string.

    using (SPSite site = new SPSite("http://moss.dev.com/"))
    using (SPWeb web = site.OpenWeb())
    using (new SPMonitoredScope("Official File Receiver", uint.MaxValue, new ISPScopedPerformanceMonitor[] { new SPSqlQueryCounter(100) }))
    {
        SPFile file = web.GetFile("http://moss.dev.com/dropofflibrary/test1.docx");
        byte[] bytes = file.OpenBinary();
        string fileType = file.Item.ContentType.Name;
        SPFieldCollection fields = file.Item.Fields;

        List<Microsoft.SharePoint.RecordsRepositoryProperty> properties
            = new List<Microsoft.SharePoint.RecordsRepositoryProperty>();

        // Create a RecordsRepositoryProperty for each metadata field.
        foreach (SPField field in fields)
        {
            try
            {
                string value = Convert.ToString(
                    field.GetFieldValue(Convert.ToString(file.Item[field.Title])));

                Console.Out.WriteLine("Name:{0}, Type:{1}, Value:{2}",
                    field.Title, field.TypeAsString, value);

                Microsoft.SharePoint.RecordsRepositoryProperty property =
                    new Microsoft.SharePoint.RecordsRepositoryProperty
                    {
                        Name = field.Title,
                        Type = field.TypeAsString,
                        Value = value
                    };

                properties.Add(property);
            }
            catch (Exception exception)
            {
                // Some fields fail; not sure if they're consequential yet!
                Console.Out.WriteLine(" - Failed to process field {0}", field.Title);
            }
        }

        result = OfficialFileCore.SubmitFile(web, bytes, properties.ToArray(), fileType,
            "http://moss.dev.com/dropofflibrary/test1.docx", "Charles", false, out destination);

        // Seems that you have to manually delete the file.
        file.Item.Delete();
    }

    Console.Out.WriteLine("1 > {0}", result);
    Console.Out.WriteLine("2 > {0}", destination);
}

In this case, the source library is the drop off library.  Note that if you programmatically add a file to the library, it doesn't actually get routed according to the content organizer rules; you'll still have to submit it manually in the UI or use the code outlined above to actually trigger the routing.

One final note: as far as I can tell, it seems that you have to manually delete the file from the source library once you successfully route the file.
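
If you'd rather not delete the file blindly, a small guard (reusing the result and file variables from the sample above) is a reasonable addition:

// Only remove the source file if the router reports success.
if (result == OfficialFileResult.Success)
{
    file.Item.Delete();
}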

Filed under: .Net, SharePoint
18 Oct 2010

jQuery Conference 2010

Posted by Charles Chen

I didn't go, but John Peterson did.

Check out his feedback from the conference.

<3 jQuery

Filed under: DevLife
14 Oct 2010

Getting All Content Controls using OpenXML

Posted by Charles Chen

If you're trying to get all of the content controls in an OpenXML document, the most obvious way to do it would be:

// Get the document instance
WordprocessingDocument document = ...; 

// Get all of the SdtBlock elements
document.MainDocumentPart.Document.Descendants<SdtBlock>();

But this will only get you some of the content controls.  Specifically, it won't return any nested, inline content controls (nested block-level content controls are still returned).  Nested, inline content controls are still tagged as <sdt/> in the markup, but the corresponding class is SdtRun, not SdtBlock.

To get all of the content controls, you need to use the following code instead:

// Get the document instance
WordprocessingDocument document = ...; 

// Get all of the SdtElement elements
document.MainDocumentPart.Document.Descendants<SdtElement>();

The resultset will include SdtBlock and SdtRun elements.  This had me coding in circles for a few hours....

Note that likewise, the content element is different between the two.  For SdtBlock, the content element is an SdtContentBlock.  For SdtRun, the content element is an SdtContentRun.
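
As a quick sketch (the file path below is made up), here's one way to enumerate every content control, block-level and inline alike, and read its tag:

using System;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Wordprocessing;

using (WordprocessingDocument document =
    WordprocessingDocument.Open(@"C:\temp\sample.docx", false))
{
    foreach (SdtElement sdt in
        document.MainDocumentPart.Document.Descendants<SdtElement>())
    {
        // The sdtPr element carries the tag, alias, and so on.
        Tag tag = sdt.SdtProperties == null
            ? null
            : sdt.SdtProperties.GetFirstChild<Tag>();

        Console.Out.WriteLine("{0}: {1}",
            sdt.GetType().Name, // SdtBlock or SdtRun
            tag == null || tag.Val == null ? "(no tag)" : tag.Val.Value);
    }
}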

Filed under: .Net, Office
12 Oct 2010

SharePoint Development Patterns: Getting Started

Posted by Charles Chen

Most teams nowadays use a virtual machine model for developing SharePoint solutions, simply because it makes the most sense to do so.  From the teams and individuals I've worked with, it seems that most also work directly inside of the VM environment.

I've never found this palatable, as there is a noticeable decrease in responsiveness and performance when working with Visual Studio inside of the VM.  Besides, I write lots of non-SharePoint code as well, and it seems like a pain to maintain two development environments.

In reality, for the most part, any development that you can do inside of the VM, you can do outside of the VM.  The benefit is increased responsiveness, increased performance of Visual Studio, and less hassle configuring the server development environment.  After I fire up my VM in the morning, I want to interact with it and think about it as little as possible.

Note that your mileage may vary; No Silver Bullet resonates with me.  Whether this development pattern works for you and your team will depend on the type of solution you're building, the type of hardware your company provides (in many cases, developer hardware is woefully underpowered so the only option is to work in the VM hosted on much more powerful hardware), and your own tastes.

Setting the Stage

The first step is to set up a common "staging" area for files.  Pick a path on your server and simply share the folder with Everyone with read/write/full control.  I'd recommend C:\_staging or C:\_deploy.

Step two is to share the SharePoint hive directory as well (we'll see why in a bit).  Once you've shared it, hit up the ISAPI directory (C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\ISAPI) and copy the SharePoint .dll files to your local development environment.

Project Structure

[Screenshot: Typical solution layout]

My personal preference is to have a very clear delineation of responsibilities in my project structure.  Specifically, to keep the solution approachable, I break out all front-end web artifacts (.aspx, .ascx, .asmx, images, script files, etc.) into a separate web project from the get-go.  This aids approachability by allowing .NET developers to stay in their domain.  With the proper level of engineering, we can ensure that the SharePoint namespaces don't leak across project boundaries.

I will typically break out the solution into at least 4 projects:

Framework - contains the framework level code.  This is code that is domain agnostic; you could bundle it up and reuse it in another project in an entirely different domain.  Things like model builders, exception factories, base model class, base gateway classes, and so on.  This project will reference the SharePoint libraries.  In some cases, it even makes sense to make this an entirely different solution altogether and only bring the binaries over.  This helps prevent "cheating" (modifying framework level code to meet one specific use case) by encouraging building solutions that the framework supports.

Core - contains the domain specific logic.  The goal is to build your domain model in this project in such a way that it nearly entirely abstracts away the existence of SharePoint to the web project.  The benefit is that you can prevent bad practices at the UI level (no more SPSite and SPWeb references in your code-behind) by simply preventing SharePoint from "leaking" into your web project.  This requires diligence to ensure that your domain model is sufficient to support your web projects.

[Screenshot: No SharePoint reference!]

Web - contains all of the web artifacts that will be deployed to SharePoint.  Controls, pages, scripts, images, CSS files, web service endpoints, and so on.  You'll note that the web project doesn't have any reference to SharePoint, whatsoever.

The reasoning is that this prevents bad practices.  It promotes a separation of concerns between the developers who are working on your front-end and the ones who are working on your business logic.

Even in teams where these are the same developers, it promotes good programming practices by keeping your SharePoint-specific artifacts from leaking into your web projects.  Common examples are string literals for list names and CAML queries; you don't want to see those in your web project!

Package - contains all of the packaging information (your .wsp solution).  For the most part, the structure of this project is strongly dictated by WSPBuilder.  And I don't mind; it's simply the most logical way to lay out your package.  Simply copy the files from your web project to the package project exactly as they would be deployed in your SharePoint environment.  One special note is the GAC directory: any binaries copied to this directory will be automatically packaged for deployment to the GAC on the server.  I typically just use XCOPY in the post-build events of all of my projects to copy the binaries over to the GAC directory of my package project.

Here's an example from my web project:

:: --------------------------------------------------------------------
:: // Copy the binaries to the package
:: --------------------------------------------------------------------
XCOPY "$(TargetDir)Cumulus.Web.*" "..\..\..\pkg\binaries" /Y
XCOPY "$(TargetDir)Cumulus.Web.dll" "$(SolutionDir)Cumulus.Package\GAC" /Y

:: --------------------------------------------------------------------
:: // Copy the content to the package.
:: --------------------------------------------------------------------
:: // Steps to move controls.
xcopy "$(ProjectDir)*.ascx" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\CONTROLTEMPLATES\Cumulus.Controls" /s /Y /R

::// Steps to move pages
xcopy "$(ProjectDir)WebForms\*.aspx" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\FEATURES\Cumulus.WebForms\WebForms\" /s /Y /R
xcopy "$(ProjectDir)LayoutPages\*.aspx" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.LayoutPages\" /s /Y /R

::// Steps to move web services
xcopy "$(ProjectDir)WebServices\*.asmx" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.WebServices" /s /Y /R

::// Steps to move Javascript files
xcopy "$(ProjectDir)*.js" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.Content" /s /Y /R

::// Steps to move flash files
xcopy "$(ProjectDir)*.swf" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.Content" /s /Y /R

::// Steps to move CSS files
xcopy "$(ProjectDir)*.css" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.Content" /s /Y /R

::// Steps to move graphics files
xcopy "$(ProjectDir)*.gif" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.Content" /s /Y /R
xcopy "$(ProjectDir)*.png" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.Content" /s /Y /R
xcopy "$(ProjectDir)*.jpg" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.Content" /s /Y /R
xcopy "$(ProjectDir)*.jpeg" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\LAYOUTS\Cumulus.Content" /s /Y /R

::// Steps to move master pages
xcopy "$(ProjectDir)MasterPages\*.master" "$(SolutionDir)Cumulus.Package\SharePointRoot\TEMPLATE\FEATURES\Cumulus.MasterPages\MasterPages\" /s /Y /R

Depending on how you want to group your features, you may wish to create more packages.  Typically, if I'm deploying a large set of data into lists as well, I'll create a separate data package since redeploying the data typically takes longer than redeploying just the code artifacts and features.

Centralizing Your Output

You'll notice in my post build that I'm actually pushing the output up to a directory called "pkg" as well.  This is my package directory.  I do this for convenience as it serves as a single location that I can zip and send to someone.  It includes all of my setup batch scripts or PowerShell scripts, binaries, dependent assemblies, and the .wsp file.

[Screenshot: Folder structure in the file system]

It's simply a matter of calling batch scripts on the post-build event to copy out the relevant files.  This also makes deployment to our staging environment easier by allowing us to greatly simplify the XCOPY call to do the copy.  Here's the post-build event of my package project:

:: // Copy the setup files to the package directory.

XCOPY "$(ProjectDir)Setup\*.*" "..\..\..\pkg" /Y
XCOPY "$(TargetDir)Cumulus.Package.*" "..\..\..\pkg\binaries" /Y

Pushing to the Staging Location

[Screenshot: Batch file located at the root]

Once all of your content is centralized to the "pkg" directory, it's a trivial exercise to move it to the staging location (remember that shared folder we set up earlier?).  I typically put this in another script at the root of my solution.

The reason is that I don't always want it to run on every build, as I may need to build just to check for compile-time errors.

If you've set up your post build events correctly, it should be a pretty trivial exercise at this point to copy your contents over to the staging directory.  If not, you can use this file as a centralized batch script for performing your copies instead.

Here's an example from another solution:

XCOPY web\spacl.web.solution\*.wsp ..\pkg\ /Y
XCOPY web\spacl.web.solution.data\*.wsp ..\pkg\ /Y

XCOPY web\spacl.web.solution\setup\*.* ..\pkg\ /Y /s
XCOPY web\spacl.web.solution.data\setup\*.* ..\pkg\ /Y /s

XCOPY ..\lib\spacl\*.* ..\pkg\ /Y /s
XCOPY ..\pkg\*.* \\scooby\_staging\ /Y /s

The last line performs the copy of all of the contents of the "pkg" directory over to the staging location.

In case you're wondering, the /Y argument for XCOPY suppresses the prompt to confirm overwriting existing files (they're simply overwritten), and the /S argument copies directories and sub-directories.

Executing Remote Commands

Finally, we need to figure out how to execute commands remotely to deploy our artifacts.

A note here: I typically do not automate installation of the package; I prefer to do that manually so that I can monitor it and check for errors.  What I do automate is the redeployment of .dll and .pdb files.  In my development cycles, I typically only deploy the package once or twice a week (changes to the content type, event receiver definition, new .aspx pages, etc.); however, I will redeploy the binaries dozens of times as I modify code-behind, event receivers, and so on.

To enable this, we need to perform three actions on the VM first to prepare the environment:

  1. Update the PATH environment variable to include the location where gacutil.exe is (on my VM, it's at: C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin\x64)
  2. Download and unzip the SysInternals Suite (preferably to C:\SysInternals\ for simplicity).
  3. Update the PATH environment variable to include the location C:\SysInternals\.

On the host side, you will also need to download and unzip PsExec.  You can either put the location into your PATH or you can copy it to a directory under your solution (I like to do this and add it to source control so that everyone that checks out the source will automatically be able to build against the VM).

The following line needs to be added to your copy-wsp.bat file (or to a separate file altogether):

::// Deploys the binaries only (doesn't reinstall .wsp).
..\tools\sysinternals\psexec.exe \\scooby -u administrator -p Passw0rd^
       -e -d -w C:\_staging cmd /c \\scooby\_staging\_gac-install.bat

Here's a breakdown of the arguments:

  • \\scooby : remote server name
  • -u : the user to log in as
  • -p : the password of the user
  • -e : Does not load the specified account's user profile
  • -d : Don't wait for the process to terminate (non-interactive)
  • -w C:\_staging : Set the working directory on the remote environment
  • cmd /c \\scooby\_staging\_gac-install.bat : Execute the batch file.  The /c argument to cmd tells it to terminate after execution.

The batch script can contain anything you want.  Typically, I have a few calls to gacutil.exe and a few XCOPY commands to copy the .pdb files for remote debugging followed by a stop and restart of the web server:

@ECHO OFF

gacutil.exe /il web-assemblies.txt

XCOPY "Cumulus.Web.pdb" "c:\Windows\assembly\GAC_MSIL\Cumulus.Web\1.0.0.0__61550e251eaaab86\" /Y
XCOPY "Cumulus.Web.pdb" "C:\inetpub\wwwroot\wss\VirtualDirectories\cumulus.dev.com80\bin\" /Y

XCOPY "Cumulus.Core.pdb" "c:\Windows\assembly\GAC_MSIL\Cumulus.Core\1.0.0.0__61550e251eaaab86\" /Y
XCOPY "Cumulus.Core.pdb" "C:\inetpub\wwwroot\wss\VirtualDirectories\cumulus.dev.com80\bin\" /Y

XCOPY "Cumulus.Framework.pdb" "c:\Windows\assembly\GAC_MSIL\Cumulus.Framework\1.0.0.0__61550e251eaaab86\" /Y
XCOPY "Cumulus.Framework.pdb" "C:\inetpub\wwwroot\wss\VirtualDirectories\cumulus.dev.com80\bin\" /Y

net stop w3svc
net start w3svc

You can get more creative and make it as complex as you need it to be.  You can even deploy and call your own custom command line tools if the complexity justifies it.  Keep in mind that this only redeploys your binaries.

When it comes to .aspx, .js, and .ascx files, I generally do the work right on the server using EditPlus (or your text editor of choice).  Remember that we shared the SharePoint hive directory?  Yup, you can just open it from your host machine now and work with it as if it were a local file.  Just don't forget to synchronize it back!

I like to use FreeCommander on the host for this purpose.  If you prefer to edit in Visual Studio, you can do that, too, and just use FreeCommander to synchronize your edits to the TEMPLATE directory.  Because SharePoint is just a big ol' ASP.NET application, you can develop for it just like any other ASP.NET application; no need to redeploy your entire .wsp for changes to the content files.  This speeds up development cycles considerably as it takes a lot of extra thumb twiddling time out of your daily workflow.

Executing Batch Scripts from Visual Studio

One hitch in all of this: you'll soon find that having to leave Visual Studio to execute batch scripts sucks.  Ideally, we'd be able to execute batch scripts right from Visual Studio.  You can set this up pretty easily as an external tool:

[Screenshot: Configuring an external tool to execute batch files]

Note that it takes the item path as a command.

To make it even easier, we can hook it up to a hotkey.

[Screenshot: Wiring the keyboard shortcut]

Now when you select the file and hit CTRL+Shift+X, the batch file is automatically executed.

Remote Debugging

This is, quite honestly, the most complex part to get right.  The general tip is to ensure that the Visual Studio Remote Debugging Service is running under your account (with the same username and same password).  You don't have to be on the same domain (in fact, my VM typically runs its own domain server as well).

[Screenshot: Attaching to a remote machine]

Debugging the remote process is as simple as pointing Visual Studio at the server and picking the right process: Debug > Attach to Process > enter the remote machine name > select the right process (see the screenshot above).

Don't take my lead on this; I didn't follow best practices in setting up my development VM :-D; I set up all of my services to use the Administrator account for simplicity.  However, this causes confusion because I can't tell which w3wp.exe instance is running my front-end.

In a properly configured environment, it should be a lot easier to determine which w3wp.exe instance to attach to based on the user names (one of those is the service application, one is the central admin, and one is the actual web application).

Wrap Up

I've heard and read many other suggestions for building solutions for SharePoint including building custom Visual Studio add-ins, Visual Studio macros, or MSBuild tasks.  In general, I don't like these approaches for day-to-day development because: 1) they're hard to manage, 2) they're hard for the general developer to understand, debug, extend, and fix, and 3) they require too much effort!

Batch files are easy.  XCOPY is easy.  Why make it harder than it has to be?

The goal is to minimize the amount of time it takes to realize and deploy changes in the easiest possible way.

Filed under: Dev, SharePoint
10 Oct 2010

Philly.NET Code Camp – Follow Up

Posted by Charles Chen

(One of many I foresee for the next week or so - lots of blogging to catch up on.)

Firstly, thanks to everyone who sat through my session and for the excellent questions and feedback.  I honestly expected < 10 people.  I hope that folks were able to get a lot of value from it.  Apologies for the pacing, but I wanted to address as many questions as possible in context.

I'll detail a lot of the material that I covered in additional blog posts in the near future (huge backlog!), but I do want to re-emphasize that the session wasn't about a Silver Bullet: each team will have different requirements and preferences, each project will have different challenges, and each problem may require a different approach.  The key takeaway is that, at some point, it's important to step back, take a wide-angle view of the challenges that your team faces, and figure out (ideally) simple solutions to overcome them.  Find the commonalities and try to abstract them away using an object-oriented approach to reduce duplication and effort.

I'll get into this more in a more comprehensive post (I had initially planned to have the whole presentation as a blog post before today, but it was ultimately too much effort in one shot).

In the meantime, you can download the slidedeck here: SharePoint-CodeCamp.2010.2

I have a multitude of posts under my SharePoint category, but here are the most relevant ones:

(Note: some of these code samples are just a tad bit older as I've evolved them bit by bit; I'll try to get some updates out.)

Some relevant external links:

Links to some relevant tools:

  • WSPBuilder - used to easily build and assemble .wsp packages.  Works with 2007 and 2010; I have a soft spot for it.
  • PsExec and DebugView - part of the SysInternals suite.  PsExec is for remote execution.  DebugView is for monitoring the system trace and system debug output.
  • log4net - a general purpose logging library.  Very important for SharePoint development to track down bugs.  The configuration is no different for SharePoint than for an ASP.NET web application.
  • ReSharper - a general purpose development tool for Visual Studio.  Word of warning: it's addictive; once you start to use it, you'll never go back to vanilla VS.
  • FreeCommander - great for synchronizing files between different environments.

And if you liked my laptop, here's how you can configure one of your very own for < $1500: Laptop Buying for Developers.

Again, thanks to everyone that made it to the session; please feel free to email me with any questions and I'll try my best to address them in a timely manner.  Look for more followup posts in the near future.

And finally,  you can leave feedback for any of the sessions that you attended today; please do so!

6 Oct 2010

Programmatically Adding an Event Receiver to a Content Type

Posted by Charles Chen

You'll recall from a previous post that you can add an event receiver to a content type.

That's all well and good if you're deploying your event receiver with your content type from the get go, but what if you need to associate an event receiver with an existing content type?

// Remove existing definition for the assembly name, class, and type.
foreach(SPEventReceiverDefinition definition in contentType.EventReceivers)
{
	// Skip definitions that don't match on all three criteria.
	if(definition.Class != className || definition.Assembly != assemblyName
		|| definition.Type != eventReceiverType)
	{
		continue;
	}

	definition.Delete();
	contentType.Update(true);
	break;
}

SPEventReceiverDefinition eventReceiverDefinition = contentType.EventReceivers.Add();
eventReceiverDefinition.Class = className; // String
eventReceiverDefinition.Assembly = assemblyName; // String
eventReceiverDefinition.Type = eventReceiverType; // SPEventReceiverType
eventReceiverDefinition.Data = documentType; // Arbitrary input data (String)
eventReceiverDefinition.Update();

contentType.Update(true);

Easy!
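
For context, here's roughly how the snippet might be invoked if you wrap it in a helper (I'm calling it AddOrReplaceEventReceiver here; the site URL, content type name, and receiver class are all made up):

using (SPSite site = new SPSite("http://moss.dev.com/"))
using (SPWeb web = site.OpenWeb())
{
    // Resolve the existing content type that the receiver should be attached to.
    SPContentType contentType = web.ContentTypes["Contract"];

    AddOrReplaceEventReceiver(
        contentType,
        "Cumulus.Core.Receivers.ContractEventReceiver", // className
        "Cumulus.Core, Version=1.0.0.0, Culture=neutral, PublicKeyToken=61550e251eaaab86", // assemblyName
        SPEventReceiverType.ItemUpdated, // eventReceiverType
        "Contract"); // documentType (arbitrary Data)
}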

Filed under: .Net, SharePoint
1 Oct 2010

Irony .NET Language Implementation Kit

Posted by Charles Chen

I came across Irony (http://irony.codeplex.com/) today while contemplating whether to use antlr or not for a project I'm working on where the requirements call for allowing users to write small conditional instructions.

From the project site description:

Irony is a development kit for implementing languages on .NET platform. It uses the flexibility and power of c# language and .NET Framework 3.5 to implement a completely new and streamlined technology of compiler construction.

Unlike most existing yacc/lex-style solutions Irony does not employ any scanner or parser code generation from grammar specifications written in a specialized meta-language. In Irony the target language grammar is coded directly in c# using operator overloading to express grammar constructs. Irony's scanner and parser modules use the grammar encoded as c# class to control the parsing process. See the expression grammar sample for an example of grammar definition in c# class, and using it in a working parser.

Compared to antlr, it seemed much simpler from the samples.

In the past, I've usually used Spring.NET's expression evaluation functionality (built on antlr); however, the entirety of the Spring.NET library seemed too heavy for the simple scenario I had to implement, and I'd still have to do some string parsing anyway if I used it.  So I set out to try out Irony for myself instead.

The basic gist of the solution is that the interface needs to allow users to specify meta-instructions as strings in the form of:

if ("property1"="value1") action("param1","param2")

Once the user has configured the instructions for a given template document, the meta-instructions are executed when an instance of the template is created and metadata properties are set on the document (in SharePoint).  So the goal is to define a set of meta-instructions and actions which allow users to build dynamic document templates.  For example:

if ("status"="draft") delete()
if ("status"="published") lock()
if ("status"="pending") insert("22ad25d6-3bbd-45f3-bc63-e0e1b931e247")

The first step is to define the grammar (I'm sure this isn't very well constructed BNF, but I need to brush up on that :-D):

using Irony.Parsing;

namespace IronySample
{
    public class AssemblyDirectiveGrammar : Grammar
    {
        public AssemblyDirectiveGrammar() : base(false)
        {
            // Terminals
            StringLiteral property = new StringLiteral("property", "\"");
            StringLiteral value = new StringLiteral("value", "\"");
            StringLiteral param = new StringLiteral("param", "\"");
            IdentifierTerminal action = new IdentifierTerminal("action");

            // Non-terminals
            NonTerminal command = new NonTerminal("command");
            NonTerminal ifStatement = new NonTerminal("ifStatement");
            NonTerminal comparisonStatement = new NonTerminal("comparisonStatement");
            NonTerminal actionStatement = new NonTerminal("actionStatement");
            NonTerminal argumentsStatement = new NonTerminal("argumentsStatement");
            NonTerminal parametersStatement = new NonTerminal("parameters");
            NonTerminal parameterStatement = new NonTerminal("parameter");

            // BNF
            command.Rule = ifStatement + NewLine;
            ifStatement.Rule = ToTerm("if") + comparisonStatement + actionStatement;
            comparisonStatement.Rule = "(" + property + "=" + value + ")";
            actionStatement.Rule = action + argumentsStatement;
            argumentsStatement.Rule = "(" + parametersStatement + ")";
            parametersStatement.Rule = MakePlusRule(parametersStatement, ToTerm(","),
                        parameterStatement) | Empty;
            parameterStatement.Rule = param;

            MarkPunctuation("if","(", ")", ",", "=");

            LanguageFlags = LanguageFlags.NewLineBeforeEOF;

            Root = command;
        }
    }
}

This defines the elements of the "language" (see the Irony wikibook for a better explanation).  (You'll note that the "if" is entirely superfluous; I decided to leave it in there just so that it would make more sense to the expression authors as they create the meta-instruction.)

As you're writing your grammar, it'll be useful to test the grammar using the provided grammar explorer tool to check for errors:

[Screenshot: Irony grammar explorer tool]

Once the grammar is complete, the next step is to make use of it.

I wrote a simple console program that mocks up some data input:

private static void Main(string[] args)
{
    // Mock up the input.
    Dictionary<string, string> inputs = new Dictionary<string, string>
                                        {
                                          {"status", "ready"}
                                        };

    // Mock up the instructions.
    string instructions = "if (\"status\"=\"ready\") Echo(\"Hello, World!\")";

    Program program = new Program();
    program.Run(inputs, instructions);
}

The idea is to simulate a scenario where the metadata on a document stored in SharePoint is mapped to a dictionary which is then passed to a processor.  The processor will iterate through the instructions embedded in the document and perform actions.

In this example, I've mapped the statement directly to a method on the Program class for simplicity.  The action is a method called "Echo" which will be fed one parameter: "Hello, World!".  The Run() method contains most of the logic:

private void Run(Dictionary<string, string> inputs, string instructions)
{
    // Run the parser
    AssemblyDirectiveGrammar grammar = new AssemblyDirectiveGrammar();

    LanguageData language = new LanguageData(grammar);

    Parser parser = new Parser(language);

    ParseTree tree = parser.Parse(instructions);

    List<ParseTreeNode> nodes = new List<ParseTreeNode>();

    // Flatten the nodes for easier processing with LINQ
    Flatten(tree.Root, nodes);

    var property = nodes.Where(n => n.Term.Name == "property").FirstOrDefault().Token.Value.ToString();
    var value = nodes.Where(n => n.Term.Name == "value").FirstOrDefault().Token.Value.ToString();
    var action = nodes.Where(n => n.Term.Name == "action").FirstOrDefault().Token.Value.ToString();
    string[] parameters = (from n in nodes
                           where n.Term.Name == "param"
                           select Convert.ToString(n.Token.Value)).ToArray();

    // Execute logic
    string inputValue = inputs[property];

    if(inputValue != value)
    {
        return;
    }

    MethodInfo method = GetType().GetMethod(action);

    if(method == null)
    {
        return;
    }

    method.Invoke(this, parameters);
}

You can see that in this case, the evaluation is very simple; it's a basic string equality comparison.  The action execution is basic as well.  It simply executes a method of the same name on the current object instance.  Try running the code and changing the "ready" value in the dictionary and see what happens.

A helper method is included to flatten the resultant abstract syntax tree for querying with LINQ (could possibly be done with a recursive LINQ query?):

public void Flatten(ParseTreeNode node, List<ParseTreeNode> nodes)
{
    nodes.Add(node);

    foreach (ParseTreeNode child in node.ChildNodes)
    {
        Flatten(child, nodes);
    }
}

And finally, the actual method that gets invoked (the action):

public void Echo(string message)
{
    System.Console.Out.WriteLine("Echoed: {0}", message);
}

This sample is fairly basic, but it was pretty easy to get up and running (far easier than antlr) and there's lots of potential for other use cases.  One thing I've found lacking so far is documentation.  It's fairly sparse so there's going to be a lot of trial and error, but the good news is that the source code includes a lot of examples (some of them fairly complex including C#, SQL, and Scheme grammars).

Filed under: .Net, DevTools