<CharlieDigital/> Programming, Politics, and uhh…pineapples

17 Jul 2015

Adventures in Single-Sign-On: Cross Domain Script Request

Posted by Charles Chen

Consider a scenario where a user authenticates with ADFS (or an equivalent identity provider (IdP)) when accessing a domain such as https://www.domain.com (A), and then, from this page, a request is made to https://api.other-domain.com/app.js (B) to download a set of application scripts that will then interact with a set of REST-based web services in the B domain.  We'd like to have SSO so that the claims provided to A are available to B, and so that the downloaded application scripts can subsequently make requests with an authentication cookie.

Roughly speaking, the scenario looks like this:

Depiction of the scenario

It was straightforward enough to set up the authentication with ADFS using WIF 4.5 for each of A and B following the MSDN "How To"; I had each of the applications separately working with the same ADFS instance.  But the cross-domain script request from A to B at step 5 generated an HTTP redirect sequence (302) that resulted in an XHTML form from ADFS with JavaScript that attempts to execute an HTTP POST for the last leg of the authentication.  This was good news because it meant that ADFS recognized the user session and tried to issue another token for the user in the other domain without requiring a login.

However, this obviously posed a problem as, even though it appeared as if it were working, the request for the script could not succeed because of the text/html response from ADFS.

Here's what https://www.domain.com/default.aspx looks like in this case:

<html>
  <body>
    ...
    <script type="text/javascript" src="https://api.other-domain.com/app.js"></script>
  </body>
</html>

This obviously fails because the HTML content returned from the redirect to ADFS cannot be consumed.

I scratched my head for a bit, dug into the documentation for ADFS, trawled online discussion boards, and tinkered with various configurations trying to figure this out, with no luck.  Many examples online discuss this scenario in the context of making a web service call from the backend of one application to another using bearer tokens or WIF ActAs delegation, but these were ultimately not suited for what I wanted to accomplish: I didn't want to write any tokens into the page (for example, adding a URL parameter to the app.js request), make a backend request for the resource, or use a proxy.

(I suspect that using the HTTP GET binding for SAML would work, but for the life of me, I can't figure out how to set this up on ADFS...)

In a flash of insight, it occurred to me that if I used a hidden iframe to load another page in B, I would then have a cookie in session to make the request for the app.js!

Here's what the page in A looks like:

<script type="text/javascript">
    function loadOtherStuff()
    {
        var script = document.createElement('script');
        script.setAttribute('type', 'text/javascript');
        script.setAttribute('src', 'https://api.other-domain.com/appscript.js');
        document.body.appendChild(script);
    }
</script>
<iframe src="https://api.other-domain.com" style="display: none" 
    onload="javascript:loadOtherStuff()"></iframe>  

Using the iframe, the HTTP 302 redirect is allowed to complete and ADFS is able to set the authentication cookie without requiring a separate sign on since it's using the same IdP, certificate, and issuer thumbprint.  Once the cookie is set for the domain, then subsequent browser requests in the parent document to the B domain will carry along the cookie!

The request for appscript.js is intercepted by an IHttpHandler and authentication can be performed to check for the user claims before returning any content. This then allows us to stream back the client-side application scripts and templates via AMD through a single entry point (e.g. appscript.js?app=App1 or a redirect to establish a root path depending on how you choose to organize your files).
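As a sketch, that handler gate might look something like the following; the class name and the ~/Scripts path are hypothetical, not from the actual application:

```csharp
using System.Security.Claims;
using System.Web;

// Hypothetical sketch of the script-gating handler described above.
public class ScriptHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // WIF 4.5 surfaces the authenticated user as a ClaimsPrincipal.
        ClaimsPrincipal principal = context.User as ClaimsPrincipal;

        // Check for the user claims before returning any content.
        if (principal == null || !principal.Identity.IsAuthenticated)
        {
            context.Response.StatusCode = 401;
            return;
        }

        // Resolve the requested bundle (e.g. appscript.js?app=App1) and
        // stream the client-side application scripts back.
        string app = context.Request.QueryString["app"];

        context.Response.ContentType = "text/javascript";
        context.Response.TransmitFile(
            context.Server.MapPath("~/Scripts/" + app + ".js"));
    }
}
```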

Any XHR requests made subsequently still require proper configuration of CORS on the calling side:

$.ajax({
    url: 'https://api.other-domain.com/api/Echo', 
    type: 'GET',
    crossDomain: true,
    xhrFields: {
        withCredentials: true
    },
    success: function(result){ window.alert('HERE'); console.log('RETRIEVED'); console.log(result); }
});

And on the service side:

<!--//
    Needed to allow cross domain request.
    configuration/system.webServer/httpProtocol
//-->
<httpProtocol>
    <customHeaders>
        <add name="Access-Control-Allow-Origin" value="https://www.domain.com" />
        <add name="Access-Control-Allow-Credentials" value="true" />
        <add name="Access-Control-Allow-Headers" value="accept,content-type,cookie" />
        <add name="Access-Control-Allow-Methods" value="POST,GET,OPTIONS" />
    </customHeaders>
</httpProtocol>

<!--//
    Allow CORS pre-flight
    configuration/system.webServer/security
//-->
<security>
    <requestFiltering allowDoubleEscaping="true">
        <verbs>
            <add verb="OPTIONS" allowed="true" />
        </verbs>
    </requestFiltering>
</security>

<!--//
    Handle CORS pre-flight request
    configuration/system.webServer/modules
//-->
<add name="CorsOptionsModule" type="WifApiSample1.CorsOptionsModule" />

The options handler module is a simple class that responds to OPTIONS requests and also dynamically adds a header to the response:

    /// <summary>
    ///     <c>HttpModule</c> to support CORS.
    /// </summary>
    public class CorsOptionsModule : IHttpModule
    {
        #region IHttpModule Members
        public void Dispose()
        {
            //clean-up code here.
        }

        public void Init(HttpApplication context)
        {
            context.BeginRequest += HandleRequest;
            context.EndRequest += HandleEndRequest;
        }

        private void HandleEndRequest(object sender, EventArgs e)
        {
            string origin = HttpContext.Current.Request.Headers["Origin"];

            if (string.IsNullOrEmpty(origin))
            {
                return;
            }

            if (HttpContext.Current.Request.HttpMethod == "POST" && HttpContext.Current.Request.Url.OriginalString.IndexOf(".svc") < 0)
            {
                HttpContext.Current.Response.AddHeader("Access-Control-Allow-Origin", origin);
            }
        }

        private void HandleRequest(object sender, EventArgs e)
        {
            if (HttpContext.Current.Request.HttpMethod == "OPTIONS")
            {
                HttpContext.Current.Response.End();
            }
        }

        #endregion
    }

The end result is that single sign-on is established across two domains for browser-to-REST-API calls using simple HTML-based trickery (only tested in Firefox!).

Filed under: .Net, Identity
14 Mar 2015

Adding Support for Azure AD Login (O365) to MVC Apps

Posted by Charles Chen

I spent the day toying around with ASP.NET MVC 5 web applications and authentication.  I won't cover the step-by-step as there are plenty of blogs that have it covered.

Most of the examples and tutorials online show you either how to use your organizational Azure AD account or how to use social identity providers -- but not both.

I wanted to be able to log in using Facebook, Google, and/or the organizational account I use to connect to Office 365.

This requires that you select Individual User Accounts when prompted to change the authentication mode (whereas most tutorials have you select "Organization Accounts"):

mvc-use-individual-account

This will give you the baseline needed to add the social login providers (more on that later).

To enable Windows Azure AD, you will need to first login into Azure and add an application to your default AD domain.  In the management portal:

  1. Click on ACTIVE DIRECTORY in the left nav
  2. Click the directory
  3. Click the APPLICATIONS link at the top
  4. Now at the bottom, click ADD to add a new application
  5. Select Add an application my organization is developing
  6. Enter an arbitrary name and click next
  7. Now in the App properties screen, you will need to enter your login URL (e.g. https://localhost:4465/Account/Login) and for the APP ID URI, you cannot use "localhost".  You should use your Azure account info like: https://myazure.onmicrosoft.com/MyApp.  The "MyApp" part is arbitrary, but the myazure.onmicrosoft.com portion must match your directory identifier.

Most importantly, once you've created it, you need to click on the CONFIGURE link at the top and turn on the setting APPLICATION IS MULTI-TENANT:

mvc-multi-tenant

If you fail to turn this on, the logins are limited to the users that are in your Azure AD instance only; you will not be able to log on with accounts you use to connect to Office 365.  You'll get an error like this:

Error: AADSTS50020: User account ‘jdoe@myo365domain.com’ from external identity provider ‘https://sts.windows.net/1234567e-b123-4123-9112-912345678e51/’ is not supported for application ‘2123456f-b123-4123-9123-4123456789e5'. The account needs to be added as an external user in the tenant. Please sign out and sign in again with an Azure Active Directory user account.

An important note is that if you used "localhost" in step 7, the UI will not allow you to save the settings with an error "The App ID URI is not available. The App ID URI must be from a verified domain within your organization's directory."

Once you've enabled this, we're ready to make the code changes required.

First, you will need to install the OpenId package from nuget using the following command:

Install-Package Microsoft.Owin.Security.OpenIdConnect

Next, in the default Startup.Auth.cs file generated by the project template, you will need to add some additional code.

First, add this line:

app.SetDefaultSignInAsAuthenticationType(CookieAuthenticationDefaults.AuthenticationType);

Then, add this:

app.UseOpenIdConnectAuthentication(new OpenIdConnectAuthenticationOptions
{
    ClientId = "138C1130-4B29-4101-9C84-D8E0D34D222A",
    Authority = "https://login.windows.net/common",
    PostLogoutRedirectUri = "https://localhost:44301/",                
    Description = new AuthenticationDescription
    {
        AuthenticationType = "OpenIdConnect",
        Caption = "Azure OpenId Connect"
    },
    TokenValidationParameters = new TokenValidationParameters
    {
        // If you don't add this, you get IDX10205
        ValidateIssuer = false   
    }
});

There are two very important notes.  The first is that the Authority must have the /common path and not your Azure AD *.onmicrosoft.com path.

The second note is that you must add the TokenValidationParameters and set ValidateIssuer to false.

If you don't set this to false, you'll get the following 500 error after you successfully authenticate against Azure AD with your organizational O365 account:

IDX10205: Issuer validation failed. Issuer: ‘https://sts.windows.net/F92E09B4-DDD1-40A1-AE24-D51528361FEC/’. Did not match: validationParameters.ValidIssuer: ‘null’ or validationParameters.ValidIssuers: ‘https://sts.windows.net/{tenantid}/’

I think that this is a hack and to be honest, I'm not quite certain of the consequences of not validating the issuer, but it seems that there aren't many answers on the web for this scenario yet.  Looking at the source code where the exception originates, you'll see the method that generates it:

public static string ValidateIssuer(string issuer, SecurityToken securityToken, TokenValidationParameters validationParameters)
{
    if (validationParameters == null)
    {
        throw new ArgumentNullException("validationParameters");
    }
    
    if (!validationParameters.ValidateIssuer)
    {
        return issuer;
    }
    
    if (string.IsNullOrWhiteSpace(issuer))
    {
        throw new SecurityTokenInvalidIssuerException(string.Format(CultureInfo.InvariantCulture, ErrorMessages.IDX10211));
    }
    
    // Throw if all possible places to validate against are null or empty
    if (string.IsNullOrWhiteSpace(validationParameters.ValidIssuer) && (validationParameters.ValidIssuers == null))
    {
        throw new SecurityTokenInvalidIssuerException(string.Format(CultureInfo.InvariantCulture, ErrorMessages.IDX10204));
    }
    
    if (string.Equals(validationParameters.ValidIssuer, issuer, StringComparison.Ordinal))
    {
        return issuer;
    }
    
    if (null != validationParameters.ValidIssuers)
    {
        foreach (string str in validationParameters.ValidIssuers)
        {
            if (string.Equals(str, issuer, StringComparison.Ordinal))
            {
                return issuer;
            }
        }
    }
    
    throw new SecurityTokenInvalidIssuerException(
        string.Format(CultureInfo.InvariantCulture, ErrorMessages.IDX10205, issuer, validationParameters.ValidIssuer ?? "null", Utility.SerializeAsSingleCommaDelimitedString(validationParameters.ValidIssuers)));
}

We're simply short circuiting the process.  It's clear that there is no matching issuer, but it's not quite clear to me yet where/how to configure that.
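One likely avenue (untested on my part) is the ValidIssuers collection on TokenValidationParameters, which would let you keep validation on by enumerating the tenant issuer URIs you trust; the tenant ID below is just the one from the error message earlier, as a placeholder:

```csharp
TokenValidationParameters = new TokenValidationParameters
{
    ValidateIssuer = true,

    // Placeholder: substitute your own directory's tenant ID(s).
    ValidIssuers = new[]
    {
        "https://sts.windows.net/F92E09B4-DDD1-40A1-AE24-D51528361FEC/"
    }
}
```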

So what about the other social IdP's?  It's important to note that for Google, not only do you have to create a new client ID in the Google Developer Console, but you also need to enable the Google+ API:

mvc-google-api

You'll just get a bunch of useless error messages if you don't enable the API.

If you manage to get it all working, you should see the following options in the login screen:

mvc-azure

And when you click it, you should be able to log in using the same organizational credentials that you use to connect to Office 365:

mvc-login-azure

Filed under: .Net, MVC
12 Aug 2014

Invoking Custom WCF Services in SharePoint with Claims

Posted by Charles Chen

In SharePoint, if you host a custom WCF service in a claims-enabled application, the authentication via NTLM is actually quite tricky if you are attempting to invoke it from a console application, for example.

There are various articles and Stack Overflow entries on using System.ServiceModel.Description.ClientCredentials on either the ChannelFactory or the client instance, but none of these worked: on the server side, SPContext.Current.Web.CurrentUser was null and ServiceSecurityContext.Current.IsAnonymous returned true.

It seems like it should be possible to invoke the service authenticating through NTLM as if the user were accessing it through the web site.

In fact, it is possible, but it involves some manual HTTP requests to get this to work without doing some Windows Identity Foundation programming and consequently setting up tons of infrastructure to get what seems like a relatively simple and straightforward scenario to work.

The first step is to actually manually retrieve the FedAuth token:

/// <summary>
///     Gets a claims based authentication token by logging in through the NTLM endpoint.
/// </summary>
/// <returns>The FedAuth token required to connect and authenticate the session.</returns>
private string GetAuthToken()
{
    string authToken = string.Empty;

    CredentialCache credentialCache = new CredentialCache();
    credentialCache.Add(new Uri(_portalBaseUrl), "NTLM", new NetworkCredential(_username, _password, _domain));

    HttpWebRequest request = WebRequest.Create(string.Format("{0}/_windows/default.aspx?ReturnUrl=%2f_layouts%2fAuthenticate.aspx%3fSource%3d%252F&Source=%2F", _portalBaseUrl)) as HttpWebRequest;
    request.Credentials = credentialCache;
    request.AllowAutoRedirect = false;
    request.PreAuthenticate = true;

    // SharePoint doesn't like it if you don't include these (403 Forbidden)?
    request.UserAgent = "Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko";
    request.Accept = "text/html, application/xhtml+xml, */*";

    HttpWebResponse response = request.GetResponse() as HttpWebResponse;

    authToken = response.Headers["Set-Cookie"];

    return authToken;
}

There are three keys here:

  1. The first is that AllowAutoRedirect must be false or you will get an error that you are getting too many redirects.  What seems to happen is that the cookies are not set correctly when using auto redirect so the chain will continue until an exception is thrown.  In Fiddler, you will see this as a long cycle of requests and redirects.
  2. The second is that the URL must be the NTLM authentication endpoint (/_windows/...); any other URL will return a 302, and for that, you would need to set AllowAutoRedirect to true.
  3. The third is that SharePoint really doesn't seem to like it when the user agent and accept headers are not included.  I tried it later without these and it seemed to work, but I could not get it to work without them (403 Forbidden) initially.

Once you have the FedAuth token, you are able to basically impersonate the user.  To do so, you will need to include a cookie in your HTTP header request:

// Get the FedAuth cookie
var authToken = GetAuthToken();

// Create the connection artifacts.            
EndpointAddress endpointAddress = new EndpointAddress(endpointUrl);
BasicHttpBinding binding = new BasicHttpBinding();            

ChannelFactory<ISomeService> channelFactory = 
    new ChannelFactory<ISomeService>(binding, endpointAddress);

// Initiate the client proxy using the connection and binding information.
ISomeService client = channelFactory.CreateChannel();

using (new OperationContextScope((IContextChannel) client))
{
    // Set the authentication cookie on the outgoing WCF request.
    WebOperationContext.Current.OutgoingRequest.Headers.Add("Cookie", authToken);

    // YOUR API CALLS HERE    
}

The key is to add the header on the outgoing request before making your service API calls.

With this, you should see that you are able to invoke SharePoint hosted custom WCF service calls in claims-based web applications with NTLM authentication.

Filed under: .Net, SharePoint, WCF
26 Nov 2013

Preventing the Garbage Collector From Ruining Your Day

Posted by Charles Chen

If you're working with ZeroMQ, you may run into an exception with the message "Context was terminated".

It turns out that this is due to the garbage collector cleaning up (or attempting to clean up?) the ZmqContext.

I found this out via a handy thread on Stack Overflow, but what about cases where you can't use a using statement?

For example, in a Windows Service, I create the context on the OnStart method and destroy the context on the OnStop method.

In this case, an alternative is to use the GC.KeepAlive(Object obj) method.  The name is a bit counterintuitive: the call does nothing at runtime, but its presence references the object, which guarantees that the garbage collector will not collect it until after the call executes; the object becomes eligible for collection at any point after this call.
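A rough sketch of the Windows Service scenario, assuming the clrzmq-style ZmqContext API (the service and field names are illustrative):

```csharp
using System;
using System.ServiceProcess;
using ZeroMQ;

// Illustrative only: the context lives for the lifetime of the
// service, and GC.KeepAlive ensures it survives until OnStop.
public class MessagingService : ServiceBase
{
    private ZmqContext _context;

    protected override void OnStart(string[] args)
    {
        _context = ZmqContext.Create();
        // ... spin up sockets/threads that use _context ...
    }

    protected override void OnStop()
    {
        // Guarantees _context has not been collected at any point
        // before this line; it is collectible afterwards.
        GC.KeepAlive(_context);
        _context.Dispose();
    }
}
```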

Filed under: .Net, Self Note, ZeroMQ
12 Nov 2013

An Architecture for High-Throughput Concurrent Web Request Processing

Posted by Charles Chen

I've been working with ZeroMQ lately and I think I've fallen in love.

It's rare that a technology or framework just jumps out at you, but here is one that will get your head spinning on the different ways that it can make your architecture more scalable, more powerful, and all the while offering a frictionless way of achieving this.

I've been building distributed, multi-threaded applications since college, and ZeroMQ has changed everything for me.

It initially started with a need to build a distributed event processing engine.  I had wanted to try implementing it in WCF using peer-to-peer and/or MSMQ endpoints, but the thought of the complexity of managing that stack -- along with the configuration and setup -- made it seem worthwhile to at least look into a few other alternatives.

RabbitMQ and ZeroMQ were the clear front-runners for me.  I really liked the richness of documentation and examples with RabbitMQ, and if you look at some statistics, it has a much greater rate of mentions on Stack Overflow, so we can assume that it has a higher rate of adoption.  But at the core of it, I think that there really is no comparison between these two except for the fact that they both have "MQ" in their names.

It's true that one could build RabbitMQ-like functionality on top of ZeroMQ, but to a degree, I think that would be defeating the purpose.  The beauty of ZeroMQ is that it's so lightweight and so fast that it's really hard to believe; there's just one reference to add to your project.  No central server to configure.  No single point of failure.  No configuration files.  No need to think about failovers and clustering.  Nothing.  Just plug and go.  But there is a cost to this: a huge tradeoff in some of the higher level features that -- if you want them -- you have to build yourself.

If you understand your use cases and you understand the limitations of ZeroMQ and where it's best used, you can find some amazing ways to leverage it to make your applications more scalable.

One such use case I've been thinking about is using it to build a highly scalable web-request processing engine which would allow scaling by adding lots of cheap, heterogeneous nodes.  You see, with ASP.NET, unless you explicitly build a concurrency-oriented application, your web server processing is single-threaded per request and you can only ever generate output HTML at the sum of the costs of generating each sub part of your view.  To get around this, we could consider a processing engine that would be able to parse controls and send the processing off -- in parallel -- to multiple processors and then reassemble the output HTML before feeding it back to the client.  In this scenario, the cost of rendering the page is the overhead of the request plus the cost of the most expensive part of the view generation.

The following diagram conceptualizes this in ZeroMQ:

zmq-processing

Still a work in progress...

Even if an ASP.NET application is architected and programmed for concurrency from the get-go, you are limited by the constraints of the hardware (# of concurrent threads).  Of course, you can add more servers and put a load balancer in front of them, but this can be an expensive proposition.  Perhaps a better architecture would be to design a system that allows adding cheap, heterogeneous server instances that do nothing but process parts of a view.

In such an architecture, it would be possible to scale the system by simply adding more nodes -- at any level.  They could be entirely heterogeneous; there's no need for IIS -- in fact, the servers don't even have to be Windows servers.  The tradeoff is that you have to manage the session information yourself and push the relevant information down through the pipeline, or at least make it accessible via a high speed interface (maybe something like Redis or Memcached?).

But the net gain is that it would allow for concurrent processing of a single web request and build an infrastructure for handling web requests that is easily scaled with cheap, simple nodes.
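To make the fan-out/fan-in idea concrete, here is a minimal sketch using the NetMQ binding's PUSH/PULL sockets; the port numbers and message format are invented for illustration:

```csharp
using NetMQ;
using NetMQ.Sockets;

public static class ViewPartPipeline
{
    // Fan out: the web front-end pushes one message per view part;
    // each message is picked up by whichever worker node is free.
    public static void Distribute()
    {
        using (var distributor = new PushSocket("@tcp://*:5557"))
        {
            distributor.SendFrame("render:header");
            distributor.SendFrame("render:body");
            distributor.SendFrame("render:footer");
        }
    }

    // A worker node: pulls a part, renders it, and pushes the HTML
    // fragment to the sink that reassembles the page.
    public static void Work()
    {
        using (var work = new PullSocket(">tcp://localhost:5557"))
        using (var results = new PushSocket(">tcp://localhost:5558"))
        {
            string part = work.ReceiveFrameString();
            results.SendFrame("<div><!-- rendered: " + part + " --></div>");
        }
    }
}
```

The sink side would pull from port 5558 and stitch the fragments back together in view order before writing the response.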

Filed under: .Net, Awesome, ZeroMQ
16 Aug 2012

SharePoint ListData.svc 500 Error

Posted by Charles Chen

If you're fighting with the SharePoint ListData.svc with an odd error:

An error occurred while processing this request.

And you are using an OData operator like endswith, you may encounter this error and be puzzled with why it works for some fields but not others.

I tried various theories -- an indexed column?  using the column in a view?  maybe an error with the column itself? -- with no love, until Rob suggested that it might have to do with empty values.

Turns out that the underlying implementation of ListData.svc doesn't quite like it if you have un-set or "null" values in your text fields.  So a query like this:

http://collab.dev.com/_vti_bin/ListData.svc/Test?$filter=endswith(PrimaryFaxNumber, '6481099') eq true

Will fail if there is an item in the list with an empty value for PrimaryFaxNumber.

However, using a nullity check will fix the issue:

http://collab.dev.com/_vti_bin/ListData.svc/Test?$filter=PrimaryFaxNumber ne null and endswith(PrimaryFaxNumber, '6481099') eq true
Filed under: .Net, SharePoint
15 Sep 2011

Now I REALLY Can’t be Bothered to Learn Silverlight

Posted by Charles Chen

I've blogged about it before, but seriously, the question has to be asked: if you're a developer with limited bandwidth to focus on mastering new technologies, why would you spend that time on Silverlight?

Not only is WP7 floundering, but now the news is out: the Metro version of IE 10 in Windows 8 won't support any plugins - including Silverlight:

Windows 8 will have two versions of Internet Explorer 10: a conventional browser that lives on the legacy desktop, and a new Metro-style, touch-friendly browser that lives in the Metro world. The second of these, the Metro browser, will not support any plugins. Whether Flash, Silverlight, or some custom business app, sites that need plugins will only be accessible in the non-touch, desktop-based browser.

Should one ever come across a page that needs a plugin, the Metro browser has a button to go to that page within the desktop browser. This yanks you out of the Metro experience and places you on the traditional desktop.

The rationale is a familiar one: plugin-based content shortens battery life, and comes with security, reliability, and privacy problems. Sites that currently depend on the capabilities provided by Flash or Silverlight should switch to HTML5.

If you're not on the HTML5 boat yet, I think the writing is on the wall: the Silverlight party is over (thank goodness).

Filed under: .Net, Awesome
13 Sep 2011

Lesson Learned on SharePoint Service Applications

Posted by Charles Chen

If you're setting out on writing your own SharePoint service applications, there is an important lesson that you should keep in mind (instead of learning it the hard way): ensure that all of your proxy, application proxy, service, service application, and service instance classes have public parameterless (default) constructors.

Otherwise, you'll have a heck of a time starting, instantiating, and uninstalling services with lots of MissingMethodExceptions and "{class} cannot be deserialized because it does not have a public default constructor" error messages.
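For example, a service application class would need something like the following (the class names here are hypothetical):

```csharp
using Microsoft.SharePoint.Administration;

public class MyServiceApplication : SPIisWebServiceApplication
{
    // Required: without a public parameterless constructor, SharePoint
    // fails to deserialize the object (MissingMethodException).
    public MyServiceApplication()
    {
    }

    // The "real" constructor used when provisioning the application.
    public MyServiceApplication(
        string name, SPIisWebService service, SPIisWebServiceApplicationPool pool)
        : base(name, service, pool)
    {
    }
}
```

The same applies to the proxy, application proxy, service, and service instance classes.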

Oddly enough, one thing I've learned from this is that the STSADM commands are often more "powerful" than the equivalent PowerShell commands.  For example, Remove-SPSolution, even with the -Force parameter, still failed with the aforementioned exceptions.  On the other hand, stsadm -o deletesolution {name} -override seemed to work fine.  Puzzling, for the moment, but it got the job done.  Similarly, stopping a service application that's AWOL (stuck on the processing screen) can be accomplished with stsadm -o provisionservice.  Deleting it can be done using stsadm -o deleteconfigurationobject (though this one does seem to have side effects...).

It seems that PowerShell is still a second-class citizen when it comes to basic SharePoint command-line management.

But in any case, if you set out building your own service applications (<rant>and damn it Microsoft, can't you put some better examples out there?!  Even the few that are out there are convoluted, missing key details, hard to follow...</rant>), be sure to include public, default, parameterless constructors.

Filed under: .Net, Rants, SharePoint
29 Jul 2011

Working with GUIDs in MongoDB and ASP.NET MVC3

Posted by Charles Chen

Just a small tip for those looking to use GUIDs as document IDs in MongoDB in conjunction with ASP.NET MVC3: it's a lot more straightforward than it may seem at the outset.

These examples are based off of the ASP.NET MVC3 tutorials...except with MongoDB instead of EF+SQL Server.

I've set up my model class like so:

public class Movie
{
    [BsonId]
    public Guid ID { get; set; }
    public string Title { get; set; }
    public DateTime ReleaseDate { get; set; }
    public string Genre { get; set; }
    public decimal Price { get; set; }
}

When the application creates an object and persists it to the database, you'll see that it shows up like this in the Mongo console (I've formatted the JSON for clarity):

> db.movies.find()
{
   "_id":BinData(3,"n2FLBkAkhEOCkX42BGXRqg=="),
   "Title":"Test",
   "ReleaseDate":ISODate("2011-05-11T04:00:00Z"),
   "Genre":"Comedy",
   "Price":"9.99"
}

If you try to serialize this to JSON, instead of getting a GUID string, you'll get:

// Get a document
BsonDocument document = movies.FindOneAs<BsonDocument>();

// Direct to JSON
document.ToJson();
/*
{
   "_id":new BinData(3, "n2FLBkAkhEOCkX42BGXRqg=="),
   "Title":"Test",
   "ReleaseDate":ISODate("2011-05-11T04:00:00Z"),
   "Genre":"Comedy",
   "Price":"9.99"
}
*/

// With settings
JsonWriterSettings settings = new JsonWriterSettings{OutputMode = JsonOutputMode.JavaScript };
document.ToJson(settings);
/*
{
   "_id":{
      "$binary":"n2FLBkAkhEOCkX42BGXRqg==",
      "$type":"03"
   },
   "Title":"Test",
   "ReleaseDate":Date(1305086400000),
   "Genre":"Comedy",
   "Price":"9.99"
}
*/

This is somewhat inconvenient if you want to work with it from a pure JavaScript perspective; I was hoping it would return the GUID as a string instead.  I was also concerned that this meant I'd have to manage the conversion manually on the server side in my actions, but it turns out that it works better than expected.  The only caveat is that you have to use "_id" when creating queries; beyond that, you can use the GUID as-is and the Mongo APIs will convert it behind the scenes:

public ActionResult Details(Guid id)
{
    MongoCollection<Movie> movies = _database.GetCollection<Movie>("movies");

    // No need to mess with the GUID; use it as is.
    Movie movie = movies.FindOneAs<Movie>(Query.EQ("_id", id)); 

    return View(movie);
}

You can see the result below in the browser:

Note the properly formatted GUID in the URL

So far, so good with my little Mongo+MVC3 experiment 😀

Filed under: .Net, Mongo, MVC
5 Jul 2011

The Beauty of Mongo

Posted by Charles Chen

MongoDB is sexy.  Sexy as hell.

This isn't the first or the best intro to Mongo+.NET, but I hope this one can show you how it finally solves the object-relational disconnect with an easy to follow example.

Let's start with the model:

#region prologue

// Charles Chen

#endregion

using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

namespace MongoAssessmentsTest
{
    public class Assessment
    {
        private DateTime _created;
        private string _description;
        [BsonId]
        private ObjectId _id;
        private List<ContentItem> _items;
        private DateTime _lastUpdated;
        private string _ownerId;
        private List<string> _tags;
        private string _title;

        public ObjectId Id
        {
            get { return _id; }
            set { _id = value; }
        }

        public string OwnerId
        {
            get { return _ownerId; }
            set { _ownerId = value; }
        }

        public string Title
        {
            get { return _title; }
            set { _title = value; }
        }

        public string Description
        {
            get { return _description; }
            set { _description = value; }
        }

        public List<string> Tags
        {
            get { return _tags; }
            set { _tags = value; }
        }

        public DateTime Created
        {
            get { return _created; }
            set { _created = value; }
        }

        public DateTime LastUpdated
        {
            get { return _lastUpdated; }
            set { _lastUpdated = value; }
        }

        public List<ContentItem> Items
        {
            get { return _items; }
            set { _items = value; }
        }
    }

    public class ContentItem
    {
        private List<Question> _questions;
        private string _text;

        public string Text
        {
            get { return _text; }
            set { _text = value; }
        }

        public List<Question> Questions
        {
            get { return _questions; }
            set { _questions = value; }
        }
    }

    [BsonKnownTypes(typeof(CheckboxQuestion), typeof(RadioQuestion), typeof(SelectQuestion), typeof(FreeTextQuestion))]
    public class Question
    {
        private int _order;
        private int _points;
        private string _text;

        public string Text
        {
            get { return _text; }
            set { _text = value; }
        }

        public int Order
        {
            get { return _order; }
            set { _order = value; }
        }

        public int Points
        {
            get { return _points; }
            set { _points = value; }
        }
    }

    public class CheckboxQuestion : Question
    {
        private List<string> _answers;
        private List<string> _choices;

        public List<string> Choices
        {
            get { return _choices; }
            set { _choices = value; }
        }

        public List<string> Answers
        {
            get { return _answers; }
            set { _answers = value; }
        }
    }

    public class SelectQuestion : Question
    {
        private string _answer;
        private List<string> _choices;

        public List<string> Choices
        {
            get { return _choices; }
            set { _choices = value; }
        }

        public string Answer
        {
            get { return _answer; }
            set { _answer = value; }
        }
    }

    public class RadioQuestion : SelectQuestion { }

    public class FreeTextQuestion : Question
    {
        private List<string> _acceptableAnswers;
        private bool _isAnswerCaseSensitive;

        public bool IsAnswerCaseSensitive
        {
            get { return _isAnswerCaseSensitive; }
            set { _isAnswerCaseSensitive = value; }
        }

        public List<string> AcceptableAnswers
        {
            get { return _acceptableAnswers; }
            set { _acceptableAnswers = value; }
        }
    }
}

The beauty of it is the absolute simplicity of working with this model.  Aside from a few attributes, there's not much thought that needs to be given to collections and inheritance hierarchies (though there are additional attributes and classes that can be used to control how these are stored if so desired).  I love it because this approach keeps your domain models clean.
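To get a feel for what actually lands in the collection, here's an illustrative, abbreviated sketch (hand-written, not captured from a live mongod) of how a SelectQuestion inside an Assessment serializes.  Note the _t discriminator that [BsonKnownTypes] enables so the driver can rehydrate the correct subclass on the way back out:

```js
{
  "_id" : ObjectId("..."),
  "Title" : "Sample 1",
  "Description" : "A sample assessment",
  "Created" : ISODate("2011-07-05T16:00:00Z"),
  "Items" : [
    {
      "Text" : null,
      "Questions" : [
        {
          "_t" : "SelectQuestion",
          "Text" : "Who was the first president of the United States?",
          "Order" : 0,
          "Points" : 0,
          "Choices" : [ "George Washington", "Abraham Lincoln", "John Adams" ],
          "Answer" : "George Washington"
        }
      ]
    }
  ]
}
```

The nested lists and the inheritance hierarchy just become nested documents and arrays; there's no join table or discriminator column to design.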

How do we interact with this model?

#region prologue

// Charles Chen

#endregion

using System;
using System.Collections.Generic;
using MongoDB.Bson;
using MongoDB.Driver;
using MongoDB.Driver.Builders;

namespace MongoAssessmentsTest
{
    internal class Program
    {
        private static void Main(string[] args)
        {
            Program program = new Program();
            program.Run();
        }

        private void Run()
        {
            MongoServer server = MongoServer.Create(); // Uses default connection options to localhost

            MongoDatabase database = server.GetDatabase("assessments_db");

            if (!database.CollectionExists("assessments"))
            {
                CommandResult result = database.CreateCollection("assessments");

                if (!result.Ok)
                {
                    Console.Out.WriteLine(result.ErrorMessage);
                    return;
                }
            }

            MongoCollection<Assessment> assessments = database.GetCollection<Assessment>("assessments");

            if (assessments.FindOne(Query.EQ("Title", "Sample 1")) == null)
            {
                // Insert an assessment.
                Assessment a = new Assessment
                {
                    Created = DateTime.Now,
                    Description = "A sample assessment",
                    Title = "Sample 1",
                    Items = new List<ContentItem>
                    {
                        new ContentItem
                        {
                            Questions = new List<Question>
                            {
                                new SelectQuestion
                                {
                                    Text = "Who was the first president of the United States?",
                                    Answer = "George Washington",
                                    Choices = new List<string>
                                    {
                                        "George Washington",
                                        "Abraham Lincoln",
                                        "John Adams"
                                    }
                                },
                                new CheckboxQuestion
                                {
                                    Text = "Which of these is NOT a former US President?",
                                    Answers = new List<string>
                                    {
                                        "Benjamin Franklin",
                                        "Hillary Clinton"
                                    },
                                    Choices = new List<string>
                                    {
                                        "Benjamin Franklin",
                                        "Bill Clinton",
                                        "Hillary Clinton",
                                        "Andrew Jackson",
                                        "James Garfield"
                                    }
                                }
                            }
                        }
                    }
                };

                assessments.Insert(a);
            }

            // Get it as BSON - great for writing straight to a web page
            BsonDocument a1 = assessments.FindOneAs<BsonDocument>(Query.EQ("Title", "Sample 1"));

            Console.Out.WriteLine(a1);

            // Get it as an object - great if you want to work with it on the server.
            Assessment a2 = assessments.FindOneAs<Assessment>(Query.EQ("Title", "Sample 1"));

            Console.Out.WriteLine(a2.Title);
        }
    }
}

Brilliantly simple.  Just fire up mongod.exe and you're ready to rock and roll.  Download the example from here: MongoAssessmentsTest.zip
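The same Builders namespace covers updates and polymorphic queries, too.  Here's a short sketch (against the same 1.x driver API and the collection/field names used above) that touches LastUpdated in place and then finds an assessment containing a checkbox question by querying the _t discriminator like any other field:

```csharp
using System;
using MongoDB.Driver;
using MongoDB.Driver.Builders;

namespace MongoAssessmentsTest
{
    internal class UpdateExample
    {
        private static void Main(string[] args)
        {
            // Reuses the connection pattern from the listing above.
            MongoCollection<Assessment> assessments = MongoServer.Create()
                .GetDatabase("assessments_db")
                .GetCollection<Assessment>("assessments");

            // Update a single field in place -- no load/modify/save round trip.
            assessments.Update(
                Query.EQ("Title", "Sample 1"),
                Update.Set("LastUpdated", DateTime.UtcNow));

            // The _t discriminator written by [BsonKnownTypes] is stored as a
            // plain field, so we can query assessments by question subtype.
            Assessment match = assessments.FindOneAs<Assessment>(
                Query.EQ("Items.Questions._t", "CheckboxQuestion"));

            Console.Out.WriteLine(match.Title);
        }
    }
}
```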

Filed under: .Net, Dev, Mongo No Comments