<CharlieDigital/> Programming, Politics, and uhh…pineapples

1 Jun 2016

Adventures in Single-Sign-On: SharePoint Login Via Office 365

Posted by Charles Chen

If you are still working with on-premises SharePoint 2010/2013, or your application only supports SAML 1.1, but you'd like to leverage your new and shiny Office 365 accounts for single-sign-on, you can achieve this relatively painlessly by using Windows Azure ACS as a federation provider (FP).  It's possible to use Office 365 (Azure AD) directly as your identity provider (IdP), but for SharePoint 2010 this involves writing custom code, since SharePoint 2010 can only consume SAML 1.1 tokens via configuration; SAML 2.0 requires custom code.

The overall flow for the scenario is diagrammed below:

Using Windows Azure ACS as a Federation Provider for Azure AD


In this scenario, the trust relationship from SharePoint is only to the FP (Azure ACS) which acts as a proxy for the trust to the IdP (Azure AD).  Azure AD contains the actual credentials, but we proxy those credentials through ACS to take advantage of the SAML 2.0 -> SAML 1.1 translation without writing code (otherwise, it would be possible to directly establish trust to Azure AD or through AD FS).  When the user accesses a protected resource in SharePoint, the user is redirected first to the FP which then redirects to the IdP and proxies that response via a series of redirects to the relying party (RP), SharePoint.
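Spelled out, the redirect dance looks roughly like this (endpoints abbreviated; the exact URLs depend on your namespace and tenant):

```
1. Browser requests a protected SharePoint resource; SharePoint 302s to the FP sign-in URL
2. ACS (FP) 302s to the configured WS-Federation IdP (Azure AD)
3. Azure AD authenticates the user and POSTs a SAML 2.0 token back to ACS
4. ACS applies its rule group, re-issues the token as SAML 1.1, and POSTs it to /_trust/
5. SharePoint validates the token, sets the FedAuth cookie, and redirects to the original resource
```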

The first step is to create an Azure ACS namespace.  You'll need your ACS namespace URL, which should look like: https://{NAMESPACE}.accesscontrol.windows.net

Next, in Azure AD, create a new application which will allow your ACS to use Azure AD as an IdP.  On the APPLICATIONS tab, click ADD at the bottom to add a new application.  Enter a descriptive name for the application (note: you may want a more "friendly" name -- see last picture):

add-acs-application

Then enter the following for the SIGN-ON URL and APP ID URI:

ad-properties

Before leaving the Azure Portal, click on the VIEW ENDPOINTS button at the bottom of the dashboard and copy the URL for the FEDERATION METADATA DOCUMENT:

azure-ad-endpoints

Now hop back over to Azure ACS management and add a new identity provider.  Select WS-Federation identity provider and on the next screen enter a descriptive name and paste the URL into the field:

acs-setup-1

Once you've set up the IdP, the next step is to set up the relying party (RP).  The key is to get the following settings correct:

acs-setup-2

In this example, I'm just using my local development environment, but you must specify the _trust URL and explicitly select SAML 1.1 from the Token format dropdown.  Additionally, uncheck Windows Live ID under Identity providers so that only the one configured previously remains checked.

Finally, on this screen, you will need to specify a certificate to use for signing.  Under Token signing, select Use a dedicated certificate and then either use an existing valid X.509 certificate or create one for testing purposes.

Create the certificate using the following command:

MakeCert.exe -r -pe -n "CN={ACS_NAMESPACE}.accesscontrol.windows.net" -sky exchange -ss my -len 2048 -e 06/01/2017
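If MakeCert isn't available, a self-signed test certificate can likely be produced with PowerShell instead -- a sketch, assuming the New-SelfSignedCertificate cmdlet (Windows 8.1/Server 2012 R2 and later) and the same naming convention:

```powershell
# Sketch: create a self-signed certificate in the current user's personal store.
# Replace {ACS_NAMESPACE} with your actual ACS namespace.
New-SelfSignedCertificate -DnsName "{ACS_NAMESPACE}.accesscontrol.windows.net" `
    -CertStoreLocation "Cert:\CurrentUser\My"
```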

You will need to export the certificate from the certificate store with the private key.  So WIN+R and type in mmc.exe.  From the MMC, click File and then Add or Remove Snap-ins.  Select the Certificates snap-in and click OK.  Locate the certificate and export it with the private key.  While you're here, export it again without the private key; you will need this certificate when setting up the authentication provider in SharePoint.

Back in the ACS management app, upload the first certificate that was exported and enter the password.  An important note is that ACS will always append a "/" after your realm; we will need to make sure that when we register the authentication provider, we include this in the login URL.

Before leaving the ACS management app for good, we need to update the rule group to pass through the claims.  On the left hand side, click on Rule groups and select the default rule group created for our RP.  Now click Generate to create the default rule set.

One thing I discovered through trial and error (mostly error) is that Azure AD does not seem to provide a value for the emailaddress claim, which we will be using later (you don't technically have to use this as the identifying claim, but I did in SharePoint before discovering that its absence causes an error).  So we'll remap the "name" claim to "emailaddress".  Click on the name claim (http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name) and select the emailaddress claim type (http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress) as the output claim type.

Back in the SharePoint environment, we'll use the certificate exported without the private key to create a new trusted root authority for SharePoint and then create and register the identity provider.  Fire up a management shell and enter the following commands:

# Load the certificate that was exported without the private key
$certificate = New-Object System.Security.Cryptography.X509Certificates.X509Certificate2("c:\temp\cert-no-private-key.cer")

# Register the ACS token signing certificate as a trusted root authority
New-SPTrustedRootAuthority -Name "ACS Token Signing Certificate" -Certificate $certificate

# Map the incoming emailaddress claim; it will also serve as the identifier claim
$cm0 = New-SPClaimTypeMapping "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" -IncomingClaimTypeDisplayName "EmailAddress" -SameAsIncoming

# Note the trailing "/" on the realm and on the wtrealm parameter -- ACS appends it
$loginUrl = "https://{ACSNAMESPACE}.accesscontrol.windows.net/v2/wsfederation?wa=wsignin1.0&wtrealm=https://sso.dev.local/&redirect=false"
$realm = "https://sso.dev.local/"

# Register the trusted identity token issuer
$issuer = New-SPTrustedIdentityTokenIssuer -Name "ACS" -Description "ACS" -Realm $realm -ImportTrustCertificate $certificate -ClaimsMappings $cm0 -SignInUrl $loginUrl -IdentifierClaim $cm0.InputClaimType
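Once registered, the trust can be sanity-checked from the same shell (just a quick look, not a required step):

```powershell
# Verify the root authority and the token issuer registration
Get-SPTrustedRootAuthority "ACS Token Signing Certificate"
Get-SPTrustedIdentityTokenIssuer "ACS"
```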

And finally, from SharePoint Central Admin, we can now add the "ACS" authentication provider:

acs-identity-provider

The name that we gave earlier to the application in Azure AD can now also be configured to display in the app drawer of Office 365:

o365-apps

Filed under: Azure, SharePoint, SSO
17 Jul 2015

Adventures in Single-Sign-On: Cross Domain Script Request

Posted by Charles Chen

Consider a scenario where a user authenticates with ADFS (or an equivalent identity provider (IdP)) when accessing a domain such as https://www.domain.com (A) and then, from this page, a request is made to https://api.other-domain.com/app.js (B) to download a set of application scripts that will then interact with a set of REST-based web services in the B domain.  We'd like to have SSO so that claims provided to A are also available to B and so that the downloaded application scripts can subsequently make requests carrying an authentication cookie.

Roughly speaking, the scenario looks like this:

Depiction of the scenario


It was straightforward enough to set up the authentication with ADFS using WIF 4.5 for each of A and B following the MSDN "How To"; I had each of the applications separately working against the same ADFS instance.  But the cross domain script request from A to B at step 5 generated an HTTP redirect sequence (302) that terminated in an XHTML form from ADFS containing JavaScript that attempts an HTTP POST for the last leg of the authentication.  This was good news because it meant that ADFS recognized the user session and tried to issue another token for the user in the other domain without requiring a login.

However, this obviously posed a problem as, even though it appeared as if it were working, the request for the script could not succeed because of the text/html response from ADFS.

Here's what https://www.domain.com/default.aspx looks like in this case:

<html>
  <body>
    ...
    <script type="text/javascript" src="https://api.other-domain.com/app.js"></script>
  </body>
</html>

This obviously fails because the HTML content returned from the redirect to ADFS cannot be consumed.
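For reference, the text/html payload that comes back in place of the script is the standard WS-Federation auto-POST form, which looks roughly like this (values elided):

```html
<html>
  <body onload="document.forms[0].submit()">
    <!-- ADFS posts the issued token to the relying party via the browser -->
    <form method="POST" action="https://api.other-domain.com/">
      <input type="hidden" name="wa" value="wsignin1.0" />
      <input type="hidden" name="wresult" value="&lt;t:RequestSecurityTokenResponse ...&gt;" />
    </form>
  </body>
</html>
```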

I scratched my head for a bit, dug into the documentation for ADFS, trawled online discussion boards, and tinkered with various configurations trying to figure this out with no luck.  Many examples online discuss this scenario in the context of making a web service call from the backend of one application to another using bearer tokens or WIF ActAs delegation, but these were ultimately not suited for what I wanted to accomplish as I didn't want to write any tokens into the page (for example, adding a URL parameter to the app.js request), make a backend request for the resource, or use a proxy.

(I suspect that using the HTTP GET binding for SAML would work, but for the life of me, I can't figure out how to set this up on ADFS...)

In a flash of insight, it occurred to me that if I used a hidden iframe to load another page in B, I would then have a session cookie with which to make the request for app.js!

Here's what the page looks like in A:

<script type="text/javascript">
    function loadOtherStuff()
    {
        var script = document.createElement('script');
        script.setAttribute('type', 'text/javascript');
        script.setAttribute('src', 'https://api.other-domain.com/appscript.js');
        document.body.appendChild(script);
    }
</script>
<iframe src="https://api.other-domain.com" style="display: none" 
    onload="javascript:loadOtherStuff()"></iframe>  

Using the iframe, the HTTP 302 redirect is allowed to complete and ADFS is able to set the authentication cookie without requiring a separate sign on since it's using the same IdP, certificate, and issuer thumbprint.  Once the cookie is set for the domain, then subsequent browser requests in the parent document to the B domain will carry along the cookie!

The request for appscript.js is intercepted by an IHttpHandler and authentication can be performed to check for the user claims before returning any content. This then allows us to stream back the client-side application scripts and templates via AMD through a single entry point (e.g. appscript.js?app=App1 or a redirect to establish a root path depending on how you choose to organize your files).
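A minimal sketch of what that handler could look like is below (the class name, script path, and claim check are hypothetical; the real handler would resolve scripts however you organize your files):

```csharp
/// <summary>
///     Serves application scripts only to authenticated, claims-bearing users.
/// </summary>
public class AppScriptHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // WIF rehydrates the principal from the FedAuth session cookie.
        var principal = context.User as System.Security.Claims.ClaimsPrincipal;

        if (principal == null || !principal.Identity.IsAuthenticated)
        {
            context.Response.StatusCode = 401; // No valid session; deny the script.
            return;
        }

        // Stream the requested script back, e.g. keyed off ?app=App1.
        context.Response.ContentType = "text/javascript";
        context.Response.WriteFile(context.Server.MapPath("~/Scripts/app.js"));
    }
}
```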

Any XHR requests made subsequently still require proper configuration of CORS on the calling side:

$.ajax({
    url: 'https://api.other-domain.com/api/Echo', 
    type: 'GET',
    crossDomain: true,
    xhrFields: {
        withCredentials: true
    },
    success: function(result){ window.alert('HERE'); console.log('RETRIEVED'); console.log(result); }
});

And on the service side:

<!--//
    Needed to allow cross domain request.
    configuration/system.webServer/httpProtocol
//-->
<httpProtocol>
    <customHeaders>
        <add name="Access-Control-Allow-Origin" value="https://www.domain.com" />
        <add name="Access-Control-Allow-Credentials" value="true" />
        <add name="Access-Control-Allow-Headers" value="accept,content-type,cookie" />
        <add name="Access-Control-Allow-Methods" value="POST,GET,OPTIONS" />
    </customHeaders>
</httpProtocol>

<!--//
    Allow CORS pre-flight
    configuration/system.webServer/security
//-->
<security>
    <requestFiltering allowDoubleEscaping="true">
        <verbs>
            <add verb="OPTIONS" allowed="true" />
        </verbs>
    </requestFiltering>
</security>

<!--//
    Handle CORS pre-flight request
    configuration/system.webServer/modules
//-->
<add name="CorsOptionsModule" type="WifApiSample1.CorsOptionsModule" />

The options handler module is a simple class that responds to OPTIONS requests and also dynamically adds a header to the response:

    /// <summary>
    ///     <c>HttpModule</c> to support CORS.
    /// </summary>
    public class CorsOptionsModule : IHttpModule
    {
        #region IHttpModule Members
        public void Dispose()
        {
            //clean-up code here.
        }

        public void Init(HttpApplication context)
        {
            context.BeginRequest += HandleRequest;
            context.EndRequest += HandleEndRequest;
        }

        private void HandleEndRequest(object sender, EventArgs e)
        {
            string origin = HttpContext.Current.Request.Headers["Origin"];

            if (string.IsNullOrEmpty(origin))
            {
                return;
            }

            if (HttpContext.Current.Request.HttpMethod == "POST" && HttpContext.Current.Request.Url.OriginalString.IndexOf(".svc") < 0)
            {
                HttpContext.Current.Response.AddHeader("Access-Control-Allow-Origin", origin);
            }
        }

        private void HandleRequest(object sender, EventArgs e)
        {
            if (HttpContext.Current.Request.HttpMethod == "OPTIONS")
            {
                HttpContext.Current.Response.End();
            }
        }

        #endregion
    }

The end result is that single-sign-on is established across two domains for browser-to-REST-API calls using simple HTML-based trickery (only tested in Firefox!).

Filed under: .Net, Identity, SSO