<CharlieDigital/> Programming, Politics, and uhh…pineapples

3 Sep 2014

A Common Sense Way to Improve Cloud Backup

Posted by Charles Chen

In the wake of the Apple iCloud debacle, there has been a lot of discussion on what Apple has done wrong, what it could do better, and how this could have been prevented.

This is not a blog post about 2-factor authentication or proper implementation of authentication channels or how Apple should be more open in its dealings with the security community, but about something more basic and common sense: give users more granular control over what gets backed up.

You will see in many discussions and comments to articles that there is quite a bit of "victim shaming".

Example pulled from Business Insider

But I think that this is quite unfair; I postulate that the average smartphone user has no idea that their photos are being synced to the cloud in the first place.  Even users with an abstract sense that syncing happens (you take a photo on your phone and can see it on your desktop later) likely have no concrete idea of the implications: those photos are now resident in the cloud, not transient.

It is easy to imagine that such things are obvious and should be trivially easy for end users to configure and control, but this is a poor assumption for anyone who is technically savvy to make; people like my mother and wife really have no idea about these things.  My guess is that Jennifer Lawrence and Kate Upton simply had no idea that their photos and videos were sitting resident in the cloud, and even if they did, they probably couldn't figure out how to get rid of them.

Some have said that this is not the fault of the OS makers or app makers.  Google Photos, for example, gives you a very clear screen when you launch it for the first time asking if you'd like to sync your files to the cloud.  But one problem is that users may not actually read these things before agreeing.  The other is that even after a user agrees, if she later changes her mind, the setting is turned off from a screen that is three levels deep (launch Photos, click Menu, click Settings, click Auto Backup).  While this is very obvious to some, to many -- like my mother -- it is an absolute mystery.  She has no idea that it's syncing her photos and no idea how to turn it off.

I think that there are many common sense solutions that can be implemented alongside the security measures discussed above to give users more granular control over their content.

Give Periodic Notifications to Update Privacy Settings

One simple idea: say, every three months, the phone prompts you with a notification in your notification bar:

Simple mockup of notification

This would periodically remind users that things like automatic sync are on and that they have the option of turning them off.  The user is free to ignore it, but it would at least be a reminder: "Hey, I'm sending your stuff to the cloud -- are you OK with that?  Do you want to review your settings?"

Make Synchronization Explicit

One of the problems I have with Google Photos is that it's all or nothing: there isn't a middle ground that lets me sync only some of my photos by default.

The user experience paradigm here would be much like Facebook's, where you post photos by explicitly selecting them from your album, giving you fine-grained control over what gets sent to the cloud.  Likewise, iCloud and Google Photos would do well to offer a middle ground that gives users finer-grained control over what gets sent to the cloud instead of just ON and OFF.

In discussions, some have said that this would place too high a burden on end users, but it seems to work fine for Facebook, and I think that it could be implemented in an easy-to-use manner:

Simple mockup of a notification with controls inline to quickly sync all, explicitly choose photos to sync, or don't sync anything.

If the user selects "Sync All", then all 20 new photos are synced to the cloud (be that iCloud, Dropbox, Google Drive, etc).  If the user selects "Choose", the user is given a screen that allows the user to explicitly pick the ones to sync.  The pick screen should prompt the user to "Ignore unselected items for future backup?" when selection is complete so that any unselected photos are simply ignored next time.  If the user selects "Don't Sync", then do nothing.

A simple design like this still gives the user access to the convenience of cloud backups while giving them explicit, fine-grained control and acknowledgement that their data will be stored in the cloud.

Closing Thoughts

The victim shaming is simply not warranted; whether these individuals should or should not have taken these compromising photos and videos is not the right question to ask.  The right question to ask is whether Apple or Google should be automatically syncing them to resident cloud storage without finer-grained controls and explicit consent.

12 Aug 2014

Invoking Custom WCF Services in SharePoint with Claims

Posted by Charles Chen

In SharePoint, if you host a custom WCF service in a claims-enabled application, the authentication via NTLM is actually quite tricky if you are attempting to invoke it from a console application, for example.

There are various articles and Stack Overflow entries on using System.ServiceModel.Description.ClientCredentials on either the ChannelFactory or the client instance, but none of these worked: on the server side, SPContext.Current.Web.CurrentUser was null and ServiceSecurityContext.Current.IsAnonymous returned true.

It seems like it should be possible to invoke the service authenticating through NTLM as if the user were accessing it through the web site.

In fact, it is possible, but it involves some manual HTTP requests.  The alternative is Windows Identity Foundation programming and, consequently, setting up tons of infrastructure for what seems like a relatively simple and straightforward scenario.

The first step is to actually manually retrieve the FedAuth token:

/// <summary>
///     Gets a claims-based authentication token by logging in through the NTLM endpoint.
/// </summary>
/// <returns>The FedAuth token required to connect and authenticate the session.</returns>
private string GetAuthToken()
{
    CredentialCache credentialCache = new CredentialCache();
    credentialCache.Add(new Uri(_portalBaseUrl), "NTLM", new NetworkCredential(_username, _password, _domain));

    // Hit the NTLM login endpoint directly; any other URL returns a 302 (see notes below).
    HttpWebRequest request = WebRequest.Create(string.Format("{0}/_windows/default.aspx?ReturnUrl=%2f_layouts%2fAuthenticate.aspx%3fSource%3d%252F&Source=%2F", _portalBaseUrl)) as HttpWebRequest;
    request.Credentials = credentialCache;
    request.AllowAutoRedirect = false; // Must be false or the redirect chain never terminates.
    request.PreAuthenticate = true;

    // SharePoint doesn't like it if you don't include these (403 Forbidden)?
    request.UserAgent = "Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko";
    request.Accept = "text/html, application/xhtml+xml, */*";

    // The FedAuth token comes back as a Set-Cookie header on the response.
    using (HttpWebResponse response = request.GetResponse() as HttpWebResponse)
    {
        return response.Headers["Set-Cookie"];
    }
}

There are three keys here:

  1. The first is that AllowAutoRedirect must be false or you will get an error about too many redirects.  What seems to happen is that the cookies are not set correctly when using auto redirect, so the chain continues until an exception is thrown.  In Fiddler, you will see this as a long cycle of requests and redirects.
  2. The second is that the URL must be the NTLM authentication endpoint (/_windows/...), as any other URL will return a 302, and following that redirect would require AllowAutoRedirect to be true.
  3. The third is that SharePoint really doesn't like it when the user agent and accept headers are not included.  I initially could not get it to work without them (403 Forbidden), though a later attempt without them seemed to work.

Once you have the FedAuth token, you are able to basically impersonate the user.  To do so, you will need to include a cookie header in your HTTP request:

// Get the FedAuth cookie by logging in through the NTLM endpoint.
var authToken = GetAuthToken();

// Create the connection artifacts.
EndpointAddress endpointAddress = new EndpointAddress(endpointUrl);
BasicHttpBinding binding = new BasicHttpBinding();

ChannelFactory<ISomeService> channelFactory =
    new ChannelFactory<ISomeService>(binding, endpointAddress);

// Initiate the client proxy using the connection and binding information.
ISomeService client = channelFactory.CreateChannel();

using (new OperationContextScope((IContextChannel) client))
{
    // Set the authentication cookie on the outgoing WCF request.
    WebOperationContext.Current.OutgoingRequest.Headers.Add("Cookie", authToken);

    // YOUR API CALLS HERE
}

// Close the channel and factory when done.
((IClientChannel) client).Close();
channelFactory.Close();

The key is to add the header on the outgoing request before making your service API calls.

With this, you should see that you are able to invoke SharePoint hosted custom WCF service calls in claims-based web applications with NTLM authentication.

Filed under: .Net, SharePoint, WCF
29 Jul 2014

Defining a “Release Lead”

Posted by Charles Chen

Came across a great definition while reading up on release 4.0 of WordPress:

A release lead, if anyone is curious, determines all important parameters for a release, like schedule, deadlines, which feature plugins are merged, and more generally, scope and goals. They take point when it comes to meetings, shepherding contributions, announcement posts, and updates. A release lead is a connector and facilitator, identifying bottlenecks and friction wherever they may be and at the service of the developers and plugin teams that are aiming to have something in a given release, and be in frequent communication with them.

The release lead should follow what’s being committed, and set the tone for prioritizing and gardening the milestone on Trac. Given the constraint of time in hitting a date, help with prioritization and ensuring good communication lines are two of the most valuable things a lead can contribute.

The last five release leads were lead developers, but that’s not a requirement, nor is being a committer. I always thought of my “code reviewer” and “committer” hats as being separate, additional responsibilities. (Helen, of course, also wears these same hats.) Regardless: the release lead has the final call on all important decisions related to the release.

I particularly like the term "gardening the milestone", as it's a very suitable analogy for the responsibilities of a release lead; it's very much an art, like pruning a rose bush or a hedge and keeping a garden looking sharp.

Filed under: Dev
21 Jul 2014

Adventures in Poorly Written APIs

Posted by Charles Chen

I've been fighting with a library for the better part of three days now, trying to get it to work.

The previous version of this library was actually very well written from an API perspective and well documented.  The new version?  Not so much.

But from this, I take away two lessons in API design.

Lesson 1: Don't Use Dictionaries in Public APIs -- ESPECIALLY for Required Data

The first mistake the API designers made in this case was exposing a "dictionary" in the public API.

This itself may not be all bad if the API is highly flexible and can accept many forms of requests, but if a public dictionary interface is required, then at least have the good judgement to provide a wrapper to populate those dictionaries.

But even the lack of usability is not as bad as having required data inputs passed to your service as dictionary entries.  In other words, the service call won't succeed without some sort of magical recipe of just the right metadata keys and values.  If it's a required piece of metadata, why not surface it as an actual parameter to the function?

Exposing dictionaries in public APIs is a terrible design idea, as bad as APIs that require you to pass in XML strings (understandable if it's a must, but have the courtesy to provide downstream callers an API to serialize objects into that XML, and don't put required data into open-ended inputs).  Dictionaries should be used infrequently and preferably wrapped by an abstraction layer.
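
To make this concrete, here is a hypothetical sketch contrasting the two approaches (all of the names are illustrative; none of them come from the actual library in question):

using System;
using System.Collections.Generic;

// The anti-pattern: required inputs buried in an open-ended dictionary.
// Calls fail at runtime unless the caller knows the magic keys and values.
public interface IDocumentServiceBad
{
    void Publish(IDictionary<string, object> metadata);
}

// The alternative: required data surfaced through a typed request object so
// that the compiler enforces it; the dictionary is reserved for truly
// optional extras.
public class PublishRequest
{
    private readonly string _documentId;
    private readonly string _title;
    private readonly IDictionary<string, object> _extendedProperties =
        new Dictionary<string, object>();

    public PublishRequest(string documentId, string title)
    {
        if (documentId == null) throw new ArgumentNullException("documentId");
        if (title == null) throw new ArgumentNullException("title");

        _documentId = documentId;
        _title = title;
    }

    public string DocumentId { get { return _documentId; } }
    public string Title { get { return _title; } }

    // Optional, open-ended metadata can still live in a dictionary.
    public IDictionary<string, object> ExtendedProperties
    {
        get { return _extendedProperties; }
    }
}

public interface IDocumentServiceGood
{
    void Publish(PublishRequest request);
}

The typed request object makes the required inputs discoverable and enforced at compile time while still leaving room for genuinely optional, open-ended metadata.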

Lesson 2: Don't Require Your Caller to Keep Passing You Data You Handed Back

In this particular API, there is an initialization handshake with the server whereby large bits of system data are passed back to the caller.

For whatever reason, instead of the caller simply handing a token back to the server, all (or part?? -- again, it's a magical incantation of sorts) of this metadata must be exchanged with the server to complete a request!

Why not provide a session key?  A caller token?  Something that the caller can use to represent this collection of system data instead of the full set of system data?
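
Here's a hypothetical sketch of what that alternative could look like (again, the names are illustrative, not from the library in question): the server caches the bulky system data once during the handshake and hands back an opaque token, and the caller passes only that token on subsequent requests.

using System;
using System.Collections.Concurrent;

// Placeholder for whatever large block of system metadata the handshake returns.
public class SystemData
{
    // ...
}

// Server side: cache the system data once during initialization and hand the
// caller an opaque token; subsequent requests carry only the token.
public class SessionBroker
{
    private readonly ConcurrentDictionary<Guid, SystemData> _sessions =
        new ConcurrentDictionary<Guid, SystemData>();

    // Initialization handshake: store the system data, return a token.
    public Guid Initialize(SystemData systemData)
    {
        Guid token = Guid.NewGuid();

        _sessions[token] = systemData;

        return token;
    }

    // Resolve the token back to the cached system data for a request.
    public SystemData Resolve(Guid token)
    {
        SystemData data;

        if (!_sessions.TryGetValue(token, out data))
        {
            throw new InvalidOperationException("Unknown or expired session token.");
        }

        return data;
    }
}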

More to come, I'm sure of it, but this has been an interesting adventure in poorly written APIs.

Filed under: Dev, Rants
18 May 2014

Mind Your Matches

Posted by Charles Chen

I recently had to track down a performance issue with one of our Cypher queries that was taking an obscenely long time to run (15 seconds!) given the simplistic nature of the query.

Can you spot the problem?

MATCH (dist:Distribution), (user:User)
MATCH (docType:DocumentType)
WHERE dist.Uid = '...'
   AND user.Uid = '...'
   AND docType.Uid = '...'

It was not easy to track down as this is an entirely valid query that gives the exact result desired.

However, the WHERE clause only binds to the second MATCH, so that first MATCH will load every (!) Distribution and User in the graph into memory as a cartesian product before any filtering is applied.

MATCH (dist:Distribution), (user:User), (docType:DocumentType)
WHERE dist.Uid = '...'
   AND user.Uid = '...'
   AND docType.Uid = '...'

It was an easy fix, as you can see, but also an easy mistake to make!

Filed under: Neo4j
4 Apr 2014

FluentNHibernate and SQL Date Generation

Posted by Charles Chen

So you'd like your SQL entry to have a system-generated date/time, eh?

Here is a sample table:

CREATE TABLE [dbo].[AuditLog] (
	Id int IDENTITY(1,1) PRIMARY KEY,
	EventUtc datetime2(7) DEFAULT(SYSUTCDATETIME()) NOT NULL,
	EventOffsetUtc datetimeoffset(7) DEFAULT(SYSDATETIMEOFFSET()) NOT NULL,
	EntityContextUid uniqueidentifier,
	EntityContextName nvarchar(256),
	EntityContextType varchar(128),
	UserLogin nvarchar(128),
	EventName varchar(128),
	AppContext varchar(64),
	EventData nvarchar(max)
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]

To spare you hours dealing with this error:

System.Data.SqlTypes.SqlTypeException:
   SqlDateTime overflow. Must be between
   1/1/1753 12:00:00 AM and 12/31/9999
   11:59:59 PM.

What you need to do is to use the following mapping for your date/time columns.  The key is Generated.Insert(), which tells NHibernate that the database generates these values on insert (and to read them back) instead of sending a value itself -- otherwise NHibernate sends the uninitialized .NET default of DateTime.MinValue, which falls outside the supported SQL datetime range:

Map(a => a.EventUtc).Column("EventUtc")
	.CustomSqlType("datetime2(7)")
	.Not.Nullable()
	.Default("SYSUTCDATETIME()")
	.Generated.Insert();
Map(a => a.EventOffsetUtc).Column("EventOffsetUtc")
	.CustomSqlType("datetimeoffset(7)")
	.Not.Nullable()
	.Default("SYSDATETIMEOFFSET()")
	.Generated.Insert();
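
For context, here's a minimal sketch of the entity and the full mapping class these snippets would live in (the class shape is assumed from the table and mapping above, not pulled from a real project):

using System;
using FluentNHibernate.Mapping;

public class AuditLog
{
    public virtual int Id { get; set; }
    public virtual DateTime EventUtc { get; set; }
    public virtual DateTimeOffset EventOffsetUtc { get; set; }
    public virtual Guid? EntityContextUid { get; set; }
    public virtual string EntityContextName { get; set; }
    public virtual string EntityContextType { get; set; }
    public virtual string UserLogin { get; set; }
    public virtual string EventName { get; set; }
    public virtual string AppContext { get; set; }
    public virtual string EventData { get; set; }
}

public class AuditLogMap : ClassMap<AuditLog>
{
    public AuditLogMap()
    {
        Table("AuditLog");

        Id(a => a.Id).GeneratedBy.Identity();

        // Database-generated timestamps: NHibernate reads these back after
        // the INSERT instead of sending a value itself.
        Map(a => a.EventUtc).Column("EventUtc")
            .CustomSqlType("datetime2(7)")
            .Not.Nullable()
            .Default("SYSUTCDATETIME()")
            .Generated.Insert();
        Map(a => a.EventOffsetUtc).Column("EventOffsetUtc")
            .CustomSqlType("datetimeoffset(7)")
            .Not.Nullable()
            .Default("SYSDATETIMEOFFSET()")
            .Generated.Insert();

        Map(a => a.EntityContextUid);
        Map(a => a.EntityContextName);
        Map(a => a.EntityContextType);
        Map(a => a.UserLogin);
        Map(a => a.EventName);
        Map(a => a.AppContext);
        Map(a => a.EventData);
    }
}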

Special thanks to this Stack Overflow thread.

2 Apr 2014

What Alan Watts Can Teach Us About Leadership

Posted by Charles Chen

I was listening to a talk by Alan Watts and found one bit of advice that really connected to what I've learned about leading others.

The principle is that any time you -- as it were -- voluntarily let up control.

In other words, cease to cling to yourself; you have an excess of power because you are wasting energy all the time in self defense.

Trying to manage things, trying to force things to conform to your will.

The moment you stop doing that, that wasted energy is available.

Therefore you are in that sense -- having that energy available --  you are one with the divine principle; you have the energy.

When you are trying, however, to act as if you were god, that is to say you don't trust anybody and you are the dictator and you have to keep everybody in line you lose the divine energy because what you are simply doing is defending yourself.

One mistake that I've been guilty of is trying to force things to conform to my will on various projects (I still do it to varying degrees!).  It is usually with the best of intentions -- for a cleaner framework, a better product, a more efficient process -- but it is true that a lot of energy is wasted in doing so.

What is the alternative, then?

I think Watts is right that a level of trust has to exist that the team around you can help you achieve your project goals.  Instead of expending the energy in controlling the members of the team, spend the energy in building that trust through training, mentorship, guidance, and giving up not just control, but responsibility.

Sometimes that trust will be unwarranted, but sometimes, that trust will pay itself back many-fold.

2 Mar 2014

Programmatically Add SharePoint Lists with Schemas

Posted by Charles Chen

So you want to add a custom list with a schema, eh?

In SharePoint, this is "easily" (lol) done by adding a custom list with a Schema.xml file and a list template XML file.

But what if you don't want to add a custom list template and you want to do it programmatically?  You'd want to do this, of course, if you wanted to define custom views on the list.  I've seen this done programmatically (as in verbose, custom code to build lists, add content types, build views, etc.), but SharePoint already offers you a mechanism for defining custom views using the list schema XML file.  Why duplicate what SharePoint already gives you for free?

In looking through the API, it seems that there is an API call that would support it, but it's quite cryptic in how it's actually invoked.

After a bit of testing, I found that it's actually quite easy.

Here is the API definition from Microsoft:

public virtual Guid Add(
   string title,
   string description,
   string url,
   string featureId,
   int templateType,
   string docTemplateType,
   string customSchemaXml,
   SPFeatureDefinition listInstanceFeatureDefintion,
   SPListTemplate.QuickLaunchOptions quickLaunchOptions
)

Here is the invocation in PowerShell:

$listId = $web.Lists.Add("Test List", "Test", "TestList2", 
    "00BFEA71-DE22-43B2-A848-C05709900100", 100, "100", $xml, $feature, 0)

A little explanation is in order here.  The first three parameters are very straightforward.  The fourth one is where it starts to get "funny".  Here, you will want to search your 14\TEMPLATE\FEATURES\ directory for the feature that contains the template that you want to use.  In this case, I am creating a list based on the generic custom list type, so the feature is located in 14\TEMPLATE\FEATURES\CustomList.  You need the GUID of the feature in the Feature.xml file here, not your own custom feature GUID.

The fifth and sixth parameters are straightforward.

We'll skip the seventh parameter for now.

The eighth parameter here is the feature definition that contains the template your list will be based on.  Because we are using an out-of-the-box list template, we simply need to load the feature definition for the GUID in parameter 4:

$features = [Microsoft.SharePoint.Administration.SPFarm]::Local.FeatureDefinitions

$id = [Guid]("00BFEA71-DE22-43B2-A848-C05709900100") 

$feature = $features[$id]

Again, because we are using the out-of-the-box template, we need to use the out-of-the-box feature definition that contains the template.

The ninth parameter is, again, straightforward.

Now back to that seventh parameter.  This parameter is simply the XML that would be generated by adding a new list in Visual Studio.  I've added a simple example here:

<?xml version='1.0' encoding='utf-8'?>
<List xmlns:ows='Microsoft SharePoint' Title='List1' FolderCreation='FALSE' Direction='$Resources:Direction;' Url='Lists/List1' BaseType='0' EnableContentTypes='True' xmlns='http://schemas.microsoft.com/sharepoint/'>
    <MetaData>
        <ContentTypes>
            <ContentTypeRef ID='0x01'>
                <Folder TargetName='Item' />
            </ContentTypeRef>
            <ContentTypeRef ID='0x0120' />
            <ContentTypeRef ID='0x010800B66F73F12643464793530152868EEE87'/>
        </ContentTypes>
        <Fields>
            <Field ID='{fa564e0f-0c70-4ab9-b863-0177e6ddd247}' Type='Text' Name='Title' DisplayName='$Resources:core,Title;' Required='TRUE' SourceID='http://schemas.microsoft.com/sharepoint/v3' StaticName='Title' MaxLength='255' />
        </Fields>
        <Views>
            <View BaseViewID='0' Type='HTML' MobileView='TRUE' TabularView='FALSE'>
                <Toolbar Type='Standard' />
                <XslLink Default='TRUE'>main.xsl</XslLink>
                <RowLimit Paged='TRUE'>30</RowLimit>
                <ViewFields>
                    <FieldRef Name='LinkTitleNoMenu'></FieldRef>
                </ViewFields>
                <Query>
                    <OrderBy>
                        <FieldRef Name='Modified' Ascending='FALSE'></FieldRef>
                    </OrderBy>
                </Query>
                <ParameterBindings>
                    <ParameterBinding Name='AddNewAnnouncement' Location='Resource(wss,addnewitem)' />
                    <ParameterBinding Name='NoAnnouncements' Location='Resource(wss,noXinviewofY_LIST)' />
                    <ParameterBinding Name='NoAnnouncementsHowTo' Location='Resource(wss,noXinviewofY_ONET_HOME)' />
                </ParameterBindings>
            </View>
            <View BaseViewID='1' Type='HTML' WebPartZoneID='Main' DisplayName='Hello, World' DefaultView='TRUE' MobileView='TRUE' MobileDefaultView='TRUE' SetupPath='pages\viewpage.aspx' ImageUrl='/_layouts/images/generic.png' Url='AllItems.aspx'>
                <Toolbar Type='Standard' />
                <XslLink Default='TRUE'>main.xsl</XslLink>
                <RowLimit Paged='TRUE'>30</RowLimit>
                <ViewFields>
                    <FieldRef Name='Attachments'></FieldRef>
                    <FieldRef Name='LinkTitle'></FieldRef>
                    <FieldRef Name='IntegrationID'></FieldRef>
                </ViewFields>
                <Query>
                    <OrderBy>
                        <FieldRef Name='ID'></FieldRef>
                    </OrderBy>
                </Query>
                <ParameterBindings>
                    <ParameterBinding Name='NoAnnouncements' Location='Resource(wss,noXinviewofY_LIST)' />
                    <ParameterBinding Name='NoAnnouncementsHowTo' Location='Resource(wss,noXinviewofY_DEFAULT)' />
                </ParameterBindings>
            </View>
        </Views>
        <Forms>
            <Form Type='DisplayForm' Url='DispForm.aspx' SetupPath='pages\form.aspx' WebPartZoneID='Main' />
            <Form Type='EditForm' Url='EditForm.aspx' SetupPath='pages\form.aspx' WebPartZoneID='Main' />
            <Form Type='NewForm' Url='NewForm.aspx' SetupPath='pages\form.aspx' WebPartZoneID='Main' />
        </Forms>
    </MetaData>
</List>

It is easily customized with additional custom views, specification of the fields on those views, and even specification of content types to associate to the list!

So why would you want to do this?  If you want a custom list with content types and custom views and all of that jazz, you can get it without writing a lot of custom code to build lists and without the hassle of custom templates (a pain in the butt); you can just write the schema XML (or maybe better yet, configure and export the list) and let SharePoint do its magic!
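
Putting it all together, here's roughly what the same call looks like from server-side C# (a sketch; the titles and URL are placeholders, and it assumes you already have an SPWeb and the schema XML above in hand):

using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public static class ListProvisioner
{
    public static Guid AddListWithSchema(SPWeb web, string schemaXml)
    {
        // The out-of-the-box CustomList feature that contains the generic
        // list template (parameter 4 in the explanation above).
        Guid customListFeatureId = new Guid("00BFEA71-DE22-43B2-A848-C05709900100");

        // Load the out-of-the-box feature definition (parameter 8).
        SPFeatureDefinition featureDefinition =
            SPFarm.Local.FeatureDefinitions[customListFeatureId];

        return web.Lists.Add(
            "Test List",                              // title
            "Test",                                   // description
            "TestList2",                              // URL
            customListFeatureId.ToString(),           // feature ID containing the template
            100,                                      // template type (generic list)
            "100",                                    // document template type
            schemaXml,                                // the custom schema XML
            featureDefinition,                        // feature definition instance
            (SPListTemplate.QuickLaunchOptions) 0);   // quick launch behavior, as in the PowerShell above
    }
}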

Filed under: Dev, SharePoint
27 Feb 2014

DEATH TO INFOPATH!

Posted by Charles Chen

There are few technologies that I truly hate and InfoPath is right up there.  The problem isn't necessarily InfoPath itself (okay, yes, I do hate it and it sucks hard), but the misconceptions from many former and current customers and business users about the utility and suitability of InfoPath cause it to be deployed in a variety of situations where it has no business being in a web-based enterprise architecture.

It's a technology that has been oversold and over-promised but always under-delivers and "enterprise architects" love the stuffin's out of it for some reason without a real grasp on why it's such a lame, terrible technology.

Now it seems that it's reached the end of the line:

In an effort to streamline our investments and deliver a more integrated Office forms user experience, we’re retiring InfoPath and investing in new forms technology across SharePoint, Access, and Word. This means that InfoPath 2013 is the last release of the desktop client, and InfoPath Forms Services in SharePoint Server 2013 is the last release of InfoPath Forms Services.

Microsoft makes many great things like .NET and Visual Studio and some total duds like InfoPath (a solution looking for a problem).

This is my favorite part of the blog post:

Industry trends and feedback from our customers and partners make it clear that today’s businesses demand an intelligent, integrated forms experience that spans devices. We are looking to make investments that allow you to easily design, deploy, and use intelligent, integrated forms across Office clients, servers, and services—forms that everyone can use on their PC, tablet, or phone.

Hey Microsoft, here's a tip for you: HTML!  What took you so long?  I mean, holy smokes, why did you even waste the money to conceive of InfoPath Forms Services to convert InfoPath forms into HTML in the first place?  Why did you even bother forcing developers to build forms in some terrible designer with a terrible programming experience only to convert those forms right back into HTML so that people could fill them out?

Any idiot could have seen the utter uselessness and unceremonious end of InfoPath years ago.

Filed under: Awesome, Office
26 Feb 2014

Office Lighting and Decision Making

Posted by Charles Chen

The results of an interesting study flashed across my news feed yesterday:

Another finding was that the perception of heat could affect the participants’ emotions. The research team said that this is because emotions are more intense under bright light; thus, leading to the perception that light is heat, which can trigger more intense emotions.

The team also found that bright light affects the kinds of decisions people make. Since the majority of people work during the day under bright lighting conditions, the researchers noted that most daily decisions are made under bright light, which intensifies emotions.

Accordingly, they suggest that turning the light lower may help people make more rational decisions, not to mention negotiate better settlements in a calmer manner.

Taking emotion out of decision making is one of the most important skills one can develop, and maybe it can be as simple as installing more ambient lighting and turning off those (ugly) fluorescent lamps overhead!