<CharlieDigital/> Programming, Politics, and uhh…pineapples

15Apr/10

Why SPMetal Falls Short

Posted by Charles Chen

First, SPMetal is good.  It's very good.  Much better than life without it.  It encourages more object-oriented programming (instead of XML-string-oriented programming - blech!).

That said, SPMetal falls short of awesome by just a hair.

Ideally, one would be able to generate models from local CAML files instead of having to deploy the content types first, since I assume it's a best practice for organizations to deploy content types from CAML files for consistency.  It seems backwards, from a development perspective, to have to deploy your content types before you can generate models and write code.

Filed under: Rants, SharePoint
10Apr/10

Philly .NET Code Camp and Windows Azure

Posted by Charles Chen

I spent half a day at the Philly .NET Code Camp and ended up attending only two sessions (weather was too nice outside to be sitting inside on a Saturday :-D).  By chance, I saw Alvin Ashcraft's name on the list of presenters when I showed up, so I was hoping I'd get to meet him in person.  But he was seemingly absent from his early morning session.

One of the two sessions I attended was on Windows Azure; it was an excellent presentation given by Dave Isbitski.  I dabbled with it a bit early on in the CTP and was not particularly impressed.  Since then, I've continued to read up on it on and off.  The one thing I took away from today's session was that Azure is not enterprise ready (yet) and perhaps isn't meant to be?

To understand why, consider your account management and login experience: it's all tied to Windows Live IDs.  Yes, that's right.  Windows Live IDs.  This means that your enterprise account naming policies and your password complexity and history rules can't be enforced.  Furthermore, what happens if the person who owns or creates a Live ID leaves the company?  Perhaps she'd be nice enough to hand over the password info, but what if she were hit by a bus?  What if she has a grudge?  And just how secure are Windows Live IDs in the first place?  I think this is a big problem.

Account management is another issue.  As it is, it requires entering credit card information.  This doesn't scream "enterprise" to me.  You'd think there would be a way to link accounts to company-level billing accounts (I dunno, maybe via a company's MSDN license?).  There's also no concept of hierarchical account linking and instance management.  This means that I can't even associate multiple Live IDs with one account and set granular permissions on the instances that each account can control (for example, Steve's account can manage these two worker roles while Joe's account manages this web role).  What it boils down to is the wild, wild west of account management; there's no global view for a company to monitor usage across multiple accounts.

While there are a host of other issues that affect enterprise adoption (the inability to create data and image backups, for example; SQL log exports and external replication are not supported), perhaps the biggest one, in my opinion, is the big question mark over how these systems can be validated.  Whether you're working with clients in the financial industry or in insurance or life sciences (like me), enterprise systems will need to be validated and certified.  I see this as a big challenge for adoption in life sciences due to the strict validation requirements for software systems.

At the end of the day, I can kind of see where Microsoft is going with this if you compare it to Google Apps, for example.  But the key differentiator to me has always been that Microsoft represents the enterprise while Google perhaps better represents the entrepreneur and the tinkerer.  While both approaches are needed, it does add some difficulty in evaluating Azure for enterprise usage given that the current implementations of some of the core features are not very enterprise friendly.

That said, it's still an exciting platform.  I've got a few things brewing and I'll be keeping the blog updated as I complete my experiments.

Filed under: .Net, DevLife
9Apr/10

Obama Increases Funding for Prompt Global Strike

Posted by Charles Chen

Bear with me here for some politics ;)

Caught an interesting article last night regarding increased investment in a new weapons program to complement the decrease in the nuclear arsenal.

Prompt Global Strike (wiki)

http://www.msnbc.msn.com/id/36253190/ns/us_news-washington_post/

The administration has asked Congress for $240 million for next year's Prompt Global Strike development programs, a 45 percent increase from the current budget. The military forecasts a total of $2 billion in development costs through 2015 -- a relative bargain by Pentagon standards.

Nuclear arms have formed the backbone of U.S. deterrence strategy for six decades. Although the strategy worked during the Cold War, military leaders say they need other powerful weapons in their arsenal to deter adversaries who assume that the United States would refrain from taking the extreme step of ordering a nuclear strike.
"Deterrence can no longer just be nuclear weapons. It has to be broader," Marine Gen. James E. Cartwright, vice chairman of the Joint Chiefs of Staff and a leading proponent of Prompt Global Strike, told a conference last month.

Some U.S. military officials say their current non-nuclear options are too limited or too slow. Unlike intercontinental ballistic missiles, which travel at several times the speed of sound, it can take up to 12 hours for cruise missiles to hit faraway targets. Long-range bombers likewise can take many hours to fly into position for a strike.
"Today, unless you want to go nuclear, it's measured in days, maybe weeks" until the military can launch an attack with regular forces, Cartwright said. "That's just too long in the world that we live in."

This is encouraging and I like it. After reading some of the comments on CNN.com on the new START treaty, I started to wonder if some of those posters lived in the same reality. A lot of folks seem to be stuck in the Cold War mentality.  Today, our economy is deeply intertwined with that of China and our European allies depend heavily on Russia for oil, natural gas, and Russia's vast diamond supply (well, and hot women, too).

The problem with a nuclear arsenal as a deterrent is that it has no effect in an asymmetrical war against a stateless, nationless enemy; the enemy already knows that we can't and won't use them due to the collateral damage to civilian populations and the nuclear fallout that would result.  Prompt Global Strike gives us the ability to deliver munitions with the same expediency as ICBMs without all of the nasty side effects. Effectively, it becomes a weapon that we can actually deploy and use rather than a stockpile of nuclear arms that I simply cannot foresee us ever using.

However, it's not without its own dangers (at least until a proper protocol is designed):

Although it is technically simple to replace nuclear warheads on a missile with conventional ones, Prompt Global Strike has been dogged by a significant problem: how to ensure that Russia could tell the difference if a launch occurred.

Because it's basically a modified ICBM with a non-nuclear warhead, new protocols are needed to ensure that launching one of these isn't going to trigger an accidental nuclear response. The article mentions some options on the table including lower trajectories, higher trajectories, pre-launch communication with nuclear powers, etc.

Many have decried the START treaty as "weakening" the US.  However, this fails to consider that the US and Russia today hold over 90% of the world's nuclear armaments, and the new START treaty reduces the count from 2,200 to 1,550 - still plenty to flatten most of the world's major cities.  Ultimately, as Air Force General Kevin Chilton says, the Prompt Global Strike missile system gives "an additional weapon in the quiver of the president to give him options in time of crisis today, in which he maybe only has a nuclear option for a timely response."

The wiki article has some pretty gnarly details on how it can be deployed:

  • Ballistic missiles, based on either the ICBM or SLBM
  • Hypersonic cruise missiles, such as the Boeing X-51
  • Air launched missiles
  • Space based launch platforms

Very interesting indeed. As Commander in Chief, working in concert with Gates to realign our military spending and investments, in addition to changing our international persona (especially in predominantly Muslim nations and with our European allies), Obama has been miles ahead of Bush in my book.

5Apr/10

jsTree and Nested XML Data Stores

Posted by Charles Chen

I happened upon jsTree a few months back while searching for a solid jQuery based tree.

Without a doubt, it is one of the most well implemented and functional Javascript trees I've used with perhaps the most powerful feature being the built-in support for client-side XML representations of the tree and the ability to add arbitrary metadata to the tree using the Sarissa library.

While the tree itself is extremely powerful, some of the documentation is actually out of date, which made my implementation of the tree a bit more taxing than it should have been.

For example, the option for initializing the tree with a static XML string (as demonstrated here) actually requires using staticData instead of static in the latest version.  Adding metadata to the tree is also made far more complicated by the documentation and examples I found online.
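
To illustrate the renamed option, here's a minimal initialization sketch; the selector, XML string, and surrounding options are illustrative assumptions rather than code from the sample project, and the exact shape may differ slightly by jsTree version:

```javascript
// Hypothetical jsTree setup loading the tree from a static XML string.
// Older docs and examples show "static" here; the latest version expects
// "staticData" instead.
$("#demo").tree({
    data: {
        type: "xml_nested",
        opts: {
            staticData: "<root>" +
                        "  <item id='node-1'>" +
                        "    <content><name>Root node</name></content>" +
                        "  </item>" +
                        "</root>"
        }
    }
});
```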

In reality, the datastore implementation for the nested XML support is powerful enough to extract arbitrary DOM attributes on its own (online posts seem to indicate that you need the custom metadata plugin and/or the jQuery metadata plugin).  You can see in the sample below that I set the attribute "md" to an encoded JSON string (you'll want to do this for when you reload the tree from a Javascript string) and then retrieve the value as part of the XML by specifying the attributes to collect:

/*--- test adding data ---*/
$("#test").click(function() {
    var t = $.tree.focused();

    if (!t.selected) {
        return;
    }

    /*--- sets a person on the node ---*/
    var person = {
        "FirstName": $("#firstName").val(),
        "LastName": $("#lastName").val(),
        "Age": $("#age").val() 
    };

    var serialized = JSON.stringify(person);

    /*--- sample of adding an arbitrary attribute at the DOM level ---*/
    t.selected.attr("md", encodeURI(serialized));

    /*--- ...and how to retrieve it in XML ---*/
    var opts = {};
    opts.outer_attrib = ["id", "rel", "class", "md"];

    var xml = t.get(null, "xml_nested", opts);

    $("#xml-d").text(xml);
    $("#treeXml").val(encodeURI(xml));              
});

In real usage, you'd assign some more meaningful values to the metadata and save it.  I find that with a library like this, it's probably easier to just save the whole XML string and that's simple enough by just pushing the XML to a hidden input before submitting the form (as I've done in the last line).
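
The encode/decode round trip itself is plain JavaScript and worth seeing in isolation.  This sketch (the object shape mirrors the sample above; the values are made up) shows that what goes into the "md" attribute via encodeURI comes back out intact, which is what the server-side Uri.UnescapeDataString call relies on:

```javascript
// The same object shape used in the tree sample above (form field
// values come back as strings, so Age is a string here).
var person = { "FirstName": "Ada", "LastName": "Lovelace", "Age": "36" };

// Client side: serialize and encode so the JSON can live safely inside
// an XML/DOM attribute (quotes and braces become %xx escapes).
var encoded = encodeURI(JSON.stringify(person));

// Consumer side: decode and parse; the C# equivalent is
// Uri.UnescapeDataString followed by JavaScriptSerializer.Deserialize.
var decoded = JSON.parse(decodeURI(encoded));

console.log(decoded.FirstName); // "Ada"
```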

Mahr Mohyuddin has a much more complex and more generic implementation of ASP.NET integration here, but I think that might be more complexity than is needed.  In practice, it makes more sense to use full JSON objects on the client side (as I've used above) and embed them into the attribute and then, using the JavaScriptSerializer class, extract the objects into domain objects on the server side.  Here's an example:

string xmlString = Uri.UnescapeDataString(treeXml.Value);

XDocument xml = XDocument.Parse(xmlString);

// Get all the <items/>.
var items = from x in xml.Descendants()
            where x.Name == "item"
            select x;

List<Person> people = new List<Person>();

JavaScriptSerializer serializer = new JavaScriptSerializer();

// Resolve the paths and Person instances.
foreach(var item in items) {
    string[] parts = item.AncestorsAndSelf()
        .Select(a => a.Descendants("name").First().Value)
        .Reverse().Skip(1).ToArray();

    string path = string.Join("/", parts);

    if(item.Attribute("md") == null) {
        continue; // Next iteration.
    }

    string serializedPerson = 
        Uri.UnescapeDataString(item.Attribute("md").Value);

    Person p = serializer.Deserialize<Person>(serializedPerson);
    p.OrgPath = path;

    people.Add(p);
}

_people.DataSource = people;
_people.DataBind();

Nothing fancy here; the only thing of note is the little LINQ query to resolve the node path (may or may not be useful).

The Person class is also very simple and barebones:

using System;

namespace JsTreeSample {
    /// <summary>
    /// Models a person.
    /// </summary>
    /// <remarks>
    /// Serializable to support deserialization from JSON.
    /// </remarks>
    [Serializable]
    public class Person {
        private string _orgPath;
        private int _age;
        private string _firstName;
        private string _lastName;

        public string FirstName {
            get { return _firstName; }
            set { _firstName = value; }
        }

        public string LastName {
            get { return _lastName; }
            set { _lastName = value; }
        }

        public int Age {
            get { return _age; }
            set { _age = value; }
        }

        public string OrgPath {
            get { return _orgPath; }
            set { _orgPath = value; }
        }
    }
}

The full project is included.  Some usage notes: select a node first and then enter a first name, last name, and age.  Then click "Set Data" to create a Person object at the node (this will also show the full XML of the tree).  Then click Submit to send the data (displays in a repeater).

JsTreeSample.7z (144.07 KB)

Definitely check out jsTree for your next project; it's amazingly versatile and rich in functionality.

Filed under: Dev, DevTools
2Apr/10

SharpZipLib and ASP.NET

Posted by Charles Chen

I recently had to write a search-driven component to extract and export documents from a SharePoint repository. It presented a challenge since many examples on the web start from the premise of a file system and not binary streams.

I settled upon SharpZipLib, an excellent and fairly easy to use library, but found the documentation quite lacking, particularly around creating a zip file to a stream (like an ASP.NET output response stream).

I put together a little sample just to test it out and finally got it working after struggling with it for a good 30 minutes. This sample creates a zip package using an embedded resource (text file) for simplicity. In practice, you can just modify the implementation of the GetBinaryContent helper method. Note that when adding multiple files, you do a "put" first and then a "write".

using System;
using System.IO;
using System.Reflection;
using System.Web;
using System.Web.UI;
using ICSharpCode.SharpZipLib.Zip;
 

namespace SharpZipLibTest {
    public partial class _Default : Page {
        protected void Page_Load(object sender, EventArgs e) {
            string fileName = "package.zip";

            // Clear the response.
            Response.Clear();
            Response.Buffer = true;
            Response.ContentType = "application/octet-stream";
            Response.Charset = "";
            Response.AddHeader("Content-Disposition",
                string.Format("attachment; filename={0}", fileName));

            using (var ms = new MemoryStream()) 
            using (var zip = new ZipOutputStream(ms)) {
                byte[] fileBuffer = GetBinaryContent();

                // Write to the zip package.
                var entry = new ZipEntry("helloworld.txt");
                entry.DateTime = DateTime.Now;
                entry.Size = fileBuffer.Length;
                zip.PutNextEntry(entry);
                zip.Write(fileBuffer, 0, fileBuffer.Length);

                // Repeat for each file
                /*
                for(...) {
                    var entry = new ZipEntry("...");

                    zip.PutNextEntry(entry);
                    zip.Write(...);
                }
                */

                zip.Flush();
                zip.Finish();

                byte[] output = ms.ToArray();

                // Write the final output to the response stream.
                Response.OutputStream.Write(output, 0, output.Length);                
            }

            // End the response.
            Response.Flush();
            Response.End();

            HttpContext.Current.ApplicationInstance.CompleteRequest();
        }

        /// <summary>
        /// Gets the binary content to send.
        /// </summary>
        /// <returns>The byte array containing the binary content.</returns>
        private byte[] GetBinaryContent() {
            string resourceName = "SharpZipLibTest.helloworld.txt";

            byte[] fileBuffer;

            using (Stream s = Assembly.GetExecutingAssembly()
                .GetManifestResourceStream(resourceName)) {
                // Read the file stream into a buffer.
                fileBuffer = new byte[s.Length];

                s.Read(fileBuffer, 0, fileBuffer.Length);
            }

            return fileBuffer;
        }
    }
}

The full project file is attached and runnable.

SharpZipLibTest.7z (104.3 KB)

Filed under: .Net, DevTools