Sunday, September 06, 2009

Introducing GridRoom

In several previous posts I've alluded to working on a product that uses DirectShow.NET and various Amazon Web services. I've tinkered with bits and pieces of this product for quite a while in my spare time. Well, several months ago I decided to finally take the plunge and turn it into a business. Since then I've worked on it nearly full-time (still doing a bit of consulting for my old employer), and I'm proud to announce that the beta was publicly released last week.

The product is called GridRoom, as you might have guessed, and it makes sharing high-quality video with small to medium-sized groups of people really, really easy. It's designed to solve a problem I ran into personally as a youth football coach and cameraman, and that's the market I've targeted initially.

Football is an unusual sport in that game film review is essential for effective coaching at all levels of play--even youth football. There are so many players crammed into a small space, so much chaos, and so many precise things that must happen for a play to succeed that coaches on the sideline often have no idea what really happened until they review the film.

That's why teams at the highest levels put at least one coach in the press box during games and also take overhead still shots from the top of the stadium. New York Giants owner Wellington Mara apparently originated the practice of taking overhead Polaroids as early as the 1940s.

When you coach football and watch film you really begin to understand the absurdity of what happens on the football field--and how a few happenstance plays breaking one way can turn a close match into a rout. I've seen 90-yard runs where several offensive players missed their blocks--but the unblocked defenders ran right past the ballcarrier. I've seen runs for losses where blockers cleared a huge hole but the runner slipped down in the backfield.

As a former high-school and small college player I was familiar with the pre-digital film-review process at those levels--but I hadn't thought about it much since my playing days. In the summer of 2004 I completed a small consulting project for the Atlanta Falcons and marveled at their impressive system for archiving and retrieving not just game footage from multiple camera angles, but also just about everything that happened on the practice field--even individual player drills.

That fall our oldest son played youth football for the first time. I was an assistant coach, and I convinced my wife to film the games with our new digital camera--not an easy task for her with several younger siblings in tow. I didn't expect to find much sophistication in the youth football film review process, but I also didn't expect it to be as painful as it was.

Most teams struggle to find someone competent to film their games. If they're lucky the cameraman will burn a DVD and give it to the head coach a couple of days after the game. Assistant coaches (and there are usually 3 or 4 even for youth teams) often don't see the game film at all. Sometimes the head coach will give them a copy before the next game. Players almost never see the complete game; most see only highlights at the end of the season.

The low-end products for football film review back then were targeted at large high schools through small colleges. They were designed for complete film breakdown--where every play is cataloged by game phase, down and distance, offensive and defensive formations, and key players. And even these low-end products were turnkey systems costing several thousand dollars each. I didn't consider it realistic that part-time youth football coaches would ever put that much time into detailed film breakdown, so after several years of coaching and filming I decided to work on the more basic problem of distribution and collaboration instead.

GridRoom is the result of that work so far. It combines capturing or importing a video with sharing it, so you can literally plug your digital video camera into the computer, enter a couple of pieces of information (including the email addresses of folks you want to share with), kick off the capture process, and walk away, go to bed, or whatever. A DVD-quality copy of the video is captured, compressed, uploaded, and made available to everyone with no further intervention. You can usually get the complete game film to every player and coach in less than 6 hours with less effort than it takes to create a single DVD.

The client software uses a very robust upload and download process that loses very little progress if you restart your PC or lose your network connection. (In an almost comically perfect test of this feature my Dad cut the power to my parents' computer to repair a light fixture while my Mom was downloading a video to try out GridRoom for the first time.) The client also includes a simple video player designed with football film review in mind (plays are bookmarked, you can easily skip forward and backward, and you can toggle between slow motion and regular speed).

Youth through high school football teams are my primary market right now, but GridRoom could be useful in any scenario where people are already distributing DVDs to small groups of people. I think the key factors are a need for 1) high-quality content, 2) rapid distribution, 3) reduction of DVD duplication work, and 4) content that has high personal or replay value to the end consumer--i.e., video they want to keep, not something they want to stream at low quality and never see again.

I'll continue to post here about more detailed technology and business problems I run into with this venture, but I've also created a separate GridRoom blog for non-technical tips, how-to articles, and upcoming features.

Wednesday, July 29, 2009

3 Things I Hate About My Clear WiMAX Gateway

I recently signed up for Clearwire's WiMAX service. So far the service itself has been great. Performance is good at all times, and we haven't lost our connection once in the past two weeks. This is pretty important since I work from home and also run a couple of low-profile servers over this connection.

Most of my problems with Clear stemmed from the fact that they advertised their Motorola gateway as a modem when it's actually a router. Unfortunately the gateway has continued to cause problems--so many that I would not recommend Clear's service to anyone with a home network (or at least anyone with more than the plainest vanilla configuration). Wait until you can purchase a better WiMAX modem or gateway than what's provided by Clear.

First, the gateway refuses to properly forward requests originating on the local network to my public servers. Every Linksys home router I've owned for the past 8 years has handled this correctly. But the Clear Motorola gateway returns the router's administration console for all local network requests to the gateway's private AND public IP addresses. The router includes an option to enable/disable administration from the Internet, but that makes no difference. In practical terms, this means I can't see what my home servers look like to the outside world without either A) accessing them through a different ISP or B) routing my HTTP requests through a slow and invasive proxy server such as this one.

Second, the gateway apparently eats traceroute packets. The screenshot below shows a traceroute to www.google.com with a timeout of 10,000 milliseconds:

[Screenshot: tracert to www.google.com timing out with a 10,000 millisecond timeout]

Now I can't prove that the gateway is actually eating these packets because I don't have another device on Clear's network to compare. It could be that the packets are eaten by an upstream router on Clear's network. But I doubt that's the case. (As an aside, I see the same behavior when I run a traceroute to other hosts on the Clear network, such as their DNS servers and time server.)

A big reason I'm inclined to blame the gateway for eating traceroute packets is also the third thing I hate about it: It doesn't fully implement the DNS protocol. I discovered this when my mail relay stopped working after switching to Clear. After several wasted hours monkeying around with my SMTP server and Wireshark I found that the gateway was eating DNS lookups for mail exchanger records. This was especially difficult to diagnose because the A record lookups were working just fine.

The screenshot below shows an nslookup of the MX record for gmail.com, which times out:

[Screenshot: nslookup of the MX record for gmail.com timing out]

This screenshot shows an nslookup of the A record for gmail.com, which succeeds:

[Screenshot: nslookup of the A record for gmail.com succeeding]

Once I identified the problem it took just a few minutes to fix by bypassing the Clear gateway for DNS lookups on the mail server. As you can see below, the same MX lookup works fine when routed specifically to Clear's primary DNS server (75.94.255.12):

[Screenshot: the same MX lookup succeeding against Clear's DNS server at 75.94.255.12]

This DNS server is where the gateway passes DNS requests it can't resolve, which indicates to me that the gateway itself is responsible for mangling the DNS packets.
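
For anyone who wants to reproduce this comparison without nslookup, here's a rough C# sketch that builds a raw MX query by hand and sends it once to the gateway's default address (192.168.15.1) and once directly to Clear's DNS server (75.94.255.12), then reports whether each answers within five seconds. It's a diagnostic toy, not production DNS code.

    using System;
    using System.Collections.Generic;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    class MxProbe
    {
        static void Main()
        {
            bool viaGateway = QueryMx("gmail.com", "192.168.15.1"); // the Motorola gateway
            bool direct = QueryMx("gmail.com", "75.94.255.12");     // Clear's primary DNS server
            Console.WriteLine("MX via gateway: {0}, MX direct: {1}", viaGateway, direct);
        }

        // Sends a single MX query for the domain to the given DNS server and
        // returns true if a response with at least one answer record comes back.
        static bool QueryMx(string domain, string server)
        {
            var query = new List<byte>();
            // DNS header: ID 0x1234, recursion desired, one question
            query.AddRange(new byte[] { 0x12, 0x34, 0x01, 0x00, 0x00, 0x01, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 });
            // Question name: length-prefixed labels terminated by a zero byte
            foreach (string label in domain.Split('.'))
            {
                query.Add((byte)label.Length);
                query.AddRange(Encoding.ASCII.GetBytes(label));
            }
            query.Add(0);
            // QTYPE = 15 (MX), QCLASS = 1 (IN)
            query.AddRange(new byte[] { 0x00, 0x0F, 0x00, 0x01 });

            using (var udp = new UdpClient(server, 53))
            {
                udp.Client.ReceiveTimeout = 5000;
                udp.Send(query.ToArray(), query.Count);
                try
                {
                    var remote = new IPEndPoint(IPAddress.Any, 0);
                    byte[] response = udp.Receive(ref remote);
                    // ANCOUNT lives in bytes 6-7 of the DNS header
                    return ((response[6] << 8) | response[7]) > 0;
                }
                catch (SocketException)
                {
                    return false; // timed out: the query was presumably dropped
                }
            }
        }
    }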

Wednesday, July 15, 2009

Signing up for Clear WiMAX in Atlanta

During our 9 years in Atlanta we've tried both DSL and cable broadband. We started with AT&T Broadband cable, and that was pretty horrible. We then tried DirecTV and SpeedFactory DSL. These were great companies, but DirecTV stopped providing DSL service, and SpeedFactory was killed off by BellSouth. We returned to cable with Comcast, and they've been so-so: intermittent outages, poor customer support, and high prices--but usually available and fast.

Needless to say, I've been eager to find an alternative provider for quite some time. When I discovered several days ago that Clearwire had just rolled out WiMAX service in Atlanta--for very competitive prices--I was ready to try it out.

I tried to place an order that night but ran into some issues with Clearwire's Web site. I finally ordered the service yesterday and received my equipment today. It has definitely been a mixed experience so I thought I'd take a few minutes to highlight the good and bad here. Local blogger Bruce Bracey has catalogued his experiences in some detail here and here if you want another opinion.

The Clearwire web site is pretty, but has some serious usability issues. For one thing, the site really overuses AJAX. This is a problem during the signup process, but the best example I can show you is the FAQ page:

[Screenshot: the Clearwire FAQ displayed inside a small modal pop-up with a tiny scrolling area]

Notice the tiny little scrolling area in the modal pop-up window. Nice. I have a 20 inch monitor, but I'm forced to view their FAQ through a 200 pixel-high portlet--and scroll down 18 times to read the whole thing!

Other issues I ran into:

  • You can only lease a WiMAX modem if you sign a 2-year contract. Month-to-month plans require a $35 activation fee and a $79 modem purchase. I can understand the activation fee, but the modem purchase/lease options don't make sense, and it wasn't initially obvious why the modem lease option kept appearing and disappearing during the signup process as I switched between plans.
  • The 2-year contract requires your SSN and birthdate for a credit check plus your credit card number, expiration date, and security code. The site rejected my choice for a username at least 5 times (once for including an underscore character and multiple times because the usernames I selected were already in use). Each time my SSN, birthdate, and credit card information were discarded so I had to repopulate these 5 fields 5 times each. (Perhaps this was for security reasons, but the site also neglects to set autocomplete="off" on these form fields so they were conveniently remembered for me by Firefox--negating any security benefits.)

After all this I was rejected for not passing the credit check. Huh? I've always had excellent credit, so I was concerned enough to surf over to AnnualCreditReport.com and check my credit files for the first time this year. There were no problems with my credit history, so I have no explanation for the rejection. I returned to Clearwire's site and this time succeeded in signing us up under my wife's name--apparently she's more likely to pay due to the enormous salary she earns managing our household and homeschooling our kids!

Once I ordered service the Motorola CPEi 25150 modem arrived in just over 24 hours--a pleasant surprise. The packaging and instructions were very simple and indicated that I only needed to attach the modem to my computer via an Ethernet cable to get connected. Unfortunately, it wasn't that simple.

I tried swapping out my cable modem and replacing it with the WiMAX modem, but that didn't work. I next hooked the Clearwire modem directly to my computer and could then connect to the Internet. It took several more steps on the Clearwire site to finish configuring my account and the site timed out several times during this process.

Because the modem didn't work when I first attached it to my network I was suspicious it was more than just a modem. Sure enough, after a little investigation I discovered the "modem" was actually a full-fledged router with firewall, NAT, DHCP, etc. support and a Web administration interface (default IP is 192.168.15.1, default password is "motorola"). The router capabilities are not mentioned anywhere on the Clearwire site, packaging, or setup instructions. This was quite a nasty surprise since it took several more hours of wrestling with my network configuration to make the new modem co-exist with the (much nicer) Linksys router required for my Vonage phone service.

On the positive side, signal strength and network speed were very good. Our house is located near the top and on the east side of the ridge that carries Jones Bridge Road between Highway 120 and McGinnis Ferry Road in Johns Creek. When placed by the east wall on our first floor (not in front of a window), the modem shows a steady 5 signal bars (the strongest possible).

I selected the 786/384 kbps plan ($20 per month) to try out the service, but I achieved significantly higher speeds on my initial Speakeasy test at 4PM EST:

  • Atlanta - 5862/458 kbps
  • Seattle - 5581/423 kbps
  • LA - 4257/503 kbps

Later in the day Clearwire must have throttled me back to my contract speeds because the results were not quite so spectacular at 12 PM:

  • Atlanta - 1100/360 kbps
  • Seattle - 1095/336 kbps
  • LA - 1092/345 kbps

Atlanta is just Clearwire's third market nationwide so hopefully they'll improve the signup process quickly while keeping the network fast!

P.S. If you found any of the above information useful and decide to sign up for Clear WiMAX feel free to show your appreciation by entering my referral code: "b25rd7". Thanks!

UPDATE: Three things I hate about my Clear gateway.

Wednesday, July 01, 2009

Determining HTML Element Visibility with WatiN

I recently started using WatiN to write automated Web application tests for the first time. WatiN is a great and rapidly maturing tool with lots of developer support, and I highly recommend it based on my experiences so far. But after writing several tests of dynamic screen elements I realized the tests weren't verifying everything I thought they were verifying.

When you use WatiN's "Find" syntax to search for elements and screen text, there's no obvious distinction between hidden and visible elements. The screens I was testing use ASP.NET validation controls to perform client-side validation and include other dynamic elements such as the AJAX.NET ModalPopupExtender. I wanted to verify that these elements were visible and hidden at the appropriate times, and the simplest way of accomplishing this with WatiN was not immediately obvious.

After digging around in the API a bit I came up with a method that's simple and robust and seems to perform reasonably well. As demonstrated in the sample tests below, you can simply walk up the element hierarchy checking whether the element or any of its parents is styled with "display: none".

    using System.Text.RegularExpressions;
    using NUnit.Framework;
    using WatiN.Core;

    [TestFixture]
    public class WatinTest
    {
        [Test]
        public void PopupDisplayed()
        {
            var ie = new IE();
            ie.GoTo("http://www.example.com");

            Div popup = ie.Div(Find.ById(new Regex("pnlPopup$")));
            Assert.IsTrue(IsDisplayed(popup));
        }

        [Test]
        public void ValidatorDisplayed()
        {
            var ie = new IE();
            ie.GoTo("http://www.example.com");

            Span validator = ie.Span(Find.ByText("Required field"));
            Assert.IsTrue(IsDisplayed(validator));
        }

        public bool IsDisplayed(Element element)
        {
            if (string.Equals(element.Style.Display, "none"))
            {
                return false;
            }
            if (element.Parent != null)
            {
                return IsDisplayed(element.Parent);
            }
            return true;
        }
    }

The first test case (PopupDisplayed) searches for the div element representing a modal popup panel by the element id. It uses a regular expression since the ASP.NET control ids for nested controls are unwieldy and variable. In this test we are checking whether the div itself is displayed, but this method would also work for any control contained by the div (such as a submit or text input element).

The second test (ValidatorDisplayed) searches for the span element representing a RequiredFieldValidator by the validation text contained in the span.

Some caveats apply when using this method:

  • It works only with inline CSS styles (i.e., the element's style attribute). If the "display: none" style were applied using a CSS class defined in a style sheet, the WatiN Element.Style property would NOT automatically reflect it.
  • It doesn't determine true element visibility--only whether the element is displayed. Elements may also be hidden using "visibility: hidden", occluded by other elements with a higher z-index, or moved offscreen using the position style. (One possible extension for the visibility style is sketched below.)
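
To partially address that second caveat, the same walk up the hierarchy can also check the inline visibility style. This is only a sketch: it assumes WatiN's Style.GetAttributeValue exposes the inline visibility value, it still won't catch z-index or off-screen tricks, and strictly speaking a child can override a hidden parent's visibility, so treat it as an approximation.

    public bool IsShown(Element element)
    {
        // Treat both "display: none" and "visibility: hidden" as not shown.
        // Like IsDisplayed, this only sees inline styles on the element and its parents.
        if (string.Equals(element.Style.Display, "none") ||
            string.Equals(element.Style.GetAttributeValue("visibility"), "hidden"))
        {
            return false;
        }
        if (element.Parent != null)
        {
            return IsShown(element.Parent);
        }
        return true;
    }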

As a side note, I highly recommend that you grab the Internet Explorer Developer Toolbar and take some time to understand the WatiN page model before writing many tests.

Monday, April 27, 2009

Confusing Errors Using WCF Transport Security With Client Certificates

While prototyping a WCF service last week I ran into a number of confusing security-related errors on both the client and server. My setup was as follows:

  • Windows 2008 Server
  • IIS 7
  • .NET 3.5
  • basicHttpBinding/transport security/client certificates

Most of the errors that can result from this setup have been documented elsewhere. This detailed post by Imaya Kumar does a great job of walking through the overall configuration and explaining the error messages caused by various misconfigurations. However, I did run into one client error that wasn't specifically documented anywhere else:

System.ServiceModel.Security.MessageSecurityException: The HTTP request was forbidden with client authentication scheme 'Anonymous'. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.

This error actually indicates that your client certificate could not be validated by the server. In my case I was using self-generated client and root certificates, and I had inadvertently uninstalled the root certificate from my server.
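
For reference, a bare-bones client for this setup might look something like the sketch below. The contract, address, and certificate subject are placeholders; the point is the binding configuration and the SetCertificate call that supplies the client certificate the server then tries to validate. If the server can't build a trust chain for that certificate, the call fails with the misleading "Anonymous" 403 above.

    using System.Security.Cryptography.X509Certificates;
    using System.ServiceModel;

    // Placeholder contract standing in for the real service interface.
    [ServiceContract]
    public interface IMyService
    {
        [OperationContract]
        string Ping();
    }

    public static class ClientSetup
    {
        public static IMyService CreateClient()
        {
            // basicHttpBinding with transport security and client certificates
            var binding = new BasicHttpBinding(BasicHttpSecurityMode.Transport);
            binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Certificate;

            var factory = new ChannelFactory<IMyService>(
                binding, new EndpointAddress("https://server.example.com/MyService.svc"));

            // Store location, store name, and subject name are placeholders.
            factory.Credentials.ClientCertificate.SetCertificate(
                StoreLocation.CurrentUser, StoreName.My,
                X509FindType.FindBySubjectName, "my-client-cert");

            return factory.CreateChannel();
        }
    }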

A related gotcha is that setting a particular clientCertificate.certificateValidationMode for your service is meaningless when using transport security. For example, the "PeerTrust" value in the configuration snippet below is ignored. Client certificates always seem to be validated using the PeerOrChainTrust method when using transport security with the HTTP bindings.

    <behaviors>
      <serviceBehaviors>
        <behavior name="TransportBehavior">
          <serviceDebug includeExceptionDetailInFaults="true" httpHelpPageEnabled="false" httpsHelpPageEnabled="true" />
          <serviceMetadata httpsGetEnabled="true" httpGetEnabled="false" />
          <serviceCredentials>
            <clientCertificate>
              <authentication certificateValidationMode="PeerTrust" />
            </clientCertificate>
          </serviceCredentials>
        </behavior>
      </serviceBehaviors>
    </behaviors>
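
If you configure the service in code rather than config, the same knob lives on ServiceHost.Credentials. Here's a sketch, reusing the placeholder IMyService contract from the client sketch above with a trivial implementation; based on the behavior described here, it is presumably ignored under transport security just like the config attribute.

    using System;
    using System.ServiceModel;
    using System.ServiceModel.Security;

    // Trivial placeholder implementation of the IMyService contract sketched earlier.
    public class MyService : IMyService
    {
        public string Ping() { return "pong"; }
    }

    public static class HostSetup
    {
        public static ServiceHost StartHost()
        {
            // Same setup as the config above: transport security with client certificates.
            var binding = new BasicHttpBinding(BasicHttpSecurityMode.Transport);
            binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Certificate;

            var host = new ServiceHost(typeof(MyService), new Uri("https://localhost/MyService"));
            host.AddServiceEndpoint(typeof(IMyService), binding, "");

            // Programmatic equivalent of the certificateValidationMode attribute above.
            host.Credentials.ClientCertificate.Authentication.CertificateValidationMode =
                X509CertificateValidationMode.PeerTrust;

            host.Open();
            return host;
        }
    }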

Thursday, April 16, 2009

New Simple Savant Release

I've just released Simple Savant v0.2 at CodePlex. This version brings the library up to date with the latest Amazon SimpleDB features and also completes the baseline feature set. (See this post for an introduction to Simple Savant.) The new release adds support for:

  • Partial object operations
  • Scalar selects
  • BatchPutAttributes
  • Multi-valued command parameters

This release also includes a couple of breaking changes to reduce the potential for ambiguous or unsafe behavior in certain circumstances:

  • Select operations now return all available results by default. Previously Savant returned just the first page of results by default.
  • Savant now only limits the number of requests sent to SimpleDB on select operations rather than attempting to return a precise number of results. This seemed the best approach considering the potential confusion when using Savant with selects containing the "limit" keyword. Use SelectCommand.MaxResultPages to limit the number of result pages requested from SimpleDB in a single call (see the short sketch after this list).
  • DeleteAttribute requests are no longer used to delete null item properties on put operations. This not only degraded performance but could also lead to inconsistent data states. There are now two configuration options for null property handling: null item properties can either be ignored--meaning you become responsible for explicitly deleting attributes for null properties on put operations--or Savant can manage this for you by storing null properties as a single null character (\0) in SimpleDB.
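
As a brief illustration of the MaxResultPages option mentioned above (a sketch only, reusing the PersonItem mapping and savant instance from the examples below and the earlier introduction post):

    // Request at most one page of results from SimpleDB for this select.
    SelectCommand<PersonItem> command = new SelectCommand<PersonItem>("select * from Person");
    command.MaxResultPages = 1;
    SelectResults<PersonItem> firstPage = savant.Select(command);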

Partial Object Operations

Partial object operations are useful when you need to put, get, select, or delete a small number of item attributes. For example, if we were working with a Person domain and needed to update just the birth date and email address for a large number of people, we could do so with the code below. The example first puts new values for the two attributes and then gets just the birth date.

    PropertyValues values1 = new PropertyValues(person.Id);
    values1["BirthDate"] = new DateTime(2000, 1, 1);
    values1["EmailAddress"] = "mike@example.com";
    savant.PutAttributes<PersonItem>(values1);

    PropertyValues values2 = savant.GetAttributes<PersonItem>(person.Id, "BirthDate");

Scalar Selects

Scalar selects not only provide support for count(*) queries, but also make it easy to read a single scalar value with a minimal amount of code. In the example below we're selecting just the birth date for a person with a specific email address. If this method is used with a select query that returns extra items or attributes they will be silently discarded.

    DateTime? birthDate = (DateTime?)
        savant.SelectScalar<PersonItem>("select BirthDate from Person where EmailAddress = @EmailAddress",
                                        new CommandParameter("EmailAddress", "mike@example.com"));

BatchPutAttributes

Amazon's new BatchPutAttributes operation supports transactional puts of attributes for multiple items in the same domain. This is used automatically when you invoke SimpleSavant.Put() with multiple item parameters. For example, we could ensure that Tom, Dick, and Harry are either all in or all out of SimpleDB with the code below.

    PersonItem tom = new PersonItem();
    PersonItem dick = new PersonItem();
    PersonItem harry = new PersonItem();
    savant.Put(tom, dick, harry);

Multi-valued Command Parameters

Finally, it's now possible to provide multiple command parameter values for use with "in" clauses. The parameter values are expanded to a comma-delimited list for execution against SimpleDB. In the code snippet below we're using this feature to count the number of people in our domain whose height is denoted in inches or meters.

    CommandParameter parameter = new CommandParameter("HeightUnit", null);
    parameter.Values = new List<object> {HeightUnit.Inches, HeightUnit.Meters};
    int count = (int) savant.SelectScalar<PersonItem>("select count(*) from Person where HeightUnit in (@HeightUnit)",
                                                      parameter);

Wednesday, March 18, 2009

Simple Savant: .NET Object-Persistence Framework for Amazon SimpleDB

I'm building an application that stores all structured data using Amazon's SimpleDB service. When I started creating the overall architecture I searched for recommendations on designing applications specifically for SimpleDB or similar services. I didn't find many tips, but I did find lots of complaints about the disadvantages of SimpleDB when compared to mature RDBMS products. I also discovered the available .NET interfaces to SimpleDB were fairly low-level and didn't put much effort into overcoming these inherent deficiencies.

So I put together a list of the higher-level features that would simplify building an application with SimpleDB and built many of these features on top of the Amazon C# Library for SimpleDB. The result is the Simple Savant .NET library (written in C#) which I've open-sourced at CodePlex.

So what are the biggest hurdles when designing for SimpleDB vs an RDBMS? I came up with the following list:

  • No transactions.
  • No numeric or date/time types. All attributes are stored as text strings so special formatting is required to support sorting and searching.
  • Arbitrary truncation of query results.
  • Eventual consistency. Item modifications are not guaranteed to be immediately visible on successive requests.
  • No full-text searching.
  • Item attributes are limited to 1024 characters.

(You could throw in additional deficiencies on the administration/reporting side, but my focus here is on application design.)

The first release of Simple Savant addresses several of these fundamental issues and dramatically reduces the level of effort required to work with SimpleDB. Features include:

  • Mapping object properties to SimpleDB attributes.
  • Formatting of basic .NET data types to support lexicographical sorts and searches.
  • Support for ADO.NET-style parameterized select operations (including formatting and escaping of parameter values).
  • Unlimited select results in a single call.
  • Transparent caching on Get and Put operations to mitigate the effects of SimpleDB's eventual consistency model.
  • Automatic domain creation.

Using Simple Savant

Let's start by designing a class to hold information about a person:

    using System;

    namespace Coditate.Savant.ConsoleSample
    {
        [DomainName("Person")]
        public class PersonItem
        {
            [ItemName]
            public Guid Id { get; set; }

            public string FirstName { get; set; }

            public string LastName { get; set; }

            public string EmailAddress { get; set; }

            public DateTime BirthDate { get; set; }

            public float Height { get; set; }

            public float Weight { get; set; }

            public WeightUnit WeightUnit { get; set; }

            public HeightUnit HeightUnit { get; set; }

            public override string ToString()
            {
                return string.Format("\n\r\tName: \t\t{0}, {1}\n\r\tEmailAddress: \t{2}\n\r\tId: \t\t{3}", LastName,
                                     FirstName,
                                     EmailAddress, Id);
            }
        }

        public enum WeightUnit
        {
            Pounds,
            Kilograms
        }

        public enum HeightUnit
        {
            Inches,
            Meters
        }
    }

This is just about the simplest possible class we could use with Simple Savant. The ItemName attribute attached to PersonItem.Id is the only customization required for storing person instances in SimpleDB. It tells Simple Savant which property to use as the SimpleDB item name for Get, Put, and Delete operations.

The class-level DomainName attribute is optional. It lets us customize the SimpleDB domain where instances are stored. By default the class name is used for the domain name. Thus if we removed the DomainName attribute person instances would be stored in the "PersonItem" domain, but with the attribute they will be stored in the "Person" domain.

Next let's populate a PersonItem and put it in SimpleDB:

    PersonItem person = new PersonItem
        {
            BirthDate = new DateTime(1972, 1, 15),
            EmailAddress = "bob@example.com",
            FirstName = "Bob",
            Height = 72.5f,
            HeightUnit = HeightUnit.Inches,
            Id = Guid.NewGuid(),
            LastName = "Smith",
            Weight = 200,
            WeightUnit = WeightUnit.Pounds
        };

    string awsAccessKeyId = "xxxxxxx";
    string awsSecretAccessKey = "xxxxxxx";
    SimpleSavant savant = new SimpleSavant(awsAccessKeyId, awsSecretAccessKey);

    savant.Put(person);

Once we've populated our person object it takes just a few lines of code to configure Simple Savant and send our person to SimpleDB!

Here's what we'd find in SimpleDB after executing this code:

    ItemName:      c9748367-c929-408d-b7cf-60b3677717cf
    EmailAddress:  bob@example.com
    Height:        1072.5
    Weight:        1200
    HeightUnit:    Inches
    WeightUnit:    Pounds
    BirthDate:     1972-01-15T00:00:00.000-05:00
    FirstName:     Bob
    LastName:      Smith

By default property names are used for SimpleDB attribute names, but you can customize them using the AttributeName attribute. Other points of note:

  • Dates are formatted using the ISO 8601 standard to support lexicographical ordering.
  • Unsigned numeric types are zero-padded to support lexicographical ordering.
  • Signed numeric types are offset to support lexicographical ordering when positive and negative values are mixed together. The default offset value is 10 raised to the nth power, where n is the maximum number of whole digits supported by the type. This is why our height value of 72.5 is stored as 1072.5 and our weight value of 200 is stored as 1200. (Float or Single precision values are formatted with three whole and four decimal digits by default.) 
  • All properties are mapped to SimpleDB attributes by default. You can customize this behavior using the SavantInclude and SavantExclude attributes.
  • Property formatting can be customized using the CustomFormat and NumberFormat attributes. For example, we could easily customize PersonItem.Weight to be formatted with five whole digits and two decimal digits to increase the maximum possible weight value from 999 to 99,999. (A rough sketch follows this list.)
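
To give a feel for those knobs, here's a trimmed-down variant of the PersonItem class with a few of these attributes applied. The attribute names come from Savant's API as described above, but the exact constructor arguments shown here are assumptions; check the API documentation for the real signatures.

    using System;

    [DomainName("Person")]
    public class PersonItem
    {
        [ItemName]
        public Guid Id { get; set; }

        // Hypothetical argument: store this property under a shorter SimpleDB attribute name.
        [AttributeName("Email")]
        public string EmailAddress { get; set; }

        // Hypothetical arguments: five whole digits and two decimal digits,
        // raising the maximum storable weight from 999 to 99,999.
        [NumberFormat(5, 2)]
        public float Weight { get; set; }

        // Hypothetical extra property that we don't want persisted at all.
        [SavantExclude]
        public string DisplayName { get; set; }
    }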

Getting our person back from SimpleDB takes just a couple more lines of code:

    Guid personId = person.Id;
    PersonItem person2 = savant.Get<PersonItem>(personId);

If we added many people to our person domain and needed to select them all we could do so like this:

    IList<PersonItem> allPeople = savant.Select<PersonItem>("select * from Person");

Note that this query would return all items in the domain. Simple Savant requests all available pages of results from SimpleDB unless you explicitly limit results using properties on SelectCommand. See the API documentation for more details on this.

Finally, to run a more sophisticated range query finding all people born during the 1980s we can use a parameterized command:

    SelectCommand<PersonItem> command = new SelectCommand<PersonItem>("select * from Person where BirthDate between @StartDate and @EndDate");
    command.AddParameter(new CommandParameter("StartDate", "BirthDate", new DateTime(1980, 1, 1)));
    command.AddParameter(new CommandParameter("EndDate", "BirthDate", new DateTime(1989, 12, 31)));

    SelectResults<PersonItem> eightiesPeople = savant.Select(command);

The same formatting rules are applied to select parameters as when performing Get and Put operations on the Person domain. In other words, if we defined custom formatting behavior for PersonItem.BirthDate, the same formatting rules would be used for select operations involving the BirthDate attribute--and the query above would still just work.

Hopefully this post will help you get started using Simple Savant. The CodePlex release includes sample code and full API documentation that should keep you going.

UPDATE: Here is a brief tutorial on the new features in Simple Savant v0.2.

 