Authentication - Session, Basic and OAuth

Authentication

Authenticating users of an API is very important - and thankfully the many extension points within ASP.Net Web API make this easy to implement.

We intended the API to be consumed both by the application itself and by clients armed with a username and password - so that meant:

  • Session based authentication
  • Basic Authentication

The application was already an OAuth provider (to support our OpenSocial gadget support) - so we also decided to adopt this for the API, thus allowing those gadgets to also interact with the API (and to allow for delegated authentication scenarios).

Delegating Handler

Initially we attempted to support these three authentication methods through separate delegating handlers, but eventually abandoned that approach in favor of a single class that handles all three - here are the guts of determining which method to use:

Task<HttpResponseMessage> AuthenticateRequest(HttpRequestMessage request, CancellationToken cancellationToken)
{
    if (request.Properties.ContainsKey("user") && request.Properties["user"] != null)
    {
        return HandlePreAuthenticated(request, cancellationToken);
    }

    var context = request.GetHttpContext();

    if (request.Headers != null
        && request.Headers.Authorization != null
        && request.Headers.Authorization.Scheme != null)
    {
        if (request.Headers.Authorization.Scheme.Equals("basic", StringComparison.OrdinalIgnoreCase))
        {
            return HandleWithBasicAuthAuthentication(request, cancellationToken);
        }

        if (request.Headers.Authorization.Scheme.Equals("OAuth", StringComparison.OrdinalIgnoreCase))
        {
            return HandleWithOAuthAuthentication(request, cancellationToken, context);
        }

        return Task.Factory.StartNew(() => new HttpResponseMessage(HttpStatusCode.Unauthorized));
    }

    return HandleWithSessionAuthentication(request, cancellationToken, context);
}

So the approach taken was that:

  • If the request properties contain a user, we treat the request as pre-authenticated (used mostly for testing - more on that in a future post).
  • If there is an Authorization header, we check the scheme and perform either Basic or OAuth handling of the request.
  • Otherwise, we fall through to handling the request with session authentication.
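The HandleWithBasicAuthAuthentication method itself isn't shown in the post - for context, a minimal sketch of decoding the Basic scheme's credential parameter might look like this (the BasicAuthParser class name is ours, not from the original code):

```csharp
using System;
using System.Text;

static class BasicAuthParser
{
    // Decodes the parameter of an "Authorization: Basic <base64>" header
    // into a (username, password) pair, per RFC 7617.
    public static Tuple<string, string> Decode(string parameter)
    {
        byte[] raw = Convert.FromBase64String(parameter);
        string pair = Encoding.UTF8.GetString(raw);
        int colon = pair.IndexOf(':');
        if (colon < 0) return null; // malformed credentials
        return Tuple.Create(pair.Substring(0, colon), pair.Substring(colon + 1));
    }
}
```

The decoded username/password would then be verified against the application's existing authentication service.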

The HttpContext (and its related abstractions) is fairly baked into parts of the pre-existing authentication infrastructure, so in many cases we need to extract it from the request to complete authentication. This has actually become much easier with each release of the WebAPI - the first WCF-based drops made it almost impossible to do without spelunking into reflection over private fields.

All authentication methods would eventually end up associating an authenticated user with the requests properties via a SetIdentity method:

void SetIdentity(User user, HttpRequestMessage request)
{
    request.Properties.Add("user", user);
}
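The post doesn't show how the handler is wired up; with Web API's message handler pipeline it would typically be registered at application startup, roughly like this (the handler class name here is an assumption):

```csharp
// Hypothetical startup wiring: register the single DelegatingHandler
// that performs all three authentication checks for API requests.
GlobalConfiguration.Configuration.MessageHandlers.Add(new ApiAuthenticationHandler());
```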

OAuth

Enterprise Tester uses DevDefined.OAuth - which includes support for the problem reporting extension that is part of OAuth 1.0 - this is exposed as a Report property on the OAuthException, which can then be used as the content of a response when authentication fails:

Task<HttpResponseMessage> HandleWithOAuthAuthentication(
    HttpRequestMessage request,
    CancellationToken cancellationToken,
    HttpContextBase context)
{
    var httpRequest = context.Request;

    try
    {
        User user = _authenticationService.AuthenticateRequest(httpRequest);

        SetIdentity(user, request);

        return base.SendAsync(request, cancellationToken);
    }
    catch (OAuthException authEx)
    {
        string reportAsText = authEx.Report.ToString();

        if (Logger.IsErrorEnabled) Logger.ErrorFormat(authEx, "OAuth Error occurred while authenticating OAuth request, url: {0}, method: {1}", httpRequest.Url, httpRequest.HttpMethod);

        return Task.Factory.StartNew(() => new HttpResponseMessage(HttpStatusCode.Forbidden) { Content = new StringContent(reportAsText) });
    }
    catch (Exception ex)
    {
        if (Logger.IsErrorEnabled) Logger.ErrorFormat(ex, "General Error occurred while authenticating OAuth request, url: {0}, method: {1}", httpRequest.Url, httpRequest.HttpMethod);

        var report = new OAuthProblemReport { Problem = OAuthProblems.PermissionUnknown, ProblemAdvice = "Encountered general error: " + ex.Message + " - please see application logs for more details" };

        string reportAsText = report.ToString();

        return Task.Factory.StartNew(() => new HttpResponseMessage(HttpStatusCode.Forbidden) { Content = new StringContent(reportAsText) });
    }
}

Async

Within the application we have a simple service for returning the "current user" associated with a request/thread:

public interface IUserContext
{
    User CurrentUser { get; }
}

With the WebAPI being asynchronous, the mechanics of this didn't work very well for us (the thread the DelegatingHandler executes on for authentication isn't necessarily the same thread that constructs the controller and executes the action).

To avoid too much rework we just implemented an ActionFilterAttribute that was applied to a base controller which all the "authenticated" controllers inherited from:

public class AssociateUserWithThreadFilterAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        var service = IoC.Resolve();

        var user = actionContext.Request.Properties["user"] as User;

        if (user != null && service != null)
        {
            service.SetIdentity(user, Authorization.Everything);
        }

        base.OnActionExecuting(actionContext);
    }
}

It's not a beautiful solution, but it had no impact on our existing authentication implementation - which is great.

The one gotcha here is that if you return a Task from a controller action, you need to take care of associating the user with the task's thread yourself.

So far we only return a Task from the POST methods handling the upload of attachments (as a mime multipart request), so this hasn't been much of a problem to deal with.

Next

Next in part 6 we take a look at how we handled exposing long running tasks.


OData, TQL and Filtering

Drilling down

When using an API the consumer generally wants to be able to:

  • Access a subset of a large collection of resources (paging)
  • Order the set of resources they get back.
  • Filter the resources returned based on some criteria.
  • Aggregate across the resources matching the filter (at the very least get a total count back)

Enterprise Tester aims to provide ways to achieve all of this - the approach we took was fairly pragmatic:

  • Leverage existing query language investments (we have a query language in our application already, it makes sense to also expose this in the API)
  • For anything else, use the OData $filter functionality to filter it (we didn't want to invent another query language)

TQL Support

TQL (Testing Query Language) is a domain-specific query language developed for searching and aggregating test information - worth a series of posts all on its own - but it was definitely an existing investment we wanted to leverage when building out the product's API.

TQL Queries can be quite simple:

Status = Open

Or quite complex (the query below would find any requirements which are indirectly associated with bugs raised in the last week that have a Resolution of 'Wont Fix').

EntityType = Requirement
AND Relationships IN {
    Destination IN {
        Type = Bug
        AND Resolution = 'Wont Fix'
        AND CreatedAt >= "-1 week"
    }
}
ORDER BY Package ASC, LastUpdatedAt DESC

The parsing of the Query Language is implemented in F# using FParsec (if you haven't looked at FParsec, then umm.. you should - I can't say enough good things about this library!)

We have not so far had to make any changes to the query language to make it more palatable for consumption from the API - I think a few things worked in our favor there:

  • Quoting strings is optional for words not containing whitespace, and you can use single or double quotes.
  • Encoding of strings follows the JSON conventions for escaping etc.
  • When implementing the parser we ensured it was whitespace insensitive - so the above query can also just be written on a single line.
  • We did not use symbols for AND and OR logical operators - so we avoided using ampersands [&] for AND
  • Having the query include ordering info avoided the need for a second order/sort query parameter

This allowed us to make it easy to search via the API without having to URL encode the query parameter in many cases.

Working against us is Lucene itself - the query language allows performing a contains search using the tilde (~) operator:

Name ~ "defect bug"

Within the string being searched for we support the use of Lucene query parser syntax:

Name ~ "te?t" AND Description ~ "'REST API' && 'Curl Example'"

This can trip up people experimenting with the API directly in a browser, where in some cases not escaping these characters correctly can result in part of the query being parsed as a separate parameter - so far this hasn't proven to be much of an issue.

Controller

The implementation of a controller taking a TQL query was fairly simple:

public HttpResponseMessage Get(string tql = null)
{
    QueryResults results = _entitySearcher.Search(Request.RequestUri, tql, Skip ?? 0, Top ?? DefaultTopSearchResults);

    var wrapped = results.Items.Select(Wrap).ToList();

    var wrappedResults = new QueryResults
    {
        Items = wrapped
    };

    if (!NoInlineCount)
    {
        wrappedResults.Skip = results.Skip;
        wrappedResults.Top = results.Top;
        wrappedResults.Total = results.Total;
    }

    HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK, wrappedResults);

    return response;
}

The implementation defaults to including skip/top/total as well as next/prev/first/last links in the response - but we did provide a way for client consumers to exclude that information if they so desired (i.e. if implementing a search which is guaranteed to return 1 result) - by passing in the OData $inlinecount query parameter:

GET /api/automatedtests?tql=Name~Selenium&$inlinecount=none
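The next/prev/first/last links are derived from the skip, top and total values; a minimal sketch of that arithmetic (the Paging helper name is ours, not from the product code) might be:

```csharp
using System;

// Hypothetical helper for deriving the skip offsets behind
// next/prev paging links from $skip, $top and the total count.
static class Paging
{
    // Returns the $skip value for the next page, or null on the last page.
    public static int? NextSkip(int skip, int top, int total)
    {
        int next = skip + top;
        return next < total ? (int?)next : null;
    }

    // Returns the $skip value for the previous page, or null on the first page.
    public static int? PrevSkip(int skip, int top)
    {
        return skip <= 0 ? (int?)null : Math.Max(0, skip - top);
    }
}
```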

We also exposed a search method, allowing you to execute TQL queries across all the entity types at once. The implementation of the controller there was similar - one thing we did do is leverage the "Expansions" dictionary every view model has, to embellish each result with the type of the search result (as a property called "EntityType").

wrappedResults.Items = results.Items.Select(result =>
{
    object mapped = _viewModelMapper.MapSearchResult(result, Expands);
    string type = QueryUtility.FormatTypeForDisplay(result.GetUnproxiedType());
    ((AbstractModel) mapped).Expansions.Add("EntityType", type);
    return mapped;
}).ToList();

In the case of search results we are dealing directly with NHibernate entities, which can be proxies - thus the call to .GetUnproxiedType().

OData

I always feel a little disingenuous referring to OData in our API docs - but unfortunately I don't know of a good term for what we are doing.

Our support for OData extends as far as filtering a set of results (GET requests) and goes no further - we certainly did not build an OData compliant API, or ensure the shape of our results conformed to something an OData consumer may expect.

The filtering specification outlined in OData though is incredibly useful for avoiding inventing yet another query language unnecessarily (and was one of the drawcards for using WebAPI in the first place) - the query parameters we handle are:

  • $expand
  • $filter
  • $inlinecount
  • $orderby
  • $skip
  • $top

Initially our collection resource GET methods looked like this (or in some cases with additional query parameters to identify the collection owner):

[Queryable]
public IQueryable Get()
{
    ...
}

But as we moved through the pre-releases of the WebAPI we hit a snag: OData support was pulled from the beta. We knew the problem would eventually be rectified, but in the meantime we had code that no longer worked - so we pulled out the pieces that made up OData execution in earlier builds and re-introduced support. Our controllers ended up like this:

public HttpResponseMessage Get()
{
    QueryResults results = ODataQueryExecutor.Execute(someQueryable, Request.RequestUri);
    HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK, results);
    return response;
}

And the code to handle executing the OData query and return the results:

public static class ODataQueryExecutor
{
    public static QueryResults Execute(IEnumerable items, Uri uri)
    {
        StructuredQuery structuredQuery = ODataQueryDeserializer.GetStructuredQuery(uri);

        IStructuredQueryPart inlineCount = structuredQuery.QueryParts.FirstOrDefault(p => p.QueryOperator == "inlinecount");

        var resultsPage = new QueryResults();

        if (inlineCount != null && inlineCount.QueryExpression == "allpages")
        {
            resultsPage.Total = ((IQueryable) ODataQueryDeserializer.Deserialize(items.AsQueryable(), GetQueryWithoutTopOrSkip(structuredQuery).QueryParts)).Count();

            resultsPage.Top = TryParseQueryPartAsInt(structuredQuery.QueryParts, "top");

            resultsPage.Skip = TryParseQueryPartAsInt(structuredQuery.QueryParts, "skip");
        }

        resultsPage.Items = ((IQueryable) ODataQueryDeserializer.Deserialize(items.AsQueryable(), structuredQuery.QueryParts)).ToList();

        resultsPage.SetSelfAndGenerateLinks(uri);

        return resultsPage;
    }
}

This allowed us to do just the filtering, and get the results back in our familiar QueryResults form (ensuring consistency with the equivalent TQL query responses).

Aggregation

The TQL (Testing Query Language) also features the ability to perform aggregations (Think group by, count, sum, average, faceted querying etc.) - these can be quite complex:

COUNT,
Min(LastUpdatedAt) AS "Start At",
Max(LastUpdatedAt) AS "Finished At",
GROUP BY Status {
    GROUP BY Type {
        Count
    }
} AS "Statuses",
GROUP BY Path {
    SUM(EstimatedDuration) AS "Total Estimated",
    SUM(ActualDuration) AS "Total Actual",
    AVG(EstimatedLessActualDuration) AS "Average Remaining Time (Per Script)"
} AS "Packages",
FACETED OrArgs(Equal(Status,Failed), Equal(Status,Blocked)) AS "Failed OR Blocked" {
    Entities
} AS "Flagged"
WHERE EntityType = ScriptAssignment
AND Project = 'Project X'
AND Status != NotRun

Within the application this is rendered as a report; from the API, the same query returns a result like this:

{
  "Results": {
    "COUNT": 15,
    "Start At": "2011-11-01T02:51:57Z",
    "Finished At": "2011-12-29T20:48:48Z",
    "Statuses": {
      "Passed": {
        "GroupByType": {
          "Smoke": {
            "Count": 4
          },
          "Functional": {
            "Count": 4
          },
          "User Acceptance": {
            "Count": 1
          }
        }
      },
      "InProgress": {
        "GroupByType": {
          "Smoke": {
            "Count": 1
          },
          "Functional": {
            "Count": 2
          }
        }
      },
      "Failed": {
        "GroupByType": {
          "Regression": {
            "Count": 1
          },
          "Functional": {
            "Count": 2
          }
        }
      }
    },
    "Packages": {
      "Execution Sets/Cycle 1/Reports": {
        "Total Estimated": "02:05:00",
        "Total Actual": "02:25:00",
        "Average Remaining Time (Per Script)": "-00:03:00"
      },
      "Execution Sets/Cycle 1 - regression testing": {
        "Total Estimated": "01:10:00",
        "Total Actual": "02:00:00",
        "Average Remaining Time (Per Script)": "-00:12:30"
      },
      "Execution Sets/Cycle 1/Transfers": {
        "Total Estimated": "01:10:00",
        "Total Actual": "04:45:00",
        "Average Remaining Time (Per Script)": "-00:28:20"
      }
    },
    "Flagged": {
      "Failed OR Blocked": {
        "Entities": [
          {
            "Id": "5b932463-6089-4bc9-9e23-a0b100e133b0",
            "EntityType": "ScriptAssignment",
            "Self": "http://localhost/EnterpriseTester/api/scriptassignment/5b932463-6089-4bc9-9e23-a0b100e133b0"
          },
          {
            "Id": "39c54b0c-5340-4b93-ace9-a0b100e13519",
            "EntityType": "ScriptAssignment",
            "Self": "http://localhost/EnterpriseTester/api/scriptassignment/39c54b0c-5340-4b93-ace9-a0b100e13519"
          },
          {
            "Id": "ac8e5dce-322c-4e05-9975-a0b100e13375",
            "EntityType": "ScriptAssignment",
            "Self": "http://localhost/EnterpriseTester/api/scriptassignment/ac8e5dce-322c-4e05-9975-a0b100e13375"
          }
        ]
      }
    }
  }
}

To achieve this we needed to parse the query to determine if it was an aggregation, and then handle the search query differently in each case:

public HttpResponseMessage Get(string tql = null, string format = null)
{
    tql = tql ?? string.Empty;

    PresentationContext context = CreatePresentationContext();

    EnterpriseTester.Core.Search.AST.Query parsed = _queryParser.ParseQuery(tql);

    if (parsed.IsAggregatedExpression && !string.IsNullOrEmpty(tql))
    {
        return HandleAggregatedExpression(parsed, format, context);
    }

    return HandleQuery(parsed, format, context);
}

Aggregated search results are generated as a set of nested IDictionary instances - which actually serialize very well to JSON when using JSON.Net - all except our entities...

With the aggregated query support in TQL, you can use functions such as COUNT, SUM(Field), AVG(Field) and so on. One of these functions is Entities - which just returns a List of all the entities matching the criteria. This is not something SQL does, but this isn't SQL - so there is nothing stopping us returning an array, as opposed to a single value, for any node in the tree of results.

By default Entities will stop being collected by the query after there are 25, but the limit can be increased if necessary. This feature is really useful when combined with a faceted search, where expressions and formulas can be used to calculate which entities to include, and where you don't expect a large number of matching entities.

So within the dictionary of results returned from the TQL query engine for an aggregated query, we may have a list of EntityInfo elements - one for each entity returned by an "Entities" aggregate function.

EntityInfo consists of the CLR Type and Guid ID of the entity - not something we want to expose in our API (though very useful in other parts of the application). To overcome this we pass a visitor over the dictionary of results to rewrite these entries into a form that's palatable for our API consumers:

HttpResponseMessage CreateDefaultAggregatedResponse(IDictionary<string, object> results)
{
    var visitor = new ResolveEntityInfoDictionaryVisitor(entity =>
    {
        string entityType = QueryUtility.FormatTypeForDisplay(entity.EntityType);
        return new Dictionary<string, object>
        {
            {"Id", entity.Id},
            {"EntityType", entityType},
            {AbstractModel.SelfProperty, _viewModelMapper.ResolveUrlForEntityTypeResource(entityType, entity.Id)}
        };
    });

    visitor.Visit(results);

    return Request.CreateResponse(HttpStatusCode.OK, new RawAggregatedQueryModel { Results = results });
}

This ensures those aggregation results include a URL pointing at the resource for each entity.

Next

Next in part 5 we take a look at how we handled Authentication, including support for Session, Basic and OAuth.


Generating API documentation

Out of the box API help

The WebAPI supports generation of a help page featuring descriptions of each controller/HTTP method etc. - even generated samples - I found this blog post useful for understanding the process (including a nice video) - and the feature has come a long way since it was first introduced.

For a green-fields application I would strongly recommend using it!

Here's one I prepared earlier...

However, for Enterprise Tester we already had an existing in-application help system in place, and so decided to also put our generated API documentation there as well.

Here's what the help system looks like:

We did initially look at harnessing what came out of the box with WebAPI, but decided given the way our JSON rewriting works, and some of our additional metadata such as expansions, that it would be easier to implement much of what we needed from scratch.

To do this we opted for a more localized approach - marking controllers and action methods up with attributes to provide the metadata needed to generate the documentation - this also provides a hint about the Expands each method supports etc.

[HttpMethodInfo(
    Description = "Retrieves all (or a subset) of automated tests that are visible to the current user.",
    RequiredPermissions = new[] {CoreOperations.CoreOperationsTestManagementView.FullName},
    SupportsTQL = true)]
[HttpMethodSupportsStatusCode(HttpStatusCode.OK, "....description...")]
[HttpMethodSupportsStatusCode(HttpStatusCode.Forbidden, "....description...")]
[ExpandInfo(typeof (AutomatedTest), typeof (ViewAutomatedTestModel))]
public HttpResponseMessage Get(string tql = null)
{
    ...
}

Next, we wanted to provide multiple examples for each HTTP method - we did this by setting up a class called "ExampleInfo":

public class ExampleInfo
{
    public string Method { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public List<ParameterInfo> RequestHeaders { get; set; }
    public List<ParameterInfo> ResponseHeaders { get; set; }
    public List<ParameterInfo> RequestParameters { get; set; }
    public object RequestModel { get; set; }
    public string RenderedRequest { get; set; }
    public object ResponseModel { get; set; }
    public string RenderedResponse { get; set; }
    public HttpStatusCode ResponseStatus { get; set; }
}

public class ParameterInfo
{
    public string Key { get; set; }
    public string Value { get; set; }
    public string Description { get; set; }
}

And then controllers (or their abstract base controllers...) could provide examples by implementing a static method on the controller itself, i.e.

static IEnumerable<ExampleInfo> GetPostExamples()
{
    yield return new ExampleInfo { .... };
}

The name of the method is not important - we match only on the type signature (so you can have as many example methods as you like, i.e. one per HTTP method etc.)

Also notice that the example info has both ResponseModel and RenderedResponse properties (and equivalents for the request) - rendered requests (strings) are useful when providing examples for methods supporting multi-part mime, but primarily we leveraged RequestModel/ResponseModel, which meant the examples would always reflect the current codebase.

Resource documentation

Given all this information we then generate a help topic that looks like this:

In addition we generate an index list of all available resources, providing a quick overview of what methods are supported by each resource.

For collection resources, where the results can be filtered (either by filtering via OData's $filter query parameter, or via TQL - Testing Query Language) we also include a small label indicating which query type is supported.

Last of all (and arguably one of the most useful views for a developer building an API) is the permission view - though you should obviously have tests verifying the authorization restrictions applied to each method, it can be very useful to see them presented in a matrix view - so for each resource, we can see what permissions are required.

Because of the underlying help system it's possible for 3rd party developers to further extend the content of these help topics or add new topics to further document the capabilities of the API.

Bulk Actions

Last of all, beyond REST, we also expose some existing functionality via JSON-RPC, such as support for bulk actions.

Bulk actions within the application take a set of items (or a query returning a set of items) and then applies an action to that set - be it generating some form of Export, performing a bulk move/update/delete etc.

We contemplated trying to translate these concepts into a RESTful context, but it didn't really make sense and so instead opted for a mechanism to start these background tasks, and then monitor their progress.

Metadata related to bulk actions was already available via the IoC container, so we implemented some additional help topic "providers" to generate additional topics necessary for examples of the various types of action you could execute through the API.

Next

Next in part 4 we take a look at OData, TQL and filtering of collection resources.


Expand implementation and view model mapping

What is Expand

Round-trips are the death of performance in many cases, and this is no different for APIs.

The web does scale out well - so there is certainly the option of making lots of simultaneous requests - but that doesn't take care of the problem of addressing those related resources: if you need to fetch back a resource's representation before you can construct additional requests for other resources, you are still faced with the issue of latency.

OData provides a mechanism for this via a URL containing the $expand query parameter. APIs for products such as Atlassian's Jira (a popular defect tracker) include an "expand" parameter which achieves the same thing, but uses a slightly different approach.

The API being presented here is not an OData compliant service - but certainly the Expand concept was a useful one we wanted to adopt.

Deep expansion

In addition to single-level expansion:

GET /api/scriptpackage/{id}?$expand=Children

Which might return:

{
  "Id": "1911ea14-3ede-46ca-bb1b-a0a80019f6cf",
  "ProjectId": "53cc97cd-7514-4465-b352-a0a80019f180",
  "Name": "Script Library",
  "OrderNumber": 2,
  "Expands": [
    "Children",
    "Parent",
    "Project",
    "Scripts"
  ],
  "Self": "https://localhost/api/scriptpackage/1911ea14-3ede-46ca-bb1b-a0a80019f6cf"
}

We wanted to support deeper expansion - so that something like:

GET /api/project/{id}/scriptpackages?$expand=Children.Children,Children.Scripts

Would return a script package (folder) with all its child packages, those child packages' children, and those child packages' scripts (test cases).

{
  "Id": "1911ea14-3ede-46ca-bb1b-a0a80019f6cf",
  "ProjectId": "53cc97cd-7514-4465-b352-a0a80019f180",
  "Name": "Script Library",
  "OrderNumber": 2,
  "Expands": [
    "Parent",
    "Project",
    "Scripts"
  ],
  "Children": [
    {
      "Id": "1fe2686a-eb17-485a-96b4-a0a80019f6cf",
      "ParentId": "1911ea14-3ede-46ca-bb1b-a0a80019f6cf",
      "Name": "Sprint 1",
      "OrderNumber": 0,
      "Expands": [
        "Parent",
        "Project"
      ],
      "Scripts": [...],
      "Children": [...],
      ...
    },
    ...
  ]
}

Notice that we advertise the available expansions as a property of the resource - this is a feature of the Atlassian Jira API we adopted (the list changes based on which expansions have already been applied).
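Supporting dotted paths like Children.Children means splitting the $expand value into the expansions that apply at the current level and those handed down to each child. A sketch of that splitting (the ExpandPaths helper is ours, illustrating the idea rather than the product code):

```csharp
using System;
using System.Linq;

// Hypothetical sketch of dotted-$expand handling: which expansions apply
// at the current level, and which are passed down into a given expansion.
static class ExpandPaths
{
    // "Children.Children,Children.Scripts" -> expansions for this level.
    public static string[] TopLevel(string expand)
    {
        return expand.Split(',')
            .Select(p => p.Trim().Split('.')[0])
            .Distinct()
            .ToArray();
    }

    // Expansions to apply inside the "parent" expansion, one level down.
    public static string[] Nested(string expand, string parent)
    {
        return expand.Split(',')
            .Select(p => p.Trim())
            .Where(p => p.StartsWith(parent + ".", StringComparison.Ordinal))
            .Select(p => p.Substring(parent.Length + 1))
            .ToArray();
    }
}
```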

Building a mapper

To allow expansion to be done correctly and at any depth, we needed to hand over construction of our view models to a third party - enter the view model mapper:

public interface IViewModelMapper
{
    void RegisterSearchResultConstructor<TFrom, TTo, TIntermediate>(
        Func<TFrom, TIntermediate> intermediateConstructor)
        where TFrom : class
        where TTo : AbstractModel
        where TIntermediate : class;
    void RegisterDefaultConstructors(Type[] from, Type to);
    void RegisterDefaultConstructor();
    void RegisterConstructor(Func constructor);
    void RemoveConstructor();
    void RegisterExpander(string expansionName, string resourceName,
        Func expansion);
    void RegisterExpander(Type fromType, Type toType, string expansionName,
        string resourceName, Func expansion);
    void RegisterEntityTypeAndIdToResourceResolver(Func urlFunc,
        params string[] entityTypes);
    void RemoveExpand(string expansionName);
    TTo Map<TFrom, TTo>(TFrom from, params string[] expansions)
        where TTo : AbstractModel;
    object MapSearchResult(object from, string[] expansions);
    object Map(object instance, Type targetType, params string[] expansions);
    IEnumerable GetExpandersFor(Type fromType, Type toType);
    string ResolveUrlForEntityTypeResource(string entityType, Guid id);
}

The implementation of this interface comprises a service where you can register:

  • Constructors - which are able to take a DTO/domain class/Tuple/whatever and construct a view model from it.
  • Expanders - a named expansion attached to a constructor
  • Map methods for mapping an instance to a view model, with a set of expansions to apply
  • Handling of special cases such as translating the results of a search to a suitable form for then translating into a view model

Given the plugin architecture used within the application, this provided the ability for plugins to add new Expand options to existing resources - so for example if a customer has the automated testing plugin enabled, then the script packages (folder) will also support expansions for the "AutomatedTests" collection of automated tests within that package.

As an example of how we register an expander - here is the code to register the expansion for the collection of steps associated with a script:

...

    mapper.RegisterExpander(
        "Steps", null, (input, expands) => RenderSteps(expands, input));
}

IList<StepModel> RenderSteps(string[] expands, EditScriptDto input)
{
    ExpandsUtility.AssertEmpty("Steps", expands);

    return (input.Steps ?? Enumerable.Empty())
        .Select(StepModel.CreateFrom)
        .OrderBy(model => model.OrderNumber).ToList();
}

In this case we are using Expand to avoid the cost of expanding a large collection (the steps for a test script/test case), which is part of the Script aggregate (believe it or not, there are testers out there writing test scripts with 300+ steps...).

Controllers

Within our API controllers we just call the mapping method and pass in the expands parameter:

var script = _scriptReportingService.GetScript(id);
var wrapped = _viewModelMapper.Map(script, Expands);
return Request.CreateResponse(HttpStatusCode.OK, wrapped);

The Expands property in this case just exposes a property associated with the current request:

protected virtual string[] Expands
{
    get { return (string[]) (Request.Properties["expand"] ?? new string[] {}); }
}

And this request property is captured by a simple DelegatingHandler that parses the query string for the various OData parameters - this approach made it a bit easier for other delegating handlers to access the information before the controller's methods are invoked.
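A sketch of the query-string parsing such a handler might perform (the ODataParams class and method names are ours, not the product's):

```csharp
using System;

// Hypothetical sketch of the DelegatingHandler's parsing step: pull the
// OData $expand parameter out of the request URI as a string array.
static class ODataParams
{
    public static string[] ParseExpand(Uri requestUri)
    {
        string query = requestUri.Query.TrimStart('?');
        foreach (string pair in query.Split(new[] {'&'}, StringSplitOptions.RemoveEmptyEntries))
        {
            string[] kv = pair.Split(new[] {'='}, 2);
            if (kv.Length == 2 && Uri.UnescapeDataString(kv[0]) == "$expand")
            {
                // "Children.Children,Children.Scripts" -> individual expand paths
                return Uri.UnescapeDataString(kv[1]).Split(',');
            }
        }
        return new string[0];
    }
}
```

The result would then be stashed in `request.Properties["expand"]` for the controllers to pick up.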

OData support in ASP.Net Web API

For those who have been working with the various releases of ASP.Net Web API since it originally targeted WCF, there have been quite a few breaking changes along the way, including to OData - which was introduced initially as a basic [Queryable] attribute that could be added to controller methods, and then later removed entirely, pending a new OData re-implementation.

Recently the Web API team announced greatly improved support for OData in the WebAPI - allowing the construction of entirely OData-compliant services - as a preview release on NuGet; the [Queryable] attribute is also back.

I believe this now includes support for $expand, which was previously missing, but I haven't yet had a chance to play with the latest release to confirm - and I'm not entirely sure it would have worked for our approach at any rate.

Next

Next, in part 3 of this series we take a look at how we generated API documentation.


Links, absolute URI's and JSON rewriting

Serialization

As part of the design for the resources being returned from our API we wanted to ensure they included some useful information - a link to the resource itself, related links, and the available expansions - so we are looking to return entities that look like this:
{
  "Id": "5b2b0ad0-5371-4abf-a661-9f410088925f",
  "UserName": "joeb",
  "Email": "joe.bloggs@test.com",
  "FirstName": "Joe",
  "LastName": "Bloggs",
  "Expands": [
    "Groups"
  ],
  "Self": "http://localhost:29840/api/user/5b2b0ad0-5371-4abf-a661-9f410088925f",
  "Links": [
    {
      "Title": "Group Memberships",
      "Href": "http://localhost:29840/api/user/5b2b0ad0-5371-4abf-a661-9f410088925f/groups",
      "Rel": "Groups"
    }
  ]
}

This inevitably means creating some kind of view model that you return from your API as the representation of the underlying resource (entity, aggregate etc.)

After some experimentation we landed on the idea of leveraging the capabilities of JSON.Net to perform on-the-fly JSON rewriting of our serialized entities.

This meant deriving our view models from this base class, and implementing the abstract "Self" property (to return a link to the entity itself) as well as supporting the links and expansions.

public abstract class AbstractModel
{
    public const string ExpansionsProperty = "__expansions__";
    public const string SelfProperty = "__self__";
    public const string LinksProperty = "__links__";

    protected AbstractModel()
    {
        Expansions = new Dictionary<string, object>();
        Links = new List<LinkModel>();
    }

    [JsonProperty(SelfProperty, NullValueHandling = NullValueHandling.Ignore)]
    public abstract string Self { get; }

    [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
    public string[] Expands { get; set; }

    [JsonProperty(ExpansionsProperty, NullValueHandling = NullValueHandling.Ignore)]
    public IDictionary<string, object> Expansions { get; set; }

    [JsonProperty(LinksProperty, NullValueHandling = NullValueHandling.Ignore)]
    public IList<LinkModel> Links { get; set; }
}
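A concrete view model then just derives from AbstractModel and implements Self - here is a hypothetical example (the class and property names are illustrative, not from the real API), returning a relative URL which the rewriting formatter later makes absolute:

public class UserModel : AbstractModel
{
    public Guid Id { get; set; }
    public string UserName { get; set; }
    public string Email { get; set; }

    // Relative link to this resource - the rewriting formatter
    // transforms it into an absolute URL during serialization.
    public override string Self
    {
        get { return "/api/user/" + Id; }
    }
}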


Each link was then represented by a LinkModel, which was a very simple class:
public class LinkModel
{
    [JsonProperty(NullValueHandling = NullValueHandling.Ignore)]
    public string Title { get; set; }

    public bool Inline { get; set; }

    public string Href { get; set; }

    public string Rel { get; set; }
}

If you look closely at the abstract model class above, you will see the properties use the names __expansions__, __self__ and __links__ - so when JSON.Net initially serializes the entity, those are the property names we get.
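For example, before the rewrite runs, the user representation from earlier would be serialized with the marker names - something along these lines (illustrative and trimmed):

```json
{
  "Id": "5b2b0ad0-5371-4abf-a661-9f410088925f",
  "UserName": "joeb",
  "__self__": "/api/user/5b2b0ad0-5371-4abf-a661-9f410088925f",
  "__links__": [
    {
      "Title": "Group Memberships",
      "Inline": false,
      "Href": "/api/user/5b2b0ad0-5371-4abf-a661-9f410088925f/groups",
      "Rel": "Groups"
    }
  ]
}
```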

We then extend the existing JsonMediaTypeFormatter to first serialize to a JToken, rewrite the token, and then write it out. This is an abstract class:

public abstract class AbstractRewritingJsonMediaTypeFormatter : JsonMediaTypeFormatter
{
    protected abstract JToken Rewrite(JToken content);

    public override Task WriteToStreamAsync(Type type, object value, Stream writeStream, HttpContent content, TransportContext transportContext)
    {
        if (type == null) throw new ArgumentNullException("type");
        if (writeStream == null) throw new ArgumentNullException("writeStream");

        if (UseDataContractJsonSerializer)
        {
            return base.WriteToStreamAsync(type, value, writeStream, content, transportContext);
        }

        return TaskHelpers.RunSynchronously(() =>
        {
            Encoding effectiveEncoding = SelectCharacterEncoding(content == null ? null : content.Headers);

            JsonSerializer jsonSerializer = JsonSerializer.Create(SerializerSettings);

            using (var tokenWriter = new JTokenWriter())
            {
                jsonSerializer.Serialize(tokenWriter, value);

                JToken token = tokenWriter.Token;

                JToken rewrittenToken = Rewrite(token);

                using (var jsonTextWriter = new JsonTextWriter(new StreamWriter(writeStream, effectiveEncoding)) { CloseOutput = false })
                {
                    if (Indent)
                    {
                        jsonTextWriter.Formatting = Formatting.Indented;
                    }

                    rewrittenToken.WriteTo(jsonTextWriter);

                    jsonTextWriter.Flush();
                }
            }
        });
    }
}


Which we then have a concrete implementation of:
public class JsonNetFormatter : AbstractRewritingJsonMediaTypeFormatter
{
    readonly IUrlTransformer _urlTransformer;
    readonly ExpandsRewriter _expandsRewriter;
    readonly SelfRewriter _selfRewriter;
    readonly LinksRewriter _linksRewriter;

    public JsonNetFormatter(IUrlTransformer urlTransformer)
    {
        if (urlTransformer == null) throw new ArgumentNullException("urlTransformer");
        _urlTransformer = urlTransformer;
        _expandsRewriter = new ExpandsRewriter();
        _selfRewriter = new SelfRewriter(_urlTransformer);
        _linksRewriter = new LinksRewriter(_urlTransformer);
    }

    protected override JToken Rewrite(JToken token)
    {
        _expandsRewriter.Rewrite(token);

        _selfRewriter.Rewrite(token);

        _linksRewriter.Rewrite(token);

        return token;
    }
}
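Wiring this up is then just a matter of swapping the default JSON formatter for our own during application start-up - a sketch, assuming some IUrlTransformer implementation called urlTransformer is at hand:

// Replace the stock JSON formatter with the rewriting formatter.
var config = GlobalConfiguration.Configuration;
config.Formatters.Remove(config.Formatters.JsonFormatter);
config.Formatters.Insert(0, new JsonNetFormatter(urlTransformer));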


By doing this we can then implement simple visitors which rewrite the JSON on the fly, looking for those special token names - for example, each of our links above has the property:

public bool Inline { get; set; }

If Inline is true, we actually "in-line" the link into the body of the representation (using Rel as the name of the property); if the link is not inline, we include it in the set of links.
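So, illustratively (the exact shape of the rewritten output is hypothetical here), a link marked as inline:

```json
{
  "__links__": [
    { "Title": "Group Memberships", "Inline": true, "Href": "/api/user/1/groups", "Rel": "Groups" }
  ]
}
```

might be hoisted out of the links collection and into the body, keyed by its Rel:

```json
{
  "Groups": "http://localhost:29840/api/user/1/groups"
}
```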

This rewriting process also takes care of rewriting relative API URLs to be absolute, so controllers largely don't need to care about resolving absolute URLs within representations themselves.
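The individual rewriters aren't shown in this post, but the core idea is a simple recursive visitor. Here is a self-contained sketch of the "self" rewrite; note it uses System.Text.Json's JsonNode (modern .NET) rather than JSON.Net's JToken, purely so the example runs without extra packages, and the shape of Rewrite is my own rather than the real SelfRewriter:

```csharp
using System;
using System.Text.Json.Nodes;

// Recursively walk the JSON tree: wherever a "__self__" marker property is
// found, rename it to "Self" and make a relative URL absolute.
JsonNode Rewrite(JsonNode node, string baseUrl)
{
    if (node is JsonObject obj)
    {
        if (obj["__self__"] is JsonValue self)
        {
            string href = self.GetValue<string>();
            obj.Remove("__self__");
            obj["Self"] = href.StartsWith("http", StringComparison.OrdinalIgnoreCase)
                ? href
                : baseUrl + href;
        }

        foreach (var property in obj)
        {
            if (property.Value != null) Rewrite(property.Value, baseUrl);
        }
    }
    else if (node is JsonArray array)
    {
        foreach (var item in array)
        {
            if (item != null) Rewrite(item, baseUrl);
        }
    }

    return node;
}

var token = JsonNode.Parse("{\"__self__\":\"/api/user/1\",\"UserName\":\"joeb\"}")!;
Rewrite(token, "http://localhost:29840");
Console.WriteLine(token.ToJsonString());
```

The real formatter simply chains several such visitors (expands, self, links) over the same token before writing it out.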

JSON rewriting does bring a cost with it, but so far we have found the cost to be very low - we are not transforming JSON strings, but serializing directly to tokens first and then converting the tokens to a string, which avoids the need to delve into reflection to achieve the same results.

Links


Our linking implementation is largely bespoke, but mirrors that of a hyperlink within a web page. Initially we just had Rel and Href properties, but after a while adopted a Title as well (similar to the Netflix links representation in XML).

Though REST as a term is used to describe the "type" of API implemented here, in fact the API (like most) falls into the camp of "REST'ish" rather than RESTful. Personally I'm a fan of HATEOAS, and it's a trait we would like to move closer towards - but it's certainly not a constraint our API must fulfil before we make it available for consumption.

There are some standards/proposals out there for links within JSON responses, but adopting them would have had a largely negative impact on the consumption of the API - making the representations more internally inconsistent in naming style and so on - for little gain, as the proposed standards don't make implementing a client any simpler at this stage.

The Links collection can be populated by external resources, but largely we have the model itself populate the set of available links upon construction.

Collection results

When returning collection results, if a collection was paged (more on paging in a future post, where we look at querying/OData) we also aimed to return the following links as part of the response:

* First page
* Last page
* Next page
* Previous page

The IANA provides a list of well-known link relations, including "first", "next", "last" and "prev" - so we adopted those values for the links in the API.
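Those relation values are referenced via a StandardRelations constants class in the paging code below - hypothetically nothing more than:

// Hypothetical reconstruction of the StandardRelations constants,
// using the IANA registered relation names.
public static class StandardRelations
{
    public const string First = "first";
    public const string Previous = "prev";
    public const string Next = "next";
    public const string Last = "last";
}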

public class QueryResults : AbstractModel
{
    public override string Self
    {
        get;
    }

    public int? Skip { get; set; }

    public int? Top { get; set; }

    public int? Total { get; set; }

    public IList Items { get; set; }

    public QueryResults SetSelfAndGenerateLinks(string self)
    {
        ...
    }

    public QueryResults SetSelfAndGenerateLinks(Uri uri)
    {
        ...
    }

    protected void AddLink(string rel, Uri uri, int start)
    {
        ...
    }
}

Notice we have a method for setting the Self URL in this case - this is because the code returning a set of query results may be decoupled from the controller, where knowledge of the current request URI exists.

Within the call to SetSelfAndGenerateLinks we have this code for adding the links, based on where we are in the set of results.

if (inMiddle || atStart)
{
    AddLink(StandardRelations.Next, uri, Math.Max(0, Skip.Value + Top.Value));
    AddLink(StandardRelations.Last, uri, Math.Max(0, Total.Value - Top.Value));
}

if (inMiddle || atEnd)
{
    AddLink(StandardRelations.Previous, uri, Math.Max(0, Skip.Value - Top.Value));
    AddLink(StandardRelations.First, uri, 0);
}
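The atStart/inMiddle/atEnd flags aren't shown above; here is a hypothetical, self-contained reconstruction of the whole calculation (the names and shape are mine, inferred from the snippet and the sample response):

```csharp
using System;
using System.Collections.Generic;

// Given Skip/Top/Total, decide which IANA relations to emit and the
// $skip offset each link should carry.
List<(string Rel, int Start)> GenerateLinks(int skip, int top, int total)
{
    var links = new List<(string Rel, int Start)>();

    bool atStart = skip == 0;
    bool atEnd = skip + top >= total;
    bool inMiddle = !atStart && !atEnd;

    if (inMiddle || atStart)
    {
        links.Add(("next", Math.Max(0, skip + top)));
        links.Add(("last", Math.Max(0, total - top)));
    }

    if (inMiddle || atEnd)
    {
        links.Add(("prev", Math.Max(0, skip - top)));
        links.Add(("first", 0));
    }

    return links;
}

// Skip=20, Top=25, Total=45 puts us on the last page, so only the
// "prev" and "first" links are generated.
foreach (var link in GenerateLinks(20, 25, 45))
{
    Console.WriteLine(link.Rel + " -> $skip=" + link.Start);
}
```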

And then when the response is rendered we might end up with something that looks like this:

{
  "Skip": 20,
  "Top": 25,
  "Total": 45,
  "Items": [
    ...
  ],
  "Self": "http://localhost:29840/api/search?tql=Name+~+test&$skip=20&$top=25",
  "Links": [
    {
      "Href": "http://localhost:29840/api/search?tql=Name+~+test&$skip=0&$top=25",
      "Rel": "prev"
    },
    {
      "Href": "http://localhost:29840/api/search?tql=Name+~+test&$skip=0&$top=25",
      "Rel": "first"
    }
  ]
}

Providing links like this can really simplify the implementation of clients and allows us to potentially change URI structure without causing as many headaches for API consumers.

Next

Next, in part 2 of this series we take a look at the implementation of Expand in the API, and how we mapped our entities to View models for the API.
