WebAPI Testing

In the past...

Before WebAPI we were implementing APIs using the Monorail MVC Framework (it was adopted in the application long before ASP.Net MVC had established a comparable set of features).

Monorail is very test friendly, but generally speaking our test approach was one of:

  • Constructing a controller by hand, or via some auto-registering IoC container
  • Stub/mock out the necessary mechanics of Monorail
  • Invoke the action methods directly, then check the returned values + state of the controller

This lets you focus on testing the controller in isolation, but ignores all the mechanics such as routing, filters etc.

End to end testing

When moving to WebAPI for the API implementation, we found it was in fact much easier to set up the entire pipeline (including all the DelegatingHandlers, routes etc.), execute a request and get a response. Here's the constructor for our base class for API tests:

Reset();
InitializeEnvironmentForPluginInstallation();

new RestAPIPluginInstaller().Install(helper);
new CoreRestResourcesPluginInstaller().Install(helper);

var host = IoC.Resolve();
jsonNetFormatter = host.JsonNetFormatter;
server = new HttpServer(host.Configuration);
client = new HttpClient(server);

And here's how a simple test looks:

[Fact]
public void Post_for_existing_package_throws_forbidden()
{
    var model = new CreateOrUpdateScriptPackageModel
    {
        Id = new Guid("FBA8F2E7-43E8-417E-AF4E-ADA7A4CF7A9E"),
        Name = "My Package"
    };

    HttpRequestMessage request = CreateRequest("api/scriptpackages", "application/json", HttpMethod.Post, model);

    using (HttpResponseMessage response = client.SendAsync(request).Result)
    {
        Assert.Equal(HttpStatusCode.Forbidden, response.StatusCode);
        Assert.Equal("POST can not be used for updates.", GetJsonMessage(response));
    }
}

The GetJsonMessage method in this case just extracts the error information from the JSON response.
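The helper itself isn't shown in the post; a minimal sketch, assuming the error body is a JSON object with a Message property and that JSON.Net is available, might look like:

```csharp
using System.Net.Http;
using Newtonsoft.Json.Linq;

static class JsonResponseHelpers
{
    // Hypothetical sketch of GetJsonMessage: assumes the error payload
    // is shaped like {"Message": "..."} and is parsed with JSON.Net.
    public static string GetJsonMessage(HttpResponseMessage response)
    {
        string json = response.Content.ReadAsStringAsync().Result;
        return (string) JObject.Parse(json)["Message"];
    }
}
```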

For tests returning full responses we used ApprovalTests with the DiffReporter - this proved incredibly productive.

[Fact]
[UseReporter(typeof (DiffReporter))]
public void Post_script_package_for_project()
{
    var project = new Project {Id = new Guid("59C1A577-2248-4F73-B55E-A778251E702B")};

    UnitOfWork.CurrentSession.Stub(stub => stub.Get(project.Id)).Return(project);

    authorizationService.Stub(stub => stub.HasOperations((TestScriptPackage) null, CoreOperations.Instance.TestManagement.ManageScripts)).IgnoreArguments().Return(true);

    var model = new CreateOrUpdateScriptPackageModel
    {
        Name = "My Package",
        ProjectId = project.Id
    };

    HttpRequestMessage request = CreateRequest("api/scriptpackages", "application/json", HttpMethod.Post, model);

    using (HttpResponseMessage response = client.SendAsync(request).Result)
    {
        Assert.Equal(HttpStatusCode.Created, response.StatusCode);
        Assert.Equal("application/json", response.Content.Headers.ContentType.MediaType);
        Approvals.Verify(response.Content.ReadAsStringAsync().Result);
    }
}

If you have not used ApprovalTests before, the magic occurs here:

Approvals.Verify(response.Content.ReadAsStringAsync().Result);

This gets the content of the response (JSON) as a string and then checks whether it matches our "golden master" - if it does not, you are shown a merge UI comparing the current test output to the golden master:

At this point you can:

  • Accept all the changes.
  • Fix what's broken and run the test again.

For this to work well you need to render your JSON with indentation enabled - and you need to ensure that however your serialization works, the order of the properties in the output is repeatable.

The JsonMediaTypeFormatter that ships with WebAPI has a property called Indent you can force to true for your testing in this case (we also have it wired up for debug builds).
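With JSON.Net, one way to get a repeatable property order is a contract resolver that sorts properties alphabetically, combined with indented output. This is an illustrative pattern - the class and method names here are assumptions, not the product's code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

// Illustrative resolver: emits properties in alphabetical order so the
// serialized output is stable from run to run, keeping approved files diffable.
public class OrderedContractResolver : DefaultContractResolver
{
    protected override IList<JsonProperty> CreateProperties(Type type, MemberSerialization memberSerialization)
    {
        return base.CreateProperties(type, memberSerialization)
            .OrderBy(p => p.PropertyName, StringComparer.Ordinal)
            .ToList();
    }
}

public static class ApprovalFriendlyJson
{
    // Indented, deterministically ordered output suited to approval testing.
    public static string Serialize(object value)
    {
        var settings = new JsonSerializerSettings
        {
            Formatting = Formatting.Indented,
            ContractResolver = new OrderedContractResolver()
        };
        return JsonConvert.SerializeObject(value, settings);
    }
}
```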

I think what's great about this approach is:

  • It's really easy
  • You catch errors you might miss if just checking parts of your response for consistency
  • You are reading your JSON output constantly from your application - I find this process extremely helpful - a lot of issues that were not picked up during the initial implementation/design were uncovered just by reviewing how we presented our resources in JSON
  • Did I mention it's really easy?!

Authentication

The creation of a test request was handled by a few helper methods on the API tests base class.

protected HttpRequestMessage CreateRequest(string url, string mthv, HttpMethod method, User user = null)
{
    var request = new HttpRequestMessage();
    request.RequestUri = new Uri(_url + url);
    request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue(mthv));
    request.Method = method;
    request.Properties["user"] = (user ?? currentUser);

    return request;
}

protected HttpRequestMessage CreateRequest<T>(string url, string mthv, HttpMethod method, T content, MediaTypeFormatter formatter = null, User user = null) where T : class
{
    var request = new HttpRequestMessage();
    request.RequestUri = new Uri(_url + url);
    request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue(mthv));
    request.Method = method;
    request.Content = new ObjectContent<T>(content, (formatter ?? jsonNetFormatter));
    request.Properties["user"] = (user ?? currentUser);
    return request;
}

Notice we inject the user into the request's Properties collection; this allowed us to bypass the need to set up basic auth headers etc. and avoid the additional overhead of mocking out the authentication of a user.

Is that it?

Pretty much - we certainly had more traditional unit tests for supporting parts of the API such as the help generation, view model mapping and filters/delegating handlers - but they were very standard; the actual API testing was all done through the methods described above...

And I think that's great news! I've worked with other technologies in the past where you could dedicate a series of posts to mocking out different aspects of the underlying framework mechanics - but in the case of the WebAPI there was no need, because it can easily be self-hosted without a whole lot of bother.

Next

Next in part 8 we take a look at the "anatomy of a plugin" - investigating how we implemented support for 3rd party developers to develop API controllers as part of a plugin for Enterprise Tester.

Long running tasks

Eventually most applications develop some mechanism for launching and tracking the progress of a task running asynchronously.

In Enterprise Tester this is normally seen as a progress dialog:

In the application this was handled in more than one way by different parts of the application, but through the API we saw an opportunity to unify these different methods.

API as plaster (spackle for Americans)

Applications over time grow and morph in often unforeseen ways, heading in directions you never originally imagined (and incidentally, this is part of the reason why our jobs as developers are so much fun).

The result is that you can often end up with multiple features that at first seem very different, until at some point a perception shift occurs and you realize they are in fact variations on the same feature.

At this point there's a strong desire to try and rectify the issue - but you're faced with some problems:

  • It's going to involve lots of work to align everything together.
  • Unless you plan to build further on this feature, it's difficult to justify any increase in value to the business.
  • If you are somewhat pragmatic, you may struggle to justify it internally as well.

But as an alternative to addressing the problem from the bottom up, when adding an API to your product you also have the option of addressing it at the API level - having the API take care of delegating to the appropriate implementation.

This is where the API then behaves as plaster, smoothing over the cracks and small imperfections in your implementation as it is exposed to the world of potential 3rd party developers.

But enough of the hypothetical - let's take a look at what we did for background tasks.

First we introduced a new layer of abstraction:

public interface IJobHandler : IModule
{
    string Key { get; }
    string Description { get; }
    string CreateJob(IDictionary parameters);
    ProgressReportDTO GetProgressReport(string jobId);
    bool CanHandle(string jobId);
}

This allowed a thin adapter to be created over the top of each background task implementation.

Next, in each implementation of this interface we created a composite key (under the hood most of the task implementations used a GUID identifier for tracking the progress of the job) which could be used to differentiate one handler's job IDs from another's:

public bool CanHandle(string jobId)
{
    if (!jobId.StartsWith(_keyPrefix))
    {
        return false;
    }

    if (ExtractId(jobId) == null)
    {
        return false;
    }

    return true;
}

The key prefix also has the bonus of allowing our background tasks to be identified by something a little more meaningful than a GUID, e.g. "reindex_task_B55C4A97-9731-4907-AF8F-13BB10A01C3A" - a small change, but a pleasant one.
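The ExtractId helper referenced by CanHandle isn't shown; a hedged sketch of what it might do, assuming the ID portion is a GUID appended after the key prefix (in the real handler the prefix would be the _keyPrefix field rather than a parameter):

```csharp
using System;

static class JobIdParsing
{
    // Hypothetical sketch of ExtractId: strips the handler's key prefix and
    // parses the remainder as a GUID, returning null when it isn't a valid one.
    public static Guid? ExtractId(string jobId, string keyPrefix)
    {
        string idPart = jobId.Substring(keyPrefix.Length);
        Guid parsed;
        return Guid.TryParse(idPart, out parsed) ? (Guid?) parsed : null;
    }
}
```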

Last of all, each job implementation wraps the progress results from the underlying implementation, mapping them into a common DTO.

IDictionary AdditionalProperties { get; }
IList Links { get; }

Like we do with other models returned from the API, we leverage dictionaries and JSON rewriting to handle adding additional information to the progress results.

Controller

Just for the sake of completeness, here is the controller action we now use for creating a task:

public HttpResponseMessage Post(CreateBackgroundTask dto)
{
    IJobHandler handler = _registry.GetByKey(dto.Type);

    string jobId = handler.CreateJob(dto.Parameters);

    ProgressReportDTO reportDto = handler.GetProgressReport(jobId);

    ViewBackgroundTaskModel wrapped = _viewModelMapper.Map(reportDto, Expands);

    HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.Created, wrapped);

    response.Headers.Location = new Uri(_urlTransformer.ToAbsolute(string.Format("~/api/backgroundtask/{0}", jobId)));

    return response;
}

When creating a new background task, we might get a result like this for example:

{
    "Complete": false,
    "TotalElements": 0,
    "ProcessedElements": 0,
    "StartedAt": "2012-08-06T11:28:45Z",
    "ProgressInPercent": 0.0,
    "Id": "ticketlinking_cba3035a-bf63-4006-89b1-b291aaac0460",
    "Message": null,
    "Self": "http://localhost/api/backgroundtask/ticketlinking_cba3035a-bf63-4006-89b1-b291aaac0460"
}

We can make additional GET requests to the Self URI to get progress updates; upon completion, the response contains additional information (including, in this case, a link to a new resource that was created as part of the execution of this background task).

{
    "Complete": true,
    "StartedAt": "2012-08-06T11:39:45Z",
    "FinishedAt": "2012-08-06T11:39:53Z",
    "ProgressInPercent": 1.0,
    "Id": "ticketlinking_9b01796c-a9ae-40cb-a6ad-a802346c0c33",
    "Message": "Completed",
    "IncidentId": "029b2c43-38be-4c94-b547-a0a50185fb9e",
    "Self": "http://localhost/api/backgroundtask/ticketlinking_9b01796c-a9ae-40cb-a6ad-a802346c0c33",
    "Links": [
        {
            "Href": "http://localhost/api/incident/029b2c43-38be-4c94-b547-a0a50185fb9e",
            "Rel": "Incident"
        }
    ]
}
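From a consumer's point of view, tracking a task to completion is a simple polling loop. A sketch, where the ProgressReport class simply mirrors a subset of the JSON above and is an assumption, not a published client type:

```csharp
using System.Net.Http;
using System.Threading;
using Newtonsoft.Json;

// Mirrors a subset of the progress JSON shown above (an assumption,
// not a published client type).
public class ProgressReport
{
    public bool Complete { get; set; }
    public double ProgressInPercent { get; set; }
    public string Message { get; set; }
    public string Self { get; set; }
}

public static class BackgroundTaskClient
{
    // Polls the task's Self URI once a second until it reports completion.
    public static ProgressReport WaitForCompletion(HttpClient client, string selfUri)
    {
        while (true)
        {
            string json = client.GetStringAsync(selfUri).Result;
            var report = JsonConvert.DeserializeObject<ProgressReport>(json);

            if (report.Complete) return report;

            Thread.Sleep(1000);
        }
    }
}
```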

What about SignalR?

Currently getting progress for a background task is done by polling the resource URL - we did investigate leveraging SignalR to make this work in a more real-time fashion, but struck a few issues:

  • Internally the underlying sources of the progress information didn't support progress change events - so we would still have had to poll internally.
  • Many of our clients would still end up polling because it's simpler to implement
  • The SignalR + WebAPI story wasn't very well developed - we did review the SignalR.AspNetWebApi project on github, but it wasn't being updated at the same pace as the ASP.Net Web API preview releases were hitting github.

We also investigated some other ideas - including PushStreamContent (which is now really easy to use in the RTM build of WebAPI), or trying to leverage WebBackgrounder (but that didn't really fit our needs).

Next

Next in part 7 we are going to take a look at the approach we took to testing our API (including end-to-end testing and Approval Tests).

Authentication - Session, Basic and OAuth

Authentication

Authenticating users of an API is very important - and thankfully the many available extension points within ASP.Net Web API make this really easy to implement.

We intended the API to be consumed both by the application itself and by clients armed with a username and password - so that meant:

  • Session based authentication
  • Basic Authentication

The application was already an OAuth provider (to support our OpenSocial gadgets) - so we also decided to adopt this for the API, allowing those gadgets to interact with the API as well (and allowing for delegated authentication scenarios).

Delegating Handler

Initially we attempted to support these three authentication methods through separate delegating handlers, but eventually abandoned that approach in favor of a single class that handled all three - here are the guts of determining which method to use:

Task<HttpResponseMessage> AuthenticateRequest(HttpRequestMessage request, CancellationToken cancellationToken)
{
    if (request.Properties.ContainsKey("user") && request.Properties["user"] != null)
    {
        return HandlePreAuthenticated(request, cancellationToken);
    }

    var context = request.GetHttpContext();

    if (request.Headers != null
        && request.Headers.Authorization != null
        && request.Headers.Authorization.Scheme != null)
    {
        if (request.Headers.Authorization.Scheme.Equals("basic", StringComparison.OrdinalIgnoreCase))
        {
            return HandleWithBasicAuthAuthentication(request, cancellationToken);
        }

        if (request.Headers.Authorization.Scheme.Equals("OAuth", StringComparison.OrdinalIgnoreCase))
        {
            return HandleWithOAuthAuthentication(request, cancellationToken, context);
        }

        return Task.Factory.StartNew(() => new HttpResponseMessage(HttpStatusCode.Unauthorized));
    }

    return HandleWithSessionAuthentication(request, cancellationToken, context);
}

So the approach taken was that:

  • If the request properties contains a user, we treat the request as pre-authenticated (used for testing mostly, more on that in a future post).
  • If there is an Authorization header, we check the scheme and perform either Basic or OAuth handling of the request.
  • Otherwise, we fall through to handling the request with session authentication.

The HttpContext (and its related abstractions) is fairly baked into parts of the pre-existing authentication infrastructure, so in many cases we need to extract it from the request to complete authentication. This has actually become much easier with each release of the WebAPI - the first WCF-based drops of the WebAPI made it almost impossible without spelunking into reflection over private fields.

All authentication methods would eventually end up associating an authenticated user with the request's properties via a SetIdentity method:

void SetIdentity(User user, HttpRequestMessage request)
{
    request.Properties.Add("user", user);
}

OAuth

Enterprise Tester uses DevDefined.OAuth - which includes support for the problem reporting extension that is part of OAuth 1 - this is exposed as a Report property on the OAuthException, which can then be used as the content of a response when authentication fails:

Task<HttpResponseMessage> HandleWithOAuthAuthentication(
    HttpRequestMessage request,
    CancellationToken cancellationToken,
    HttpContextBase context)
{
    var httpRequest = context.Request;

    try
    {
        User user = _authenticationService.AuthenticateRequest(httpRequest);

        SetIdentity(user, request);

        return base.SendAsync(request, cancellationToken);
    }
    catch (OAuthException authEx)
    {
        string reportAsText = authEx.Report.ToString();

        if (Logger.IsErrorEnabled) Logger.ErrorFormat(authEx, "OAuth Error occurred while authenticating OAuth request, url: {0}, method: {1}", httpRequest.Url, httpRequest.HttpMethod);

        return Task.Factory.StartNew(() => new HttpResponseMessage(HttpStatusCode.Forbidden) {Content = new StringContent(reportAsText)});
    }
    catch (Exception ex)
    {
        if (Logger.IsErrorEnabled) Logger.ErrorFormat(ex, "General Error occurred while authenticating OAuth request, url: {0}, method: {1}", httpRequest.Url, httpRequest.HttpMethod);

        var report = new OAuthProblemReport {Problem = OAuthProblems.PermissionUnknown, ProblemAdvice = "Encountered general error: " + ex.Message + " - please see application logs for more details"};

        string reportAsText = report.ToString();

        return Task.Factory.StartNew(() => new HttpResponseMessage(HttpStatusCode.Forbidden) {Content = new StringContent(reportAsText)});
    }
}

Async

Within the application we have a simple service for returning the "current user" associated with a request/thread:

public interface IUserContext
{
    User CurrentUser { get; }
}

With the WebAPI being asynchronous, the mechanics of this didn't work very well for us (the thread the DelegatingHandler executes on for authentication isn't necessarily the same thread that constructs the controller and executes the action).

To avoid too much rework we just implemented an ActionFilterAttribute that was applied to a base controller which all the "authenticated" controllers inherited from:

public class AssociateUserWithThreadFilterAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(HttpActionContext actionContext)
    {
        var service = IoC.Resolve();

        var user = actionContext.Request.Properties["user"] as User;

        if (user != null && service != null)
        {
            service.SetIdentity(user, Authorization.Everything);
        }

        base.OnActionExecuting(actionContext);
    }
}

It's not a beautiful solution, but it had no impact on our existing authentication implementation, which is great.

The one gotcha here is that if you return a Task as the result of a controller action, you need to take care of associating the user with the task's thread yourself.

So far we only return a Task from the POST methods handling the upload of attachments as a MIME multipart request, so this hasn't been too much of a problem to deal with.
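The general trick is to capture the user from the request before going asynchronous, so the continuation carries it in its closure rather than relying on thread-bound state. A minimal, framework-free illustration of why the closure approach survives the thread hop:

```csharp
using System.Threading.Tasks;

static class ClosureCaptureDemo
{
    // State captured in a closure travels with the continuation, whereas
    // anything stashed in per-thread state does not survive the thread hop.
    public static Task<string> ProcessWithUser(string userName)
    {
        string captured = userName; // capture before going async

        return Task.Run(() =>
        {
            // this may run on a different thread, but the closure
            // still carries the captured user
            return "processed by " + captured;
        });
    }
}
```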

Next

Next in part 6 we take a look at how we handled exposing long running tasks.

OData, TQL and Filtering

Drilling down

When using an API the consumer generally wants to be able to:

  • Access a subset of a large collection of resources (paging)
  • Order the set of resources they get back.
  • Filter the resources returned based on some criteria.
  • Aggregate across the resources matching the filter (at the very least get a total count back)

Enterprise Tester aims to provide ways to achieve all of this - the approach we took was fairly pragmatic:

  • Leverage existing query language investments (we have a query language in our application already, it makes sense to also expose this in the API)
  • For anything else, use the OData $filter functionality to filter it (we didn't want to invent another query language)

TQL Support

Query Language


TQL (Testing Query Language) is a domain-specific query language developed for searching and aggregating test information - worth a series of posts all on its own - and was definitely an existing investment we wanted to leverage when building out the product's API.

TQL Queries can be quite simple:

Status = Open

Or quite complex (the query below finds any requirements which are (indirectly) associated with bugs that have a Resolution of 'Wont Fix' and were raised in the last week).

EntityType = Requirement
AND Relationships IN {
    Destination IN {
        Type = Bug
        AND Resolution = 'Wont Fix'
        AND CreatedAt >= "-1 week"
    }
}
ORDER BY Package ASC, LastUpdatedAt DESC

The parsing of the Query Language is implemented in F# using FParsec (if you haven't looked at FParsec, then umm.. you should - I can't say enough good things about this library!)

We have so far not had to make any changes to the query language to make it more palatable for consumption from the API - I think a few things worked in our favor there:

  • Quoting strings is optional for words not containing whitespace, and you can use single or double quotes.
  • Encoding of strings follows the JSON conventions for escaping etc.
  • When implementing the parser we ensured it was whitespace insensitive - so the above query can also just be written on a single line.
  • We did not use symbols for AND and OR logical operators - so we avoided using ampersands [&] for AND
  • Having the query include ordering info avoided the need for a second order/sort query parameter

This allowed us to make it easy to search via the API without having to URL encode the query parameter in many cases.

Working against us, however, is Lucene itself - the query language allows performing a contains search using a tilde (~) operator:

Name ~ "defect bug"

Within the string being searched for we support the use of Lucene query parser syntax:

Name ~ "te?t" AND Description ~ "'REST API' && 'Curl Example'"

This can trip up people experimenting with the API directly in a browser, where not escaping these characters correctly can result in part of their query being parsed as a separate parameter - so far this hasn't really proven to be much of an issue.
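When building requests from code rather than a browser's address bar, letting the platform do the percent-encoding avoids this class of problem entirely. A hypothetical helper (BuildTqlUrl is not part of the product):

```csharp
using System;

static class TqlUrls
{
    // Hypothetical helper: percent-encodes the TQL query so characters such
    // as spaces, quotes and '?' survive the trip as a single query parameter.
    public static string BuildTqlUrl(string resourceUrl, string tql)
    {
        return resourceUrl + "?tql=" + Uri.EscapeDataString(tql);
    }
}
```

For example, building a URL for `Name ~ "te?t"` encodes the embedded `?` as `%3F`, so it can no longer be mistaken for the start of another query parameter.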

Controller

The implementation of a controller taking a TQL query was fairly simple:

public HttpResponseMessage Get(string tql = null)
{
    QueryResults results = _entitySearcher.Search(Request.RequestUri, tql, Skip ?? 0, Top ?? DefaultTopSearchResults);

    List wrapped = results.Items.Select(Wrap).ToList();

    var wrappedResults = new QueryResults
    {
        Items = wrapped
    };

    if (!NoInlineCount)
    {
        wrappedResults.Skip = results.Skip;
        wrappedResults.Top = results.Top;
        wrappedResults.Total = results.Total;
    }

    HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK, wrappedResults);

    return response;
}

The implementation defaults to including skip/top/total as well as next/prev/first/last links in the response - but we did provide a way for client consumers to exclude that information if they so desired (i.e. if implementing a search which is guaranteed to return 1 result) - by passing in the OData $inlinecount query parameter.

GET /api/automatedtests?tql=Name~Selenium&$inlinecount=none

We also exposed a search method, allowing you to execute TQL queries across all the entity types at once; the implementation of the controller there was similar - one thing we did do is leverage the "Expansions" dictionary every view model has to embellish it with the type of the search result (as a property called "EntityType").

wrappedResults.Items = results.Items.Select(result =>
{
    object mapped = _viewModelMapper.MapSearchResult(result, Expands);
    string type = QueryUtility.FormatTypeForDisplay(result.GetUnproxiedType());
    ((AbstractModel) mapped).Expansions.Add("EntityType", type);
    return mapped;
}).ToList();

In the case of search results we are dealing directly with NHibernate entities, which can be proxies - thus the call to .GetUnproxiedType().
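Conceptually, GetUnproxiedType unwraps NHibernate's generated proxy subclass to recover the mapped entity type. A sketch of the idea - the product's actual extension method may differ:

```csharp
using System;
using NHibernate.Proxy;

public static class EntityTypeExtensions
{
    // Sketch: NHibernate lazy-loading proxies are generated subclasses, so
    // GetType() on a proxy returns the proxy type - the proxy's lazy
    // initializer knows the real persistent class instead.
    public static Type GetUnproxiedType(this object entity)
    {
        var proxy = entity as INHibernateProxy;
        return proxy != null
            ? proxy.HibernateLazyInitializer.PersistentClass
            : entity.GetType();
    }
}
```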

OData

I always feel a little disingenuous referring to OData in our API docs - but unfortunately I don't know of a good term for what we are doing.

Our support for OData extends as far as filtering a set of results (GET requests) and goes no further - we certainly did not build an OData compliant API, or ensure the shape of our results conformed to something an OData consumer may expect.

The filtering specification outlined in OData though is incredibly useful to avoid inventing yet another query language unnecessarily (and was one of the draw cards for using WebAPI in the first place).

  • $expand
  • $filter
  • $inlinecount
  • $orderby
  • $skip
  • $top

Initially our collection resource GET methods looked like this (or in some cases with additional query parameters to identify the collection owner):

[Queryable]
public IQueryable Get()
{
    ...
}

But as we moved through the pre-releases of the WebAPI we hit a bit of a snag in that OData support was pulled from the beta - we knew the problem would eventually be rectified, but in the meantime we had code that didn't work any more - so we pulled the necessary pieces that made up OData execution from earlier builds and re-introduced support, and our controllers ended up like this:

public HttpResponseMessage Get()
{
    QueryResults results = ODataQueryExecutor.Execute(someQueryable, Request.RequestUri);
    HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK, results);
    return response;
}

And the code to handle executing the OData query and return the results:

public static class ODataQueryExecutor
{
    public static QueryResults Execute(IEnumerable items, Uri uri)
    {
        StructuredQuery structuredQuery = ODataQueryDeserializer.GetStructuredQuery(uri);

        IStructuredQueryPart inlineCount = structuredQuery.QueryParts.FirstOrDefault(p => p.QueryOperator == "inlinecount");

        var resultsPage = new QueryResults();

        if (inlineCount != null && inlineCount.QueryExpression == "allpages")
        {
            resultsPage.Total = ((IQueryable) ODataQueryDeserializer.Deserialize(items.AsQueryable(), GetQueryWithoutTopOrSkip(structuredQuery).QueryParts)).Count();

            resultsPage.Top = TryParseQueryPartAsInt(structuredQuery.QueryParts, "top");

            resultsPage.Skip = TryParseQueryPartAsInt(structuredQuery.QueryParts, "skip");
        }

        resultsPage.Items = ((IQueryable) ODataQueryDeserializer.Deserialize(items.AsQueryable(), structuredQuery.QueryParts)).ToList();

        resultsPage.SetSelfAndGenerateLinks(uri);

        return resultsPage;
    }
}
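The TryParseQueryPartAsInt helper referenced above isn't shown; a hedged sketch of what it likely does, with a minimal stand-in for the IStructuredQueryPart interface from the extracted OData sources:

```csharp
using System.Collections.Generic;
using System.Linq;

// Minimal stand-in for the interface from the extracted OData sources.
public interface IStructuredQueryPart
{
    string QueryOperator { get; }
    string QueryExpression { get; }
}

public static class QueryPartParsing
{
    // Hypothetical sketch: finds the query part for the given operator
    // ("top" or "skip") and parses its expression as an int, if present.
    public static int? TryParseQueryPartAsInt(IEnumerable<IStructuredQueryPart> parts, string queryOperator)
    {
        IStructuredQueryPart part = parts.FirstOrDefault(p => p.QueryOperator == queryOperator);

        int value;
        if (part != null && int.TryParse(part.QueryExpression, out value))
        {
            return value;
        }

        return null;
    }
}
```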

This allowed us to just do filtering, and get the results back in our familiar QueryResults form (so we ensure consistency with the equivalent TQL query responses).

Aggregation

The TQL (Testing Query Language) also features the ability to perform aggregations (Think group by, count, sum, average, faceted querying etc.) - these can be quite complex:

COUNT,
Min(LastUpdatedAt) AS "Start At",
Max(LastUpdatedAt) AS "Finished At",
GROUP BY Status {
    GROUP By Type {
        Count
    }
} AS "Statuses",
GROUP BY Path {
    SUM(EstimatedDuration) AS "Total Estimated",
    SUM(ActualDuration) AS "Total Actual",
    AVG(EstimatedLessActualDuration) AS "Average Remaining Time (Per Script)"
} AS "Packages",
FACETED OrArgs(Equal(Status,Failed), Equal(Status,Blocked)) AS "Failed OR Blocked" {
    Entities
} AS "Flagged"
WHERE EntityType = ScriptAssignment
AND Project = 'Project X'
AND Status != NotRun

Within the application this is rendered as a report; from the API, it returns a result like this:

{
    "Results": {
        "COUNT": 15,
        "Start At": "2011-11-01T02:51:57Z",
        "Finished At": "2011-12-29T20:48:48Z",
        "Statuses": {
            "Passed": {
                "GroupByType": {
                    "Smoke": {
                        "Count": 4
                    },
                    "Functional": {
                        "Count": 4
                    },
                    "User Acceptance": {
                        "Count": 1
                    }
                }
            },
            "InProgress": {
                "GroupByType": {
                    "Smoke": {
                        "Count": 1
                    },
                    "Functional": {
                        "Count": 2
                    }
                }
            },
            "Failed": {
                "GroupByType": {
                    "Regression": {
                        "Count": 1
                    },
                    "Functional": {
                        "Count": 2
                    }
                }
            }
        },
        "Packages": {
            "Execution Sets/Cycle 1/Reports": {
                "Total Estimated": "02:05:00",
                "Total Actual": "02:25:00",
                "Average Remaining Time (Per Script)": "-00:03:00"
            },
            "Execution Sets/Cycle 1 - regression testing": {
                "Total Estimated": "01:10:00",
                "Total Actual": "02:00:00",
                "Average Remaining Time (Per Script)": "-00:12:30"
            },
            "Execution Sets/Cycle 1/Transfers": {
                "Total Estimated": "01:10:00",
                "Total Actual": "04:45:00",
                "Average Remaining Time (Per Script)": "-00:28:20"
            }
        },
        "Flagged": {
            "Failed OR Blocked": {
                "Entities": [
                    {
                        "Id": "5b932463-6089-4bc9-9e23-a0b100e133b0",
                        "EntityType": "ScriptAssignment",
                        "Self": "http://localhost/EnterpriseTester/api/scriptassignment/5b932463-6089-4bc9-9e23-a0b100e133b0"
                    },
                    {
                        "Id": "39c54b0c-5340-4b93-ace9-a0b100e13519",
                        "EntityType": "ScriptAssignment",
                        "Self": "http://localhost/EnterpriseTester/api/scriptassignment/39c54b0c-5340-4b93-ace9-a0b100e13519"
                    },
                    {
                        "Id": "ac8e5dce-322c-4e05-9975-a0b100e13375",
                        "EntityType": "ScriptAssignment",
                        "Self": "http://localhost/EnterpriseTester/api/scriptassignment/ac8e5dce-322c-4e05-9975-a0b100e13375"
                    }
                ]
            }
        }
    }
}

To achieve this we needed to parse the query to determine if it was an aggregation, and then handle the search query differently in each case:

public HttpResponseMessage Get(string tql = null, string format = null)
{
    tql = tql ?? string.Empty;

    PresentationContext context = CreatePresentationContext();

    EnterpriseTester.Core.Search.AST.Query parsed = _queryParser.ParseQuery(tql);

    if (parsed.IsAggregatedExpression && !string.IsNullOrEmpty(tql))
    {
        return HandleAggregatedExpression(parsed, format, context);
    }

    return HandleQuery(parsed, format, context);
}

Aggregated search results are generated as a set of nested IDictionary instances - which actually serialize very well to JSON when using JSON.Net - all except our entities...

With the aggregated queries support in TQL, you can use functions such as COUNT, SUM(Field), AVG(Field) and so on. One of these functions is Entities - which just returns a list of all the entities matching the criteria. This is not something SQL does, but this isn't SQL now is it - so there is nothing stopping us from returning an array, as opposed to a single value, for any node in the tree of results.

By default the query stops collecting Entities once there are 25, but the limit can be increased if necessary. This feature is really useful when combined with a faceted search, where expressions and formulas can be used to calculate which entities to include, and where you don't expect a large number of matching entities.

So within the dictionary of results returned from the TQL query engine for an aggregated query, we may have a list of EntityInfo elements - one for each entity returned from an "Entities" aggregate function.

EntityInfo consists of the CLR Type and Guid ID for the entity - not something we want to expose in our API (but very useful for other parts of the application) - so to overcome this we pass a visitor over the dictionary of results to rewrite these into a form that's palatable for our API consumers:

HttpResponseMessage CreateDefaultAggregatedResponse(IDictionary results)
{
    var visitor = new ResolveEntityInfoDictionaryVisitor(entity =>
    {
        string entityType = QueryUtility.FormatTypeForDisplay(entity.EntityType);
        return new Dictionary
        {
            {"Id", entity.Id},
            {"EntityType", entityType},
            {AbstractModel.SelfProperty, _viewModelMapper.ResolveUrlForEntityTypeResource(entityType, entity.Id)}
        };
    });

    visitor.Visit(results);

    return Request.CreateResponse(HttpStatusCode.OK, new RawAggregatedQueryModel {Results = results});
}

So we ensure those aggregation results now have a URL pointing at the resource for that entity.

Next

Next in part 5 we take a look at how we handled Authentication, including support for Session, Basic and OAuth.

Generating API documentation

Out of the box API help

The WebAPI supports generation of a help page featuring descriptions of each controller / HTTP method etc. - even generated samples - I found this blog post (including a nice video) useful for understanding the process - and the feature has come a long way since it was first introduced.

For a green-fields application I would strongly recommend using it!

Here's one I prepared earlier...

However, for Enterprise Tester we already had an existing in-application help system in place, and so decided to also put our generated API documentation there as well.

Here's what the help system looks like:

We did initially look at harnessing what came out of the box with WebAPI, but decided given the way our JSON rewriting works, and some of our additional metadata such as expansions, that it would be easier to implement much of what we needed from scratch.

To do this we opted for a more localized approach - marking controllers and action methods up with attributes to provide the metadata to generate the documentation - this also provides a hint about the Expands it supports etc.

[HttpMethodInfo(
    Description = "Retrieves all (or a subset) of automated tests that are visible to the current user.",
    RequiredPermissions = new[] {CoreOperations.CoreOperationsTestManagementView.FullName},
    SupportsTQL = true)]
[HttpMethodSupportsStatusCode(HttpStatusCode.OK, "....description...")]
[HttpMethodSupportsStatusCode(HttpStatusCode.Forbidden, "....description...")]
[ExpandInfo(typeof (AutomatedTest), typeof (ViewAutomatedTestModel))]
public HttpResponseMessage Get(string tql = null)
{
    ...
}

Next, we wanted to provide multiple examples for each HTTP method - we did this by setting up a class called "ExampleInfo":

public class ExampleInfo
{
    public string Method { get; set; }
    public string Title { get; set; }
    public string Description { get; set; }
    public List RequestHeaders { get; set; }
    public List ResponseHeaders { get; set; }
    public List RequestParameters { get; set; }
    public object RequestModel { get; set; }
    public string RenderedRequest { get; set; }
    public object ResponseModel { get; set; }
    public string RenderedResponse { get; set; }
    public HttpStatusCode ResponseStatus { get; set; }
}

public class ParameterInfo
{
    public string Key { get; set; }
    public string Value { get; set; }
    public string Description { get; set; }
}

And then controllers (or their abstract base controllers...) could provide examples by implementing a static method on the controller itself, i.e.

static IEnumerable<ExampleInfo> GetPostExamples()
{
    yield return new ExampleInfo { .... };
}

The name of the method is not important - we match only on the type signature (so you can have as many example methods as you like, i.e. one per HTTP method etc.)
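That signature-based discovery could be implemented with a straightforward reflection query - an illustrative sketch, not the product's exact code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

public class ExampleInfo { /* stand-in for the class shown earlier */ }

public static class ExampleMethodFinder
{
    // Find all static methods on a controller type that take no parameters
    // and return IEnumerable<ExampleInfo>, regardless of their name.
    public static IEnumerable<MethodInfo> FindExampleMethods(Type controllerType)
    {
        return controllerType
            .GetMethods(BindingFlags.Static | BindingFlags.Public | BindingFlags.NonPublic)
            .Where(m => m.GetParameters().Length == 0
                        && typeof (IEnumerable<ExampleInfo>).IsAssignableFrom(m.ReturnType));
    }
}
```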

Also notice that the example info has both ResponseModel and RenderedResponse properties (and equivalents for the request) - rendered requests (strings) are useful when providing examples for methods supporting multipart MIME, but primarily we leveraged RequestModel/ResponseModel, which meant changes to examples would always reflect the current codebase.

Resource documentation

Given all this information we then generate a help topic that looks like this:

In addition we generate an index list of all available resources, providing a quick overview of the methods supported by each resource.

For collection resources, where the results can be filtered (either by filtering via OData's $filter query parameter, or via TQL - Testing Query Language) we also include a small label indicating which query type is supported.

Last of all (and arguably one of the most useful views for a developer building an API) is the permission view - though you should obviously have tests verifying the authorization restrictions applied to each method, it can be very useful to see them presented in a matrix - so for each resource, we can see what permissions are required.

Because of the underlying help system it's possible for 3rd party developers to further extend the content of these help topics or add new topics to further document the capabilities of the API.

Bulk Actions

Finally, beyond REST we also expose some existing functionality via JSON RPC, such as support for bulk actions.

Bulk actions within the application take a set of items (or a query returning a set of items) and then apply an action to that set - be it generating some form of export, performing a bulk move/update/delete etc.

We contemplated trying to translate these concepts into a RESTful context, but it didn't really make sense and so instead opted for a mechanism to start these background tasks, and then monitor their progress.

Metadata related to bulk actions was already available via the IoC container, so we implemented some additional help topic "providers" to generate additional topics necessary for examples of the various types of action you could execute through the API.

Next

Next in part 4 we take a look at OData, TQL and filtering of collection resources.
