More Base4 mocking...

More mocking...

The title for this entry should probably be "more IRepository mocking" - as that's what I'm mocking out... but at any rate, let's get down and dirty... In this entry I will cover mocking out a "complex" method; incidentally, I think the size of this test suggests that our test isn't fine-grained enough (and that individual features within the data loader should be exposed so we can test them better) - but that's my problem, not yours ;o)


So, as some background, this is a test for a class called "DataLoader" - a most unimaginative name for a class that loads data... it's used for loading an XML file which has a bunch of info for different entities... if you recall the last entry, I was talking about simplistic music information; well, this class can be used to load musical data into the store, including something not mentioned so far, which is the "test suite" - the test suite defines tests which can be run against the music in the store. The details aren't really up for discussion, but it's for executing customer acceptance tests, not unit tests.

The test below checks one of the primary paths, whereby no pre-existing data is available, so it must all be created by the data loader first, before creating the customer acceptance test, which references the track, release etc. data (the definitions for customer acceptance test suites are also stored in the store, as are the results from running the customer acceptance tests).

Diving In...

So here is the test for loading a "test suite" for a single track, the song "Dirty Harry" by the "Gorillaz"; we are testing the behavior of loading the file when the track, release and artist do not already exist in the store - most of this is basic RhinoMocks usage (though I'm no expert, I'm probably misusing some of the features...).

[Test]
public void LoadDataFreshForSingleTrack()
{
    /**** RECORD ****/

    Environment environment = new Environment();
    SourceDevice sourceDevice = new SourceDevice();
    RecordingDevice recordingDevice = new RecordingDevice();
    Track track = new Track();

    ObjectPath artistPath = (Artist.Fields.Name == "Gorillaz");

    LastCall.Constraints(Property.Value("Name", "Gorillaz"));

    ObjectPath releasePath = (Release.Fields.Name == "Demon Days");

    LastCall.Constraints(Property.Value("Name", "Demon Days") &&
        Property.ValueConstraint("Artist", Property.Value("Name", "Gorillaz")));

    LastCall.Constraints(Property.Value("Name", "Dirty Harry") &&
        Property.ValueConstraint("Release", Property.Value("Name", "Demon Days")));

    ObjectPath encodingPath = (TrackContentEncoding.Fields.Name == "MP3");

    LastCall.Constraints(Property.ValueConstraint("Track", Property.Value("Name", "Dirty Harry")));

    ObjectPath trackReferencePath = (Track.Fields.Name == "Dirty Harry" &&
        Track.Fields.Release.Name == "Demon Days" &&
        Track.Fields.Release.Artist.Name == "Gorillaz");

    ObjectPath environmentPath = (Environment.Fields.Name == "Indoors");

    ObjectPath sourceDevicePath = (SourceDevice.Fields.Name == "Loopback");

    ObjectPath recordingDevicePath = (RecordingDevice.Fields.Name == "Loopback");

    LastCall.Constraints(Property.Value("Name", "One track, one loopback sample") &&
        Property.Value("Description", "Testing with 30 second snippets pulled from the source files with audacity (Loopback)") &&
        ItemListOf.NumberOfItems(1) && ItemListOf.IndexedItemConstraint(0, Is.Same(track)));

    /***** REPLAY *****/

    DataLoader loader = CreateLoader();

    loader.LoadData(OneTrackFile, ContentPath);
}


First you'll probably see that the callback and anonymous delegates from the last post are absent; they've been replaced with constraints... which has made the syntax a lot more concise - so far the Rhino Mocks support is pretty basic, we have two static classes being used for creating the constraints:

  • Base4Query - for constraints on ObjectQuery and ObjectPath arguments/properties.
  • ItemListOf - for constraints on IItemList arguments/properties.

The Constraints method in RhinoMocks expects the same number of arguments as the associated method call on the mock object. Constraints work on an argument's value; however, you can also apply constraints to the value of a single property of an argument, and those constraints can be logically AND'd or OR'd together - for instance here:

LastCall.Constraints(Property.Value("Name", "One track, one loopback sample") &&
    Property.Value("Description", "Testing with 30 second snippets pulled from the source files with audacity (Loopback)") &&
    ItemListOf.NumberOfItems(1) && ItemListOf.IndexedItemConstraint(0, Is.Same(track)));

We are applying these constraints:

  • The first argument should:
    • Have a property called "Name", and its value should be "One track, one loopback sample".
    • And have a property called "Description", and its value should be "Testing...." etc.
    • And have a property called "Tracks", which is of type IItemList, which:
      • Contains a total of 1 item
      • And the item at index 0 should:
        • Be the same as the instance returned from an earlier save call.
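Conceptually, each of those constraints is just a predicate over an argument, and AND-ing them composes the predicates. Here's a little Python sketch of the idea - toy names of my own, nothing to do with the actual RhinoMocks internals:

```python
class Constraint:
    """A predicate over an argument that can be AND-ed together with '&'."""
    def __init__(self, check):
        self.check = check

    def __and__(self, other):
        # The combined constraint passes only when both constraints pass.
        return Constraint(lambda arg: self.check(arg) and other.check(arg))

    def eval(self, arg):
        return self.check(arg)


def property_value(name, expected):
    """Constrain a single property of the argument (like Property.Value)."""
    return Constraint(lambda arg: getattr(arg, name, None) == expected)


class TestSuite:
    def __init__(self, name, description):
        self.name = name
        self.description = description


constraint = (property_value("name", "One track, one loopback sample") &
              property_value("description", "Testing with 30 second snippets"))

suite = TestSuite("One track, one loopback sample", "Testing with 30 second snippets")
print(constraint.eval(suite))  # True - both property constraints hold
```

The mock framework does essentially this at verification time: each recorded argument is run through the composed constraint, and a failure names the constraint that didn't hold.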

Notice that we have to call the method we want to apply constraints to first, so that we can address it with LastCall - a necessary evil when the method has a return type of void... otherwise you can use the more pleasant "Expect.Call(...)" convention.

Also, I'm declaring instances of the expected Base4 object paths, and then applying them in the constraints... I could in-line the object paths, but I then need to disambiguate the overloaded call to the FindOne method - so this is my preference:

ObjectPath trackReferencePath = (Track.Fields.Name == "Dirty Harry" &&
Track.Fields.Release.Name == "Demon Days" &&
Track.Fields.Release.Artist.Name == "Gorillaz");


or this:

Base4Query.PathEqual(Track.Fields.Name == "Dirty Harry" &&
Track.Fields.Release.Name == "Demon Days" &&
Track.Fields.Release.Artist.Name == "Gorillaz")).Return(track);

The second involves less code, but I think it's easy to get confused between the compile-time query language's logical operators and those of the RhinoMocks constraints... each has merits of course.


I'm still working through some of the finer points of the implementation (and so won't publish any code just yet...) but this should give you some ideas about how you can test Base4 interactions in your application with RhinoMocks, instead of hitting the Base4 server and testing the results... and I think it's a lot more intuitive when attempting to do TDD with Base4...

I would also like to make a special mention that this code won't actually work with the latest published release of Base4... there is an issue preventing the addition of items to an IItemList in some situations when you have not set a valid default context - there is a fix (which works well) in the Base4 trunk, and I assume Alex James will release this at some point... and thanks for resolving that issue so quickly Alex :) much appreciated.


Mocking out base 4....

Domain Driven Design Afterthoughts...

I just finished reading Jimmy Nilsson’s Applying Domain-Driven Design and Patterns [ADDDP] book... I actually read it cover to cover, something I’ve found difficult to compel myself to do with some of the other books that have been lying around my desk for a wee while now (such as Petzold’s “Applications = Code + Markup” – a great book to assault someone with in a dark alley)...

First off, I think this book is pretty good; it didn’t cover a lot of new ground for me, but it’s very down to earth, which I liked, and it’s encouraged me to have a read of Evans’s and Fowler’s more definitive works on the subject too... I also liked the fact that this book attempts to tie the whole story together, from rough sketch through to identifying the domain’s language, using TDD to build up your domain model, and even some of the gritty integration work, including evaluating O/RM features (using NHibernate as an example); it even looks into inversion of control containers (Spring sadly - it would’ve been great to have seen Castle get a mention) and finally AOP. On the down side – I think the book could’ve tackled the application of rules to your domain model a little better (I wanted to see more code) and, depending on your TDD knowledge, you might find that a couple of the chapters don’t really do much for you, as there’s little focus on the model so much as the key concepts of red, green, refactor...

At any rate, one thing I did keep rolling around in the back of my mind is just how Base4.Net fits into the “domain model” picture ... it’s difficult to nail down, there are plenty of mechanisms for implementing most of what you need to create a domain model, for instance:

  • Inheritance, though its support for discriminators in user types isn’t quite up to scratch – it only works via “ItemBase” at the moment – though I think Alex James mentioned that this would be implemented at some point... and though I haven’t tried, you could roll your own in some way.
  • Aggregates (through extended properties).
  • Various hooks (events) to allow for the application of custom behavior; for instance you could wire up to a BeforeSave event on a type and implement some custom validation rules... not that easy to test though.
  • A reasonable query abstraction, with good query support.
  • A “logical” transaction mechanism, suitable for supporting the concept of a “UnitOfWork”, though it’s explicit, rather than implicit, and based on the examples in the documentation this could be a little annoying to work with when you’re trying to persist the entire graph for an aggregate – however, using a similar implementation for “UnitOfWork” as Ayende does in the Rhino.Commons library for NHibernate, I’m sure I could get it all working nicely, without too much trouble.

And it sounds like I’m on to a winner... but I think what I struggle with is that the types in your schema are not really the focal point for your domain model, because they’re not POCO – unlike, say, a domain model implemented with NHibernate as the backing O/RM – you can’t enrich or decorate them with additional functionality all that easily... It’s not that you have to build your domain model this way, it’s just that’s the way I would like to do it, at least to satisfy myself that I’m not being railroaded into a bad design choice – but I think it’s the small blood price you pay for letting Base4 generate the schema assemblies for you; that loss of control is also a boon in immediate productivity when you start developing apps with Base4... I’ve come close while using ActiveRecord, but it’s still not the same.

So... given the restrictions I’m left to implementing additional abstractions for my domain model... which means using repositories and services for encapsulating the business logic... which, in turn, brings me to mocking...

Mocking out my Base4 Implementation...

Now, if you recall a while back I talked about my repository implementation... basically it let you do things like:

IRepository<Order> orderRepository = IoC.Resolve<IRepository<Order>>();

Order orderForApples = new Order();
OrderLine greenOnes = new OrderLine();
OrderLine redOnes = new OrderLine();


Big woop, but what I probably neglected to mention is that the repository is great for implementing a chain of generic decorators (which can be set up in your IoC container of course)... so at the bottom/base of the chain we may have our “Base4StorageRepository” and layered on top of that we might have various decorators (each injected with a dependency for the next repository in the chain) for implementing some useful concepts...

Things that spring to mind are:

  • Security
  • Validation
  • Logging

Being able to configure these things is quite useful, and there’s little stopping you deploying additional decorators as additional assemblies for an already installed product – just throw in some additional container configuration – and it’s a great deal more elegant than implementing this functionality with AOP.
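To make the decorator-chain idea a bit more concrete, here's a minimal sketch in Python (the class names are my own invention - the real thing is the generic IRepository in C#, with the chain wired up by the container):

```python
class Repository:
    """The repository contract (a stand-in for IRepository)."""
    def find(self, query):
        raise NotImplementedError


class Base4StorageRepository(Repository):
    """Bottom of the chain: this is the one that would talk to the server."""
    def find(self, query):
        return ["result for %s" % query]


class LoggingRepository(Repository):
    """A decorator, injected with the next repository in the chain."""
    def __init__(self, inner):
        self.inner = inner
        self.log = []

    def find(self, query):
        self.log.append("find(%s)" % query)  # do our bit...
        return self.inner.find(query)        # ...then delegate down the chain


# The IoC container would normally assemble this chain for us.
repository = LoggingRepository(Base4StorageRepository())
print(repository.find("Name='Dirty Harry'"))  # ["result for Name='Dirty Harry'"]
```

A security or validation decorator has exactly the same shape - do its bit, then delegate (or refuse to).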

But you do kind of paint yourself into a corner at the same time... this generic decorator pattern stops you from being able to decorate the repository with additional methods for implementing business-level functionality (because any decorations that are applied to your base repository will mask out methods and properties not present in the IRepository interface)... that’s fine though, I guess we just have to write the repository off as being more of a persistence mechanism; it’s really a logical separation of concerns anyway... you can decorate a "wrapper" at the top of the chain though (and this is how Ayende does it, but let’s ignore that for now ;o)

So... a higher-level entity for dealing with the business-level concerns is required... I call mine services; that may or may not sit right with you, but it makes reasonable sense in my application – and these services are injected as dependencies of the controllers re: MVC – yeah, this is a Monorail app (or at least, part of it is)...

These services often aggregate the features of multiple repositories, like this catalogue service below which deals with a simple music structure:

public class CatalogueService : ICatalogueService
{
    public CatalogueService(IRepository<Track> trackRepository, IRepository<Release> releaseRepository,
        IRepository<Artist> artistRepository, IRepository<Genre> genreRepository)
    {
        if (trackRepository == null) throw new ArgumentNullException("trackRepository");
        if (releaseRepository == null) throw new ArgumentNullException("releaseRepository");
        if (artistRepository == null) throw new ArgumentNullException("artistRepository");
        if (genreRepository == null) throw new ArgumentNullException("genreRepository");

        _trackRepository = trackRepository;
        _releaseRepository = releaseRepository;
        _artistRepository = artistRepository;
        _genreRepository = genreRepository;
    }

    // ...
}

In this case we have a dependency on four different repositories...

For this example we have a pretty simple schema... with a child-parent relationship between Track, Release and Artist... and a Many to Many relationship between Tracks and Genres...

Track -> Release -> Artist
Track(s) <-> Genre(s)
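If it helps, that shape sketched out in Python (hypothetical classes, purely to illustrate the relationships):

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Artist:
    name: str


@dataclass
class Release:
    name: str
    artist: Artist  # child-parent: every release belongs to an artist


@dataclass
class Genre:
    name: str


@dataclass
class Track:
    name: str
    release: Release  # child-parent again
    genres: List[Genre] = field(default_factory=list)  # many-to-many (one direction shown)


gorillaz = Artist("Gorillaz")
demon_days = Release("Demon Days", gorillaz)
dirty_harry = Track("Dirty Harry", demon_days, [Genre("Alternative")])

# Walking the child-parent chain mirrors Track.Fields.Release.Artist.Name
print(dirty_harry.release.artist.name)  # Gorillaz
```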

The catalogue service implements the business rules for dealing with the catalogue; in some cases this is no more than querying the associated repository... so for getting a list of tracks for a particular artist we have this (and yeah, I know it’s pretty daft):

public PagedItemList ListTracksForArtist(Artist artist, int pageSize, int pageNumber)
{
    ObjectQuery query = new ObjectQuery(typeof(Track));

    query.Path = (Track.Fields.Release.Artist.ID == artist.ID);
    query.Path.AddOrderBy("Name", OrderByDirection.Ascending);

    return _trackRepository.Find(query, pageSize, pageNumber);
}

I say daft because it doesn’t support sorting and filtering by a query... but it’s here to illustrate a point, and until the customer actually asks for these features I’m not going to bother building them :P

At any rate, the point is not to critique the service, but instead how can we test this catalogue service without being connected to a Base4 server... and of course it’s RhinoMocks to the rescue!

Mocking with RhinoMocks...

So here’s the guts of the test in mid-refactoring ... post red-green for those sticklers for the rules ;o) (there’s still plenty yet to clean up, but it would muddy the waters a bit for this example I think...)
[Test]
public void ListReleasesForArtist()
{
    Artist artist = new Artist();

    PagedItemList releases = new PagedItemList(new ItemList(), 1, 10, 20);

    Func<ObjectQuery, int, int, bool> callback =
        delegate(ObjectQuery query, int pageNumber, int pageSize)
        {
            ObjectPath path = Release.Fields.Artist.ID == artist.ID;

            Base4Assert.ArePathsEqual(path, query.Path);
            Base4Assert.AreScopesEqual(new string[] { "Artist" }, query.Scope);
            Assert.AreEqual(1, pageNumber, "pageNumber");
            Assert.AreEqual(10, pageSize, "pageSize");
            return true;
        };

    Expect.Call(_releaseRepository.Find(null, 1, 10)).Callback(callback).Return(releases);

    ICatalogueService service =
        new CatalogueService(_trackRepository, _releaseRepository, _artistRepository, _genreRepository);

    PagedItemList results = service.ListReleasesForArtist(artist, 1, 10);

    Assert.AreSame(releases, results);
}


Pretty chunky I know - but as more tests are added there will be opportunities for removing some of that duplicated effort... however the key points to take away are:

  • We don’t need to have the base4 service running.
  • We are actually testing the catalogue service’s interactions with the repositories, instead of relying on detecting expected side effects in the underlying storage.
  • We’re verifying the object path and scope for the query, as well as paging information, and insulating ourselves from difficult to detect changes (like forgetting to apply an object scope, which may have a severe impact on performance).
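For comparison, the same interaction-testing shape in Python, using the standard unittest.mock library - the service below is a hypothetical toy port, not my actual C# code:

```python
from unittest.mock import Mock

class CatalogueService:
    """Toy port of the service: builds a query and hands it to the repository."""
    def __init__(self, release_repository):
        self.release_repository = release_repository

    def list_releases_for_artist(self, artist_id, page_number, page_size):
        query = "Release.Artist.ID == %s" % artist_id
        return self.release_repository.find(query, page_number, page_size)


releases = ["Demon Days"]
release_repository = Mock()
release_repository.find.return_value = releases  # the stubbed repository result

service = CatalogueService(release_repository)
results = service.list_releases_for_artist(42, 1, 10)

# Verify the interaction with the repository, not a side effect in storage.
release_repository.find.assert_called_once_with("Release.Artist.ID == 42", 1, 10)
print(results is releases)  # True
```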

Just to complete the story, the mock repositories were created in the SetUp (as we use them for every test case)... here's the code for it:

[SetUp]
public void SetUp()
{
    _mockRepository = new MockRepository();

    _trackRepository = _mockRepository.CreateMock<IRepository<Track>>();
    _releaseRepository = _mockRepository.CreateMock<IRepository<Release>>();
    _artistRepository = _mockRepository.CreateMock<IRepository<Artist>>();
    _genreRepository = _mockRepository.CreateMock<IRepository<Genre>>();
}
It does not stop us from incorrectly spelling an object scope; I think compile-time query support for ordering and scoping of an object query will help to make this a little more robust... small potatoes.

The callback in this case is a bit of a “bad smell” – there’s support in Rhino Mocks for parameter constraints, but the object paths and scopes are a little too complex to test using the out-of-the-box ones... though you can get surprisingly close, they are pretty powerful... but I believe you can write custom constraints yourself – which is something I’ll do for the next post (I’ve never done it before; I’m guessing it’s easy) and hopefully that will reduce the complexity of these tests quite a bit, replace the less concise anonymous delegate, and, more importantly, make it easy to develop the catalogue service in a test-driven manner.

For the more observant among you, you may have noticed “Base4Assert” as well – that’s a little static helper class I’m using in these tests... it’s not perfect, but it works for simple cases, including things like multi-level scopes, and gives meaningful failure messages like “expected scope ‘Release.Artist’ but found nothing.” or "expected OrderBy Name Descending, but found OrderBy Name Ascending"... which can quickly narrow down problems for you, especially if you’re writing these tests first (which is the whole point of this exercise I feel).
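There's no magic in a helper like that; if you wanted to roll your own, the mechanics are roughly this (a Python sketch - the names and message format are just my guess at what such a helper might look like):

```python
def assert_scopes_equal(expected, actual):
    """Compare two scope lists, failing with a message that names the gap."""
    for scope in expected:
        if scope not in actual:
            found = ", ".join(actual) if actual else "nothing"
            raise AssertionError(
                "expected scope '%s' but found %s." % (scope, found))


assert_scopes_equal(["Artist"], ["Artist"])  # passes silently
try:
    assert_scopes_equal(["Release.Artist"], [])
except AssertionError as error:
    print(error)  # expected scope 'Release.Artist' but found nothing.
```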

Conclusion... For Now ;o)

The last thing I’ve done is to toss away a lot of the additional query overloads in my original repository design in favor of just using a single ObjectQuery parameter, with and without page number and page size (returning a PagedItemList when providing a page number and page size, and returning an IItemList when querying without them) – it makes everything a lot more... predictable, and is a lot cleaner when you start considering generic decorator implementations (10 overloads is 10 more code paths that need testing in a decorator...). Looking toward the future, it seems conceivable that a compile-time query will allow you to construct the entire object query, not just the unordered path, and I’m quite happy to build these up in a few lines of code before passing them to a find method for now.
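The "one query, with or without paging" shape is easy to picture - here's a Python sketch of the idea (every name here is invented for illustration):

```python
class PagedItemList(list):
    """An item list plus its paging information."""
    def __init__(self, items, page_number, page_size, total):
        super().__init__(items)
        self.page_number = page_number
        self.page_size = page_size
        self.total = total


class InMemoryRepository:
    """Just two entry points: a query, or a query plus paging."""
    def __init__(self, items):
        self.items = items

    def find(self, predicate):
        # Unpaged: return a plain item list.
        return [item for item in self.items if predicate(item)]

    def find_paged(self, predicate, page_number, page_size):
        # Paged: the same query, wrapped up with its paging information.
        matches = self.find(predicate)
        start = (page_number - 1) * page_size
        return PagedItemList(matches[start:start + page_size],
                             page_number, page_size, len(matches))


repo = InMemoryRepository(range(25))
page = repo.find_paged(lambda i: i % 2 == 0, 2, 5)
print(list(page), page.total)  # [10, 12, 14, 16, 18] 13
```

Two entry points means a decorator only has two code paths to intercept and test, rather than ten.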

Edit (Sunday 12th):

In a similar vein, I just noticed Ayende's great MSDN article this evening - which covers that whole IRepository story, including generic decorators and most importantly the intricacies of registering and chaining these components in the windsor container... great stuff!


Something to pass the time...

Nothing particularly interesting to report; while I'm sitting here watching a custom-built automated test suite chug away (which incidentally takes approximately 5 hours to run... at 100% CPU usage on 1 core... ugh) - I thought I'd muck around with IronPython and Base4... my inspiration was the "on the fly" schema creation on the Base4 site; however, I wanted to just grab an existing assembly (from the server no less) and start using its types in IronPython... it's pretty easy (and quite cool - think about the implications of no references: every time you run your app the schema might have magically grown some more features ;o)

Getting Started...

First off, let's fire up the trusty IronPython console... I like colours and being able to tab through member lists, so I'm going to throw in some command line parameters...

ipy -X:TabCompletion -X:ColorfulConsole

Cool, the console's up and running, so let's get down to business... first things first, loading up the Base4.Storage assembly...

import sys
import clr
sys.path.Add("C:\\Program Files\\Base4 Solutions Ltd\\Base4 version")
from Base4.Storage import *

All done... now, before we can do anything useful, we'll need to set up our default connection string (read: default context)...


This is the Amiga speaking...

Now the fun begins... if you have System.Speech available, grab that puppy too... because deep down we all know that the thrill of making your computer talk still exists... especially if you started out your days on an Apple or better yet Amiga...

from System.Speech.Synthesis import *
synth = SpeechSynthesizer()

Sweet, now lets view (or listen) for all the available schema assemblies...

for asmFile in StorageContext.FindAll[SchemaAssemblyFile]():
    print asmFile.Name
    synth.Speak(asmFile.Name)

Loading the schema...

At this point you'll end up with a list of the assembly files (and hopefully, your machine will be droning away, notifying you of just what they're called... ;o) - in my case I have a little "Experiment.dll", so let's load that up... I'm pressed for time, so I won't be putting in speech detection for selecting the appropriate schema, though it's both entirely possible and pretty easy to do...

asmFile = StorageContext.FindOne[SchemaAssemblyFile]("Name='Experiment.dll'", "")
assembly = asmFile.LoadAssembly()
from Experiment import *

And yes, the strongly typed schema is now there for me to play with...

The experiment schema contains, among other things, an "Order" type, which holds a collection of "OrderLine" lines... so we can now do this:

order = Order()
line1 = OrderLine()
line1.UnitCost = 30
line1.NumberOfUnits = 2

I think it makes quite a handy diagnostic tool, especially when you start considering that your dev, test and live environments may all have different schema versions... it could certainly save a lot of stuffing around - it would be fairly tedious to repeat this exercise in C# by comparison (especially once you've written a few useful helper functions in your own "base4" python module... I could shorten this entire post to 3 or 4 lines).


IronPython on ASP.Net

IronPython on ASP.Net ?

Yes, that's right... dynamic language support in ASP.Net... and some interesting things I didn't know existed in ASP.Net already, such as no-compile pages - well worth a skim through the whitepaper - it's an interesting approach to bringing dynamic language support to ASP.Net - and it would certainly work for Ruby on the CLR as well... probably better, actually, because Ruby's statements are balanced and so you have less problem with whitespace.

Speaking of the deadly space that is white, it will be interesting to see if they've dealt with this horrible Python integration issue at all...

I haven't seen any evidence of whitespace-agnostic additions in the sample code that's been posted, so I'm guessing not... in which case there may be tears before bedtime for some developers who want to do quick-and-dirty asp-like code (which I see as one of the advantages of using a dynamic language in this situation)... as it all tends to blow up in your face when your tabs are slightly out of line - something which Visual Studio might just do for you when it feels like shuffling your html ;o)

I'm not sure how easy it would be to introduce a whitespace-agnostic mode in IronPython for ASP.Net (Boo's whitespace-agnostic mode, as used in Brail, is quite elegant)... the alternative is to do something like the spyce framework, using opening and closing braces... which just looks mucky to me, because you end up with stuff like this:

[[ if display:{ ]]
something here
[[ }else:{ ]]
nothing here
[[ } ]]

Seems a bit too noisy... the "if display:" is enough to know you need to locate a matching closing block, so adding a construct like "end" should do the trick... which is how it's done in Brail...
<%if display:%>
something here
<%else:%>
nothing here
<%end%>

As clean as...
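Just to convince myself that an explicit "end" really is enough to recover block structure without significant whitespace, here's a toy Python sketch (this has nothing to do with Brail's actual implementation):

```python
def indent_blocks(source):
    """Convert 'end'-delimited lines into Python-style indentation."""
    out, depth = [], 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped == "end":
            depth -= 1              # 'end' closes the innermost open block
            continue
        if stripped.startswith(("else", "elif")):
            depth -= 1              # 'else' closes the branch before it...
        out.append("    " * depth + stripped)
        if stripped.endswith(":"):
            depth += 1              # ...and a ':' opens a new block
    return "\n".join(out)


template = """if display:
print('something here')
else:
print('nothing here')
end"""

print(indent_blocks(template))
```

So the template author never has to care about indentation, and the engine can still hand valid Python to the interpreter.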


Castle goes RC2

Well, it's an exciting week for Castle users: RC2 is out the door, and with it a new website look and feel with up-to-date content, and most interesting to me is the introduction of a new entity, "Castle Stronghold" - a commercial venture run by Hammet which offers professional support for developers and organisations implementing solutions with Castle, including guaranteed response times to inquiries and access to additional skilled development resources.

Personally, I think Castle Stronghold is great news; as a developer for a company which has been using the Castle IoC for over a year now, it adds a certain weight to our decision to run with this technology, and I think it offers a clear indication of the longevity this technology and product has... In time, as our customer base grows, it certainly looks appealing to have guaranteed support backing us up - and it also helps customers to understand it's not some half-baked open source project that's just going to fizzle out one day...

The RC2 release itself is exciting from a community point of view; it should be easier than ever for people to pick up and play with this stuff - though we generally use interim snapshots of the trunk (i.e. the last trunk release that didn't break our build...) so we've been exposed to most of the features for a while now, albeit without a lot of the bug fixes ;o)


I've been a bit quiet on the IronPython front... which is mostly because I haven't had the time to play around with it much lately - however I'll try to finish off my look into IronPython as a scripting engine, and in particular the good and bad aspects of getting it to play with your .Net code... In the meantime a collection of useful IronPython links is slowly growing.


A week into using Base4.Net's latest release, with compile-time query support, I'm loving it... it definitely gives a huge leap in productivity and expressability (if that's even a word ;o) - looking forward to seeing these features rounded out at some point (with support for scopes, projections and ordering) - Alex James is talking about migrations (my "most wanted" feature) and it's looking encouraging; it's where I feel the most pain at the moment, as Base4 isn't particularly friendly when you try to approach the problem of building your schema in a YAGNI fashion - there is quite a bit of pain involved in adding and removing properties from types during development as requirements are refined.

The rub is that the current "on the table" solution will require restarting the base4 service to apply the migration... which at the moment means restarting the windows service with a different command line... or more likely, stopping the service, then starting the standalone server with the right command line arguments, waiting till it's done, stopping the standalone server and starting the service again...  I was originally thinking of implementing these as a Nant or MSBuild task (much like RoR's migrations work with rake) but I think it might be a bit chunky... I need to think about it a bit more.


I haven't forgotten about my little side project (Splicer is a library I've written for "attempting" to eliminate the pain of using DirectShow.Net to encode audio and video) - and I do intend to keep updating and supporting it as needed... the next release should see WinForm samples for encoding audio and video added, and updated code examples... after that I might review the implementation a bit, to see how effects and transitions can be made easier to use via relative times - I'm interested in DSLs at the moment, so maybe I could create a DSL for video editing ;o)... if nothing else it would be amusing... I've yet to establish whether anyone actually uses the library yet (other than myself of course).