Sunday, December 21, 2008

Test Data Builder

In every project you reach a point where you need test data for unit testing. Having a reasonably complex domain model can make this task "difficult", because testing an entity often means that you also have to set up its associations to other entities. Initializing objects per test is cumbersome, and any change to constructor arguments will break your tests. One solution to this problem is the Object Mother pattern. It's basically a class with factory methods that helps you set up an object for testing. The object creation code is moved out of the tests so that it can be reused, making the test data more maintainable. So if you are developing a scrum application and want to create a project with a sprint and a user story, you would do something like this:

public static class ProjectMother
{
 public static Project CreateProjectWithSprint()
 {
     Project project = new Project();
     Sprint sprint = new Sprint();
     UserStory userStory = new UserStory();
     userStory.Name = "User story";
     userStory.StoryPoints = 8;
     sprint.AddUserStory(userStory);
     project.AddSprint(sprint);
     return project;
 }
}

Project project = ProjectMother.CreateProjectWithSprint();
However, as time goes by you end up with a lot of factory methods for the slightest variation in the test data, because of the heavy coupling that exists when many tests use the same method. This makes the Object Mother class hard to maintain. To solve this problem I usually write a fluent interface (an embedded domain specific language) that I use to initialize my objects for testing. This is heavily based on the Expression Builder pattern. For each class I want to test I write a builder for that class. So when I want to create a Project object I just write:
Project project = ProjectBuilder.Create.Project
              .WithName("Test Project")
              .WithSprint(SprintBuilder.Create.Sprint
                  .WithName("Sprint 1")
                  .WithUserStory(UserStoryBuilder.Create.UserStory
                      .WithName("Story 1")
                      .WithStoryPoints(13)
                      .WithTask(TaskBuilder.Create.Task
                          .WithName("Task 1")
                          .WithHours(3))));

Behind the scenes the ProjectBuilder takes care of everything: adding sprints and so on. In each method the builder just returns itself after having done some setup on the private Project instance. The ProjectBuilder is finally cast to Project using an implicit cast operator, which returns the private Project instance.
public class ProjectBuilder
{
    // Instance field (not static), so each builder owns its own Project
    // and builders don't share state across tests.
    private Project mProject;

    public static ProjectBuilder Create
    {
        get { return new ProjectBuilder(); }
    }

    public ProjectBuilder Project
    {
        get
        {
            mProject = new Project { Name = "Test Project" };
            return this;
        }
    }

    public ProjectBuilder WithName(string name)
    {
        mProject.Name = name;
        return this;
    }

    public ProjectBuilder WithSprint(Sprint sprint)
    {
        mProject.AddSprint(sprint);
        return this;
    }

    public ProjectBuilder WithBacklog(Backlog backlog)
    {
        mProject.Backlog = backlog;
        return this;
    }

    public static implicit operator Project(ProjectBuilder builder)
    {
        return builder.mProject;
    }
}
Still, I don't feel that the Object Mother and the Builder are mutually exclusive. If I have a lot of tests that use the same test data I often create an Object Mother with a factory method that uses the builders. When I need a specialized initialization of an object for a test I just use the builders directly in my tests.
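For example, the Object Mother factory method from the beginning of this post could be rewritten in terms of the builders. This is a sketch, assuming the SprintBuilder and UserStoryBuilder classes follow the same shape as the ProjectBuilder shown above:

```csharp
public static class ProjectMother
{
    // Common test data shared by many tests, now composed with the
    // fluent builders instead of hand-rolled construction code.
    public static Project CreateProjectWithSprint()
    {
        return ProjectBuilder.Create.Project
            .WithSprint(SprintBuilder.Create.Sprint
                .WithUserStory(UserStoryBuilder.Create.UserStory
                    .WithName("User story")
                    .WithStoryPoints(8)));
    }
}
```

The implicit cast operators do the work here: the return statement converts the ProjectBuilder to a Project, and WithSprint/WithUserStory accept the nested builders the same way.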

I find this way of creating test data really useful, and hopefully you will too! Feel free to download and use the code. The builders are located in the test project.

Merry Christmas!!

Thursday, November 20, 2008

Presentation at NNUG Vestfold

Last night I held a presentation at NNUG Vestfold. I primarily held demos showing usage of Dependency Properties, Attached Dependency Properties and ControlTemplates in WPF.

Wednesday, September 10, 2008

Making SharpDevelop compile and debug Boo programs on a 64-bit machine

Recently I've started looking at Boo. As of now the only decent development tool for Boo is SharpDevelop. BooLangStudio, a Visual Studio plugin, recently came out in an alpha release, but at the moment it's way too immature. Back to SharpDevelop... I'm running a 64-bit version of Vista, which resulted in some problems compiling and debugging Boo programs. Since the compiler (booc.exe) is marked to run on AnyCPU, it will start as a 64-bit process. SharpDevelop runs as 32-bit only, so it will use the 32-bit version of MSBuild, which in turn picks up a 32-bit version of System.dll and passes this version to booc.exe. Unfortunately this means that the 64-bit booc.exe process will crash when it tries to load the 32-bit System.dll. A solution is to use CorFlags to mark booc.exe as 32-bit only. However, this breaks the strong name, so you'll have to re-sign it. Do the following:
  1. Open the Visual Studio Command Prompt and run CorFlags "path\SharpDevelop\3.0\AddIns\AddIns\BackendBindings\BooBinding\booc.exe" /32BIT+ /Force.
  2. Download boo.snk from http://svn.codehaus.org/boo/boo/trunk/src/. Still in the VS Command Prompt, run sn -R "path\SharpDevelop\3.0\AddIns\AddIns\BackendBindings\BooBinding\booc.exe" "path\boo.snk".
You can now build your Boo programs!! A new problem arises when you try to debug your program: the debugger crashes because your program is compiled to run on AnyCPU, so the 32-bit-only debugger will launch a 64-bit program and crash. SharpDevelop suggests that you set the target CPU of your program to 32-bit. This is not possible; the only option is AnyCPU. So we'll have to hack some more:
  • Open the property page of your project and go to the Build Events tab. In the Post-build event command line text box type "path\Microsoft.NET\SDK\v2.0 64bit\Bin\CorFlags.exe" "$(TargetPath)" /32BIT+
Now build and debug your program. Voila!!

Monday, July 28, 2008

Balsamiq Mockups: Taking the evilness out of prototypes

Everyone who has heard Odd Helge Gravalid's presentation "Gui prototyper Onde" (GUI Prototypes Are Evil) knows why prototypes are evil shit. The main point is that prototypes create expectations about functionality in the GUI that may not be fully implemented, or may not be present at all, even though it looks like it is. So what's wrong with that? Well, as I recently experienced: a couple of weeks after you presented the prototype, when you actually have implemented the functionality, the customer's project manager says: "What have you guys really been doing lately? This is nothing more than you showed me two weeks ago." And then you are in trouble. So the moral is that you should never make a prototype that looks like it's really implemented. It will most certainly backfire! Last Friday I came across a great GUI mockup tool called Balsamiq Mockups. It makes it really easy to create mockups using the more than 60 pre-built controls. The cool thing is that they look like they're actually drawn by hand, "so that people don't get attached to 'that pretty color gradient'". This can certainly help us avoid the prototype trap and instead let us concentrate on the important aspect of prototyping: discussing functionality! Check it out: you can even try it out here!

70-502: WPF Exam

It's been a long time since I've blogged. What have I been up to lately? For the most part I've been working and trying to enjoy the summer as much as possible. I've also managed to pass the 70-502 - Microsoft .NET Framework 3.5 – Windows Presentation Foundation Application Development exam. For the last month and a half I've been working on a WPF project, and I would recommend everyone who has done some work with WPF to give the exam a shot. It's actually quite easy.

Wednesday, April 23, 2008

Refactoring: Later? Continuously!

A couple of days ago I attended a meeting about refactoring. Some of the attendees were unfamiliar with or felt insecure about refactoring. One of the questions that arose was: "When is it time to refactor? Is it a joint decision in the development team, or an individual one?" There is NEVER a special time for refactoring. You don't create a task that says refactoring. Refactoring goes hand in hand with coding and should be performed CONTINUOUSLY. ALWAYS. You should always account for refactoring when estimating a task; if not, you're in deep shit. Customers generally don't care if you build a house with duct tape, as long as it looks good enough. And they will certainly not pay later for changes that aren't visible from the outside. But as a professional developer you know that the house will fall apart when the rainy day comes. So ALWAYS refactor and estimate accordingly!

Tuesday, April 22, 2008

DataGrid revisited

Today I held a little presentation at work showing off some of the DataGrid capabilities in Silverlight 2.0 beta. It's basically based upon my last post, but I added some new functionality that shows use of the DataGridCheckBoxColumn and the DataGridTemplateColumn.
<Data:DataGridCheckBoxColumn Header="Is done" DisplayMemberBinding="{Binding IsDone, Mode=TwoWay}" />
<Data:DataGridTemplateColumn Header="Due date">
  <Data:DataGridTemplateColumn.CellTemplate>
      <DataTemplate>
          <TextBlock Text="{Binding DueDate, Mode=TwoWay}" />
      </DataTemplate>
  </Data:DataGridTemplateColumn.CellTemplate>
  <Data:DataGridTemplateColumn.CellEditingTemplate>
      <DataTemplate>
          <DatePicker SelectedDateFormat="Short" FirstDayOfWeek="Monday" SelectedDate="{Binding DueDate, Mode=TwoWay}" />
      </DataTemplate>
  </Data:DataGridTemplateColumn.CellEditingTemplate>
</Data:DataGridTemplateColumn>
You can download the source code here.

Sunday, March 30, 2008

Silverlight 2.0: Two-way data binding with DataGrid

I've written a little Master/Detail demo application using the DataGrid control in Silverlight 2.0 beta 1. The application shows usage of data binding, styles and definition of custom columns for the DataGrid control. You can download the source code here.




The following code fragment shows how to define your own columns for the data grid:
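Something along these lines (a sketch reconstructed from memory; the Data prefix assumes the same namespace mapping as in the DataGrid revisited post above, and the Header and property names are placeholders, not the ones from the actual demo):

```xml
<Data:DataGrid x:Name="PersonsGrid" AutoGenerateColumns="False">
    <Data:DataGrid.Columns>
        <Data:DataGridTextColumn Header="First name"
            DisplayMemberBinding="{Binding FirstName, Mode=TwoWay}" />
        <Data:DataGridTextColumn Header="Last name"
            DisplayMemberBinding="{Binding LastName, Mode=TwoWay}" />
    </Data:DataGrid.Columns>
</Data:DataGrid>
```

Setting AutoGenerateColumns to False is what lets you take full control over the column definitions.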

Tuesday, March 4, 2008

The next generation rule engines are pure C#

The project I'm currently working on is based on a rather narrow domain, but with a high density of business rules that come along with it. Currently we are writing the rules directly in code, but there has been some pressure to use an in-house XML-based rule engine. As the rule base grows, so does the pressure. As I see it, there are two reasons for using a rule engine:
  1. Separate the rules from the code so it can be edited without recompiling the application.
  2. Let the rules be edited by a domain expert.
Does an XML-based rule engine support this? Sure, a rule engine certainly separates the rules from the code and makes them editable. However, I don't really see how rules in XML are much better than rules written in code. Rules in XML can become just as syntactically complex as code, and I would argue that as the rules get more complex, rules written in code actually become easier to read and understand. So from my point of view, a rule engine doesn't give a domain expert much more support in editing rules than plain code does. Now comes the real pain with a rule engine: testing. As the rules are written in XML, it becomes really hard to test them together with your domain objects. Sure, the XML is often used to generate code, but I don't see how this improves testability much. The generated code will normally be complex and hard to use, as the XML and the code generator try to capture and provide functionality for general-purpose problems and scenarios rather than a specific domain. Refactoring the code won't update the XML, and suddenly you find yourself in a maintainability hell. This is why I would rather have my rules in code. Now, the problem with writing rules in code isn't that it is impossible to separate the rules from the rest of the code. Rules written in code suffer in the same way as rules written in XML in that it is extremely hard for a non-technical domain expert to read, edit and verify rules, let alone create new ones. I can understand those who argue that rules in XML are more maintainable for a non-technical person than rules in code, and the fact that some stakeholders are pushing for an XML-based rule engine really got me thinking. Since I most certainly want to keep my business rules in code for testability and maintainability, I started thinking about an alternative that would let us keep the rules in code and at the same time make them easy to edit, which in turn would satisfy all parties.
A couple of days ago, on my way to work, I was talking to a colleague who had attended a meeting at the Norwegian .NET User Group (NNUG) where Anders Norås, a former employee at my company, held a talk about fluent interfaces and domain specific languages. I immediately connected this concept to my rules problem and started digging around on the Internet. In my search I found a great article by Anders Norås where he shows an example of a DSL implementation. I would also recommend an article by Martin Fowler where he discusses the pros and cons of internal and external DSLs. Although external DSLs can be evaluated at runtime, which is part of what has made XML so popular, and internal DSLs are limited by the syntax of the language of choice, I would still go for an internal DSL: I don't have to learn a new language, and I can take advantage of my existing skills. The idea of fluent interfaces seems very interesting. Writing a fluent interface would mean that the domain expert editing the rules could use an IDE and even get IntelliSense while typing the rules. Writing the rules in a language like Boo, which runs on the .NET CLR, could even let the rules be evaluated at runtime. However, I don't see this as a requirement for my problem. Writing a fluent interface in C# would probably satisfy all my needs, and extension methods would most certainly come in handy. Still, I'm a little bit concerned about the effort and cost of creating such a DSL. Hopefully I'll soon find the time to implement a little prototype and write a part 2 of this post. I'll keep you posted!
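To make the idea concrete, a fluent rule interface in C# could look something like the following. This is purely a sketch of what I have in mind, not an existing framework: the Rule class, the When/Then methods and the Order domain object are all hypothetical names.

```csharp
using System;

// Hypothetical domain object used by the rules.
public class Order
{
    public decimal Total { get; set; }
    public decimal Discount { get; set; }
}

// A minimal fluent rule: a predicate plus an action, chained for readability.
public class Rule<T>
{
    private Func<T, bool> mCondition = x => true;
    private Action<T> mAction = x => { };

    public Rule<T> When(Func<T, bool> condition)
    {
        mCondition = condition;
        return this;
    }

    public Rule<T> Then(Action<T> action)
    {
        mAction = action;
        return this;
    }

    public void ApplyTo(T target)
    {
        if (mCondition(target))
            mAction(target);
    }
}

// Usage: the rule reads almost like a sentence a domain expert could follow.
// Rule<Order> largeOrderDiscount = new Rule<Order>()
//     .When(order => order.Total > 1000m)
//     .Then(order => order.Discount = 0.1m);
// largeOrderDiscount.ApplyTo(someOrder);
```

With IntelliSense guiding the chaining, a domain expert gets immediate feedback on what can legally follow When, which is exactly the kind of editing support the XML approach promises but rarely delivers.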