Practical Patterns: Refactoring with CoR Pattern

This is the third post in the Design Patterns series. Since I will be refactoring an existing solution, it would help to refer to my previous posts to understand the sample code base. You may begin with the Bridge pattern and move on to the Decorator pattern to see how the application has evolved.

Using a single if-else or a small switch statement has never been difficult. But if the number of options or decisions continues to grow, you will soon end up with an unruly code base. You quickly lose loose coupling, and with the Single Responsibility Principle violated, there is a huge risk of breaking existing code or introducing defects when newer requirements arrive.

Chain of Responsibility (CoR) is very popular with frameworks that employ it to process events or requests. One can write a single atomic unit of processing and then chain such units together to form complex processing pipelines. OWIN is one such specification through which you can implement complex frameworks by chaining individual OWIN components together.

CoR gives more than one object a chance to process the request. Every object in the chain can either process the request or pass it along; once a suitable handler has processed the request, the chain can return. The sender or client only needs to know the initial handler, or a base abstraction, to launch the request.

There is a special case that needs to be handled in CoR. Since the handlers are chained like nodes in a linked list, the end-of-chain condition must be dealt with. We’ll review the options shortly.

Let us add a new requirement to the existing application. A client must go through an approval process for Corporate accounts. Depending on the amount, a manager or a director can approve the withdrawal. Let us first see how the solution looks without the pattern, picking up from where we left off in the Decorator implementation.
We introduce a new Employee class which will provide us with a possible approver. The Approve method determines whether this employee can approve the amount.

class Employee
{
    public Employee(string name, decimal approvalLimit)
    {
        Name = name;
        ApprovalLimit = approvalLimit;
    }

    public string Name { get; }

    private decimal ApprovalLimit { get; }

    public bool Approve(decimal amount)
    {
        return amount < ApprovalLimit;
    }
}

In the Withdraw method, we now go through the list of approvers to determine whether we can continue with the withdrawal.

class Corporate : ITransaction
{
    private decimal _balance = 0;
    private decimal _overdraft = 0;
    const decimal overdraft_factor = 0.1m;

    // list of approvers
    List<Employee> approvers;

    public Corporate(decimal balance)
    {
        _balance = balance;
        Overdraft = _balance;
        Console.WriteLine("Opened corporate account with balance {0}", balance);
        Console.WriteLine("Overdraft stands at {0}", _overdraft);

        approvers = new List<Employee>
        {
            new Employee("Micky Manager", 1000),
            new Employee("Donald Director", 5000)
        };
    }

    // Omitted Deposit for brevity

    public void Withdraw(decimal amount)
    {
        bool amountProcessed = false;
        foreach (Employee approver in approvers)
        {
            amountProcessed = approver.Approve(amount);
            if (amountProcessed)
                break;
        }
        if (!amountProcessed)
        {
            Console.WriteLine("Amount {0} did not get approved", amount);
            return;
        }
        if (amount < Overdraft)
        {
            _balance -= amount;
            Overdraft = _balance;
            Console.WriteLine("Withdrawn: {0} | New balance: {1} | Overdraft: {2}", amount, _balance, Overdraft);
            return;
        }
        Console.WriteLine("Could not withdraw. Current overdraft limit {0}", Overdraft);
    }
}

At this point there is no change in the client application.

IAccount account = new CreditAccount("C2102", AccountType.Corporate);
account.Deposit(1000);
account.Withdraw(1000);

In the Withdraw method we now iterate through the list of approvers. Why is this a cause for concern? It means the algorithm for how the approval process works is known to the caller. In this simple case it is iterating through a list of employees, going from one approver to the next until someone approves or the list ends. CoR takes all these responsibilities away from the caller, while also reducing coupling and granting more flexibility.
To move this to a CoR implementation, we introduce a handler. The handler takes responsibility both for the action and for delegating to the next handler when the current instance is unable to process the request.

Though it is not imperative to have an interface (I normally use just an abstract class), in this instance we will go with one.

interface IWithdrawHandler
{
    bool Approve(decimal amount);
    void SetNext(IWithdrawHandler next);
}

Implementing the interface gives us a common handler.

class WithdrawHandler : IWithdrawHandler
{
    private readonly Employee _approver;
    private IWithdrawHandler _next;

    public WithdrawHandler(Employee approver)
    {
        _approver = approver;
    }

    public bool Approve(decimal amount)
    {
        if (_approver.Approve(amount))
            return true;
        // Delegate to the next handler; _next is null at the end of the chain
        return _next.Approve(amount);
    }

    public void SetNext(IWithdrawHandler next)
    {
        _next = next;
    }
}

The Corporate class can now take an instance of IWithdrawHandler in its constructor. Below is the change, along with AccountBase, where we create the instance of the Corporate class and wire up the chain.

class Corporate : ITransaction
{
    private decimal _balance = 0;
    private decimal _overdraft = 0;
    const decimal overdraft_factor = 0.1m;

    private readonly IWithdrawHandler _handler;
    //List<Employee> approvers;

    public Corporate(decimal balance, IWithdrawHandler handler)
    {
        _balance = balance;
        Overdraft = _balance;
        Console.WriteLine("Opened corporate account with balance {0}", balance);
        Console.WriteLine("Overdraft stands at {0}", _overdraft);
        _handler = handler;
    }

    public void Withdraw(decimal amount)
    {
        bool amountProcessed = _handler.Approve(amount);

        if (!amountProcessed)
        {
            Console.WriteLine("No approvers for amount {0}", amount);
            return;
        }
        if (amount < Overdraft)
        {
            _balance -= amount;
            Overdraft = _balance;
            Console.WriteLine("Withdrawn: {0} | New balance: {1} | Overdraft: {2}", amount, _balance, Overdraft);
            return;
        }
        Console.WriteLine("Could not withdraw. Current overdraft limit {0}", Overdraft);
    }
}

abstract class AccountBase : IAccount
{
    protected ITransaction transactionImp;

    public AccountBase(string name, AccountType type)
    {
        Name = name;
        switch (type)
        {
            case AccountType.Personal:
                transactionImp = new AuditTransactionDecorator(new Personal(1000));
                break;
            case AccountType.Corporate:
                IWithdrawHandler micky = new WithdrawHandler(new Employee("Micky Manager", 500));
                IWithdrawHandler donald = new WithdrawHandler(new Employee("Donald Director", 1000));
                IWithdrawHandler scrooge = new WithdrawHandler(new Employee("Scrooge Owner", 1500));
                micky.SetNext(donald);
                donald.SetNext(scrooge);
                transactionImp = new AuditTransactionDecorator(
                    new NotifyTransactionDecorator(
                        new Corporate(5000, micky)));
                break;
            default:
                throw new NotImplementedException("Unknown account type");
        }
    }
    //...
}

There is still no change in the calling application; the only change is in how the instance of Corporate is created. When that is handled by DI or a factory, the change would be even less intrusive. One requirement of CoR is handling the “end of chain” condition. What happens when we reach the last handler (scrooge here) and it is unable to approve? In this implementation a NullReferenceException is thrown. That might be acceptable behaviour, though you may prefer to throw a custom exception; the client should then be prepared to take corrective action. Another way is to make the last handler special, so that it knows not to promote to a next handler but always returns. Or you could have a dedicated handler meant to handle the “end of chain” event. This is akin to a Null Object with a default implementation.
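To sketch that last Null Object option, an end-of-chain handler could implement the same IWithdrawHandler interface from above and simply decline (the class name here is illustrative):

```csharp
// End-of-chain handler: never promotes further, always declines.
class EndOfChainWithdrawHandler : IWithdrawHandler
{
    public bool Approve(decimal amount)
    {
        // Default behaviour once every real approver has passed on the request.
        return false;
    }

    public void SetNext(IWithdrawHandler next)
    {
        // Intentionally empty; nothing lies beyond the end of the chain.
    }
}
```

Wiring scrooge.SetNext(new EndOfChainWithdrawHandler()) would then let WithdrawHandler.Approve delegate safely without a null check.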

Entity Framework mapping private members with Search

In one of my projects, I am in the process of implementing a complex domain by embracing DDD. I am using Entity Framework against a legacy database.

EF provides us with sufficient mapping power to use the data model as our business model. I also want to keep things simple and direct by using the same model both for the domain and for EF. However, this brings up a few challenges.

If you need a fully encapsulated domain, you would probably want to expose your collections as IEnumerable&lt;T&gt;. But you cannot map an IEnumerable&lt;T&gt; collection with EF; it requires the collection to be of type ICollection&lt;T&gt;, which allows direct manipulation of the collection through its Add and Remove methods, so your domain is not completely encapsulated. You can get around this limitation and use IEnumerable&lt;T&gt; in your model so that it is protected by your domain rules, but you then have to load the collection manually into the domain model through your repository or factory. You also need to track the changes yourself and let EF know at persistence time by attaching each object from the collection back to EF and setting the relevant EntityState. This is needed since you could not have mapped the association in EF, your ICollection&lt;T&gt; backing member being private.

public class Student
{
    public string Name { get; private set; }

    private List<Course> _courses;
    public IEnumerable<Course> Courses { get { return _courses; } }

    public void AssignCourses(IEnumerable<Course> courses)
    {
        _courses = new List<Course>(courses);
    }

    public Student(string name)
    {
        Name = name;
    }

    private Student() { }
}

public class StudentRepository
{
    // context is the EF DbContext; its definition is omitted here
    public Student Get(string name)
    {
        var student = context.Students.FirstOrDefault(c => c.Name == name);
        var courses = context.Courses.Where(c => c.Name == name).AsEnumerable();
        student.AssignCourses(courses);
        return student;
    }
}
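Persisting that manually loaded collection then means informing EF of each change yourself. A rough sketch, assuming an EF 6 DbContext field named context and that all loaded courses were modified (the method name and tracking strategy are illustrative):

```csharp
public void Save(Student student)
{
    foreach (var course in student.Courses)
    {
        // Re-attach each detached Course and set its state explicitly,
        // since EF never tracked the privately backed collection.
        context.Courses.Attach(course);
        context.Entry(course).State = EntityState.Modified; // or Added / Deleted
    }
    context.SaveChanges();
}
```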

Another approach is to map your private properties or members. This is a bit more involved than the previous method, but you get to take advantage of EF while loading objects. The article on mapping to a non-public member shows one way this can be achieved. As that article explains, this is not a pure POCO since it contains knowledge of the persistence configuration. But I like this approach, as it is simple and non-intrusive to the actual behaviour of the POCO itself.

public partial class Account
{
    public int Id { get; private set; }

    public string Name { get; private set; }

    private string Code { get; set; }
}

public partial class Account
{
    public class EFStorageExpression
    {
        public static readonly Expression<Func<Account, string>>
            CodeAccessor = a => a.Code;
    }
}

You can then map it as follows under EntityTypeConfiguration<Account>

Property(Account.EFStorageExpression.CodeAccessor);

The reason I called this approach complicated is that it sometimes does not end with being able to map and load the entity; we also need to query on the private properties. If a private field holds a collection, there may be even more reasons to filter on it. You can still Include such members to eager load them, but to filter you need to go through the Expression you have exposed.

Let us see how we can query for an Account.

var account = context.Accounts.Where(a => a.Id == id).FirstOrDefault();

This would in fact load the value of the Code property. But what if you need to filter on Code? All we have is the Expression&lt;Func&lt;Account, string&gt;&gt; pointing to the Code property, whereas we require an Expression&lt;Func&lt;Account, bool&gt;&gt; to pass to the Where clause. For this we’ll need to build an Expression that represents our filter.

static Expression<Func<Account, bool>> SearchCode(string code)
{
    ParameterExpression param = Expression.Parameter(typeof(Account));
    var codeExp = Expression.Constant(code);

    var invokedExpr = Expression.Invoke(Account.EFStorageExpression.CodeAccessor, param);
    var searchCodeExp = Expression.Equal(invokedExpr, codeExp);
    // a => a.Code == code
    return Expression.Lambda<Func<Account, bool>>(searchCodeExp, param);
}

Using the above, our search on Code can be done like this.

var account = context.Accounts.Where(SearchCode("A001")).FirstOrDefault();

But at runtime this fails miserably, throwing an exception:

Message=The LINQ expression node type 'Invoke' is not supported in LINQ to Entities.

The problem is that LINQ to Entities does not support invocation of expressions. This works fine with LINQ to Objects and LINQ to SQL, but not with EF. You would need an ExpressionVisitor to rewrite the tree, or use LinqKit as I do here. LinqKit can be downloaded from NuGet; with it you can use your expressions with EF.
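For the curious, the ExpressionVisitor route boils down to replacing each Invoke node with the invoked lambda’s body, substituting its parameters. A minimal sketch (LinqKit does a more thorough job of this):

```csharp
using System;
using System.Linq.Expressions;

// Rewrites Invoke(lambda, args...) into the lambda body with its parameters
// substituted, yielding a tree that LINQ to Entities can translate.
class InvocationInliner : ExpressionVisitor
{
    protected override Expression VisitInvocation(InvocationExpression node)
    {
        var lambda = node.Expression as LambdaExpression;
        if (lambda == null)
            return base.VisitInvocation(node);

        Expression body = lambda.Body;
        for (int i = 0; i < lambda.Parameters.Count; i++)
            body = new ParameterReplacer(lambda.Parameters[i], node.Arguments[i]).Visit(body);

        // Keep visiting in case the inlined body contains further invocations.
        return Visit(body);
    }

    private class ParameterReplacer : ExpressionVisitor
    {
        private readonly ParameterExpression _from;
        private readonly Expression _to;

        public ParameterReplacer(ParameterExpression from, Expression to)
        {
            _from = from;
            _to = to;
        }

        protected override Expression VisitParameter(ParameterExpression node)
        {
            return node == _from ? _to : base.VisitParameter(node);
        }
    }
}
```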

The same query above after LinqKit looks as below.

var account = context.Accounts.AsExpandable()
    .Where(SearchCode("A000001")).FirstOrDefault();

The only addition is the AsExpandable() extension before calling your custom expression. Everything works like a charm.

You can now hide your data and also make use of EF without adding much gunk to your model.

Thinking on your feet

I have been away from my blog for the past few weeks; most of my weekends have been very busy. But I want to keep the momentum going, so I decided to write a product review instead. It addresses an integral part of our craft: the posture we take while keying away at our creations.
I am talking about standing desks, if any of you were wondering. A couple of years ago I dismissed them as a fad. But then heaps of research started surfacing indicating the benefits of standing. Let me rephrase that: indicating the horrors of sitting for long durations.

This intrigued me. But before jumping in and investing in a standing desk, I had to test it out first. Would I like standing and working? These desks don’t come cheap. Would I really use one and find it comfortable? I knew it would be very difficult to judge any immediate health benefits, and I am not sure how that could be determined even after continued use. What I needed to know was how comfortable it would be to use.

I scoured the internet and came across a few hacks, including Ikea hacks. I even came up with a design for a frame of PVC pipes to support a laptop and a keyboard when placed on a table. All of these required new material purchases, time and effort. Then it hit me to simply use cardboard boxes already lying around, which were sturdy enough to support a laptop and high enough to provide ergonomic viewing of the screen. I used a shorter box for my external keyboard and mouse, so my forearms were comfortable and parallel to the floor. With this setup I was able to work standing at my laptop very comfortably, without any strain or discomfort. I could easily stand and work for two hours at a stretch. And all for zilch. This is what I would call a good POC.

I had a similar setup at office and tried out standing and working for more than a month. It was much easier at work since I would use my desktop while sitting and switch over to laptop while standing. I just had to clear the boxes whenever I felt like reclaiming my desk space.

My DIY desk @ home

Conclusion: standing and working rocks. I absolutely loved it. One thing I observed is that each time I switch to standing mode, it gives me a fresh boost of energy and focus. It definitely gives you a psychological advantage.

Now, for those who plan to research further, there are basically two categories of standing desks. One sits on top of your existing desk; this involves the least rearrangement of your existing furniture, and you get to keep your existing cubicle desk or table. In the other variety, the entire desk moves up or down with the help of a motor. These desks are more expensive, but also offer you greater flexibility and more usable space.

When I was ready for a more serious desk, thanks to my company, I settled on a brand new VERIDESK PRO.

VERIDESK PRO @ Work

It is very convenient to transform it from sitting to standing and vice versa. You can easily pull up the desk using the hand slots and lock it at any height. It does use up some desk space, and you might have to rearrange your desk items to suit the new furniture.

I also love using multiple monitors while working. This desk accommodates two large ones very easily, and you can keep a few small items on it while using it in standing mode.

Taking it down to sit is even easier. But I have to warn you: always ensure there is nothing under the lower platform, because it will get crushed, be it cables, a pen or earphones. Yes, during the first lowering exercise I crushed my favourite earphones. Sigh! Now I always look underneath to ensure nothing is in the way. I also stop at the lowest possible level and pull the keyboard and mouse cables out of the way before lowering it all the way down. You need to push down until it clicks to lock in.

Motorised desks obviously won’t have this drawback of having to clear or rearrange your desk. Whether down or up, it is the same.

Whichever standing desk you choose, here the end is more important than the means.

Events Part 3 - On steroids using TPL Dataflow

For the last couple of days I have been exploring TPL Dataflow to see how it can reduce the complexity of parallel processing by alleviating callbacks and synchronization concerns such as locks and shared data. And I am loving it. I am still exploring the library, and resources and documentation on it seem a bit scarce as of now. I feel not enough developers are using it, either because they are not aware of it or because the design pattern is unfamiliar.

How can we use Dataflow to handle events?

Since my last two articles were based on events, a particular feature caught my attention, and I set out to implement a solution for handling events using Dataflow. For those of you exploring DDD, I recommend taking a look at Domain Events, which help you keep the domain completely encapsulated. I like the idea of exposing an event as an object instead of a delegate. The immediate benefit I see is that the domain class no longer needs to hold a reference to the handler, and handlers need not explicitly unsubscribe from events. These were a few of the drawbacks of delegate-based events which I stated earlier, and this addresses them nicely.

TPL Dataflow is built on top of the Task Parallel Library and so brings with it all the familiar support for tasks and asynchronous programming. It is not distributed with the .NET Framework and needs to be installed through NuGet. Look for Microsoft.Tpl.Dataflow; once installed, you should have System.Threading.Tasks.Dataflow.dll.

Quick Intro

To get started: Dataflow provides us with blocks, connected like pipes, through which data can be processed or simply flow. A block can receive data (ITargetBlock&lt;T&gt;), offer data (ISourceBlock&lt;T&gt;), or do both when the two are combined. The data can be processed by a block or simply transferred. You can push data to a block (Post(input)) and get output from it (Receive). You can connect two or more blocks by linking them together (LinkTo(ITargetBlock&lt;T&gt;)); that is where the analogy of pipes comes in. Once you have linked the blocks and sent data to the topmost block, you need not worry about moving data; the links take care of propagating it from one block to the next. You can instruct a block to stop accepting input at any time.
There are different blocks readily available, and one can create custom blocks too. You can find more information in Microsoft’s Introduction to TPL Dataflow.
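Before wiring Dataflow into the domain, here is about the smallest pipeline that shows Post, LinkTo and completion in action (a hypothetical squaring pipeline; requires the Microsoft.Tpl.Dataflow package):

```csharp
using System;
using System.Threading.Tasks.Dataflow; // from the Microsoft.Tpl.Dataflow NuGet package

public class PipelineDemo
{
    public static void Main()
    {
        // A two-block pipeline: transform each input, then consume the result.
        var square = new TransformBlock<int, int>(n => n * n);
        var print  = new ActionBlock<int>(n => Console.WriteLine(n));

        // Propagate completion so print finishes once square drains.
        square.LinkTo(print, new DataflowLinkOptions { PropagateCompletion = true });

        for (int i = 1; i <= 3; i++)
            square.Post(i);

        square.Complete();
        print.Completion.Wait(); // prints 1, 4, 9 in order
    }
}
```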

In this article I’ll hit the ground running with Dataflow by using it to handle my domain events. I will build this on top of my earlier simple domain example, used here and here, and borrow the idea of a domain event, representing it explicitly through an instance.

Domain Event Interface

public interface IDomainEvent
{
    DateTime EventOccurred { get; }
}

Define my event for product added.

public class ProductAddedEvent : IDomainEvent
{
    public DateTime EventOccurred { get; private set; }

    public Product Product { get; private set; }

    public ProductAddedEvent(Product product)
    {
        EventOccurred = DateTime.Now;
        Product = product;
    }
}

The event is now raised by the domain object Product.

private readonly IBroadcaster _mediator;

public Product(IBroadcaster mediator)
{
    _mediator = mediator;
}

public void Add(string name, int quantity)
{
    Name = name;
    Quantity = quantity;
    _mediator.Post(new ProductAddedEvent(this));
}

As you can see, the domain no longer raises a classic event; it sends the change in state through an object instance of ProductAddedEvent. There are also no static calls to raise the event; it goes through an injected instance of IBroadcaster.

public interface IBroadcaster
{
    void Post<T>(T args) where T : IDomainEvent;
}

Anyone interested in subscribing to any type of IDomainEvent does so through a concrete instance of ISubscriber.

public interface ISubscriber
{
    IDisposable Subscribe<T>(Action<T> action)
        where T : IDomainEvent;

    void UnSubscribe(IDisposable obj);
}

Ok. So how do we link all this up, and where does Dataflow fit in? Dataflow gives us the concrete BroadcastBlock&lt;T&gt; implementation. Its sole mission is to send a copy of every message published to all linked targets. Optionally you can supply a cloning function (Func&lt;T,T&gt;) controlling how each copy is offered to the targets.

We implement both the interfaces here to manage subscription and posting of messages.

public class EventMediator : ISubscriber, IBroadcaster
{
    private readonly BroadcastBlock<IDomainEvent> broadcast =
        new BroadcastBlock<IDomainEvent>(args => args);

    public void Post<T>(T args)
        where T : IDomainEvent
    {
        broadcast.Post(args);
    }

    public IDisposable Subscribe<T>(Action<T> action)
        where T : IDomainEvent
    {
        var handler = new ActionBlock<IDomainEvent>(
            args => action((T)args));

        return broadcast.LinkTo(handler,
            e => e.GetType() == typeof(T));
    }

    public void UnSubscribe(IDisposable obj)
    {
        obj.Dispose();
    }
}

Event handlers can subscribe to the events they are interested in; a message is passed only to those handlers which satisfy the filter condition.
Another beauty is that a handler can unlink at any time by calling UnSubscribe, which simply disposes of the IDisposable returned by the LinkTo call. This internally unlinks the handler and stops future messages from being sent to it.

That’s it. Now we write the handler and subscribe.

public class CheckStock : IDisposable
{
    private IDisposable _subscription;

    public CheckStock(ISubscriber subscriber)
    {
        _subscription = subscriber.Subscribe<ProductAddedEvent>(m => Handler(m));
    }

    private void Handler(ProductAddedEvent args)
    {
        Console.WriteLine("Received by CheckStock, event at {0}", args.EventOccurred);
        Console.WriteLine("Checking stock for {0}", args.Product.Name);
    }

    public void Dispose()
    {
        _subscription.Dispose();
    }
}

Stitching it all together, we can see them in action.

var mediator = new EventMediator();
// subscribers
var stockChecker = new CheckStock(mediator);
var notify = new SendNotification(mediator);

// add an inline handler
var adHoc = mediator.Subscribe<ProductAddedEvent>(m =>
{
    Console.WriteLine("This is an inline process for {0}", m.Product.Name);
});

var product = new Product(mediator);
product.Add("Cutting edge", 10);

//notify.Dispose();
// Unsubscribe one of them
mediator.UnSubscribe(adHoc);
product = new Product(mediator);
product.Add("Bleeding edge", 10);

What we now have is completely unit testable and injectable. Those adhering to DDD get a fully encapsulated domain. Since Subscribe takes an Action&lt;T&gt;, you can send in any action and assert on it when the event is raised.

If there are many events in a domain, we simply define a new type implementing IDomainEvent:

public class ProductDeletedEvent : IDomainEvent

Assign a handler for the new event and subscribe

mediator.Subscribe<ProductDeletedEvent>(m => Handler(m));

The above handlers run as concurrent operations. You can send a message to as many handlers as required while still maintaining throughput. Dataflow also allows you to throttle the processing or buffer your messages. And just as we have multiple consumers registered with the EventMediator, we can have multiple producers sending data through the same instance of EventMediator.
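As a hint at the throttling and buffering mentioned above, handler blocks accept an ExecutionDataflowBlockOptions. A sketch (Process here is a hypothetical handler method):

```csharp
// Bound the handler block: at most two messages are processed concurrently,
// and at most 100 are buffered before Post starts refusing input.
var options = new ExecutionDataflowBlockOptions
{
    MaxDegreeOfParallelism = 2,
    BoundedCapacity = 100
};
var handler = new ActionBlock<IDomainEvent>(e => Process(e), options);
```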

Events Part 2 - Asynchronously call handlers

In the previous post we saw how to make use of events to extend functionality. We also saw that the execution of the handlers was blocking in nature; the execution was synchronous. There might be a real benefit if the handlers were executed asynchronously without blocking the main thread. Please note that I stressed might. You have to choose carefully whether your operations require parallelism, because it will complicate your application and take more resources without any benefit in return. If you foresee many handlers listening to an event, asynchrony might give you the desired throughput by offloading them to background threads. If you are making the call from a UI thread, it is also beneficial to free up the UI thread.
Since the events are handled by delegates, we can leverage asynchronous method calls. You can find more information on calling methods asynchronously here, as of this writing.

There are a few things we have to consider and be cautious about while dealing with events asynchronously. Events return void by default; when you use the generic delegates (EventHandler / EventHandler&lt;TEventArgs&gt;) they always return void. If you use custom delegates that return a value with your events, it can become awkward to deal with them when multiple handlers are attached. I am not saying it is impossible, just difficult: you need to be able to correlate each handler’s return value, otherwise you end up with only the last handler’s result. Then again, if you require return values from your events, you might need something other than events.

The only thing the subject needs to wait for is for all the handlers to complete executing. If the thread (main or foreground) from which the handlers were invoked exits before the handlers, running on background threads, have returned, they are abruptly terminated. You also only learn of exceptions in any of the handlers when you call EndInvoke on the delegate.

The Asynchronous Programming Model (APM) is now obsolete in favour of the Task-based Asynchronous Pattern (TAP), and so is the Event-based Asynchronous Pattern (EAP). Calling a Begin method, dealing with IAsyncResult and calling the End method was not easy to implement correctly, and for a client consuming it, at the very least, cumbersome.

We can now use Task.Factory.FromAsync, which makes it easy to deal with APM calls by returning a Task. One of its overloads takes the Begin/End pair along with any additional arguments that get passed to those methods. If you find the overloads lacking, you can provide a wrapper of your own using TaskCompletionSource internally, similar to FromAsync.
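Such a wrapper is only a few lines. A sketch using TaskCompletionSource around a Begin/End pair, with Stream.BeginRead/EndRead standing in as the APM API:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

static class ApmWrapper
{
    // Hand-rolled APM-to-Task wrapper: essentially what
    // Task.Factory.FromAsync does for you behind the scenes.
    public static Task<int> ReadAsyncWrapper(Stream s, byte[] buffer, int offset, int count)
    {
        var tcs = new TaskCompletionSource<int>();
        s.BeginRead(buffer, offset, count, ar =>
        {
            try { tcs.SetResult(s.EndRead(ar)); }       // complete the task
            catch (Exception ex) { tcs.SetException(ex); } // surface failures
        }, null);
        return tcs.Task;
    }
}
```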

For those projects which cannot yet take advantage of TAP, I recommend using a callback method invoked when the call completes, as described on MSDN. With this, one can handle multiple handlers, since the callback provides a unique IAsyncResult for each call so that you don’t need to keep track of them. It provides the most flexibility and the least blocking.

Ok, heading back to implementing our asynchronous event handlers, we’ll see that few changes are needed to the synchronous implementation. We now receive a Task for each Add. The Add method signature now looks as below:

public async Task AddAsync(string name, int quantity)

Notice we are returning a Task and have also marked the method as async. We also changed the method name to reflect this, as per the conventions.
One for all, all for one: if one method in a chain of calls is async, it is best to make all the methods up the chain async-await as well.

public class ProductAsyncTask
{
    public event EventHandler<ProductAsyncTaskEventArgs> Saved;

    // simulate data store
    private readonly IDictionary<string, ProductAsyncTask> _productRepository;

    public ProductAsyncTask()
    {
        _productRepository = new Dictionary<string, ProductAsyncTask>();
    }

    public string Name { get; set; }
    public int Quantity { get; set; }

    public async Task AddAsync(string name, int quantity)
    {
        if (_productRepository.ContainsKey(name))
        {
            _productRepository[name].Quantity = quantity;
            Console.WriteLine("Thread# {2}: {0} quantity updated to {1}",
                name, quantity, Thread.CurrentThread.ManagedThreadId);
            // return a completed task
            await Task.CompletedTask;
            return;
        }
        var p = new ProductAsyncTask() { Name = name, Quantity = quantity };
        _productRepository.Add(name, p);
        Console.WriteLine("Thread# {1}: New Product added - {0}",
            name, Thread.CurrentThread.ManagedThreadId);
        // wait without blocking the thread
        await OnSaved(p);
        Console.WriteLine("Thread# {1}: Completed add for {0}",
            name, Thread.CurrentThread.ManagedThreadId);
    }

    protected virtual async Task OnSaved(ProductAsyncTask p)
    {
        var multiDels = Saved as EventHandler<ProductAsyncTaskEventArgs>;
        if (multiDels == null)
        {
            await Task.CompletedTask;
            return;
        }
        // list of event subscribers
        Delegate[] delList = multiDels.GetInvocationList();
        Task[] tasks = new Task[delList.Length];
        for (int i = 0; i < delList.Length; i++)
        {
            EventHandler<ProductAsyncTaskEventArgs> e =
                delList[i] as EventHandler<ProductAsyncTaskEventArgs>;
            Task t = Task.Factory.FromAsync(
                e.BeginInvoke, e.EndInvoke, this,
                new ProductAsyncTaskEventArgs(p), null);
            tasks[i] = t;
        }
        await Task.WhenAll(tasks);
    }
}

Nothing much has changed inside the AddAsync method; we are now awaiting a task, or returning a completed task when there is nothing to continue with.
Inside OnSaved we see a little more action. We get the list of delegates from the MulticastDelegate and use Task.Factory.FromAsync to create a Task for the asynchronous call of each delegate. Had we created a Task using Task.Factory.StartNew or Task.Run and made an Invoke of each delegate, the final outcome would be the same; but that would have blocked a thread-pool thread for each handler until completion, since each task would run its handler synchronously. With FromAsync, the EndInvoke call signals completion and no thread is blocked.

The client application

var productTask = new ProductAsyncTask();
productTask.Saved += handler1.SendMail;
productTask.Saved += handler2.DoCall;
productTask.Saved += handler3.Audit;

var t1 = productTask.AddAsync("Cadbury classic", 10);
var t2 = productTask.AddAsync("Lindt Dark 80%", 10);
var t3 = productTask.AddAsync("Mars", 10);
var t4 = productTask.AddAsync("Cadbury classic", 10);
Console.WriteLine("Doing some other work here...");
try
{
    Task.WaitAll(t1, t2, t3, t4);
}
catch (AggregateException ae)
{
    foreach (var e in ae.InnerExceptions)
    {
        Console.WriteLine(e.Message);
    }
}
// unregister
productTask.Saved -= handler1.SendMail;
productTask.Saved -= handler2.DoCall;
productTask.Saved -= handler3.Audit;

I have not included any error handling, but since we have a Task returned, we can follow its exception-handling pattern: you can add a continuation task on fault, or catch the exceptions closest to their source. Also, if the subject is long-lived, remember to unregister the handlers.

So we have seen how we can relatively easily make our event handlers run asynchronously using TAP.