Strategy: Flickr - Do the Essential Work Up-front and Queue the Rest
The process:
This approach makes it much easier to bound response latencies as features scale.
Queues Give You Lots of New Knobs to Play With
As features are added, data consumers multiply, so throwing a new task into a sequential process has a good chance of blowing latency budgets. Queueing gives you much more control and flexibility over the performance of a system. With queues, some advanced strategies you have at your disposal are:
These ideas have been employed in embedded real-time systems forever, and now it seems they'll move into web services as well.
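One of the simplest of those knobs is per-task priority: latency-sensitive work jumps ahead of background churn. A minimal sketch, assuming illustrative task names and priority values (lower number = more urgent):

```python
import heapq

class PriorityTaskQueue:
    """Tasks come out lowest-priority-number first; FIFO within a priority."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker preserves insertion order

    def put(self, task, priority=10):
        heapq.heappush(self._heap, (priority, self._counter, task))
        self._counter += 1

    def get(self):
        return heapq.heappop(self._heap)[2]

q = PriorityTaskQueue()
q.put("rebuild_search_index", priority=50)   # background, can wait
q.put("send_user_notification", priority=1)  # latency-sensitive
q.put("generate_thumbnail", priority=5)
```

Other knobs follow the same pattern: throttling is a rate limit on `get`, batching is a `get` that drains several items at once, and shedding is a `put` that drops low-priority work when the heap grows too large.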
What Can You do with Your Queue?
The options are endless, but here are some uses I've seen out in the wild:
Queuing Implies an Event Driven State Machine Based Client Architecture
Moving to queuing has architectural implications. The client and server are no longer connected in a direct request-response sort of way. Instead, the server continually sends events to clients, so the client is event driven rather than request-response driven. Internally, clients often simulate the request-response model even though Ajax is asynchronous. It might be better to drop the request-response illusion and just make the client an event-driven state machine. An event can come from a request, from asynchronous jobs, or from other users performing activities that a client should see. Each client has an event channel that the system puts events on for that client to consume. The client is responsible for making sense of each event in its current context and must be able to handle any event regardless of its original source.
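The client side of this can be sketched as a small state machine draining a per-client event channel. The states, event types, and handler names here are hypothetical, chosen only to illustrate the dispatch pattern:

```python
from collections import deque

class Client:
    """An event-driven client: one channel in, handlers keyed by event type."""

    def __init__(self):
        self.state = "idle"
        self.channel = deque()  # per-client event channel the system feeds

    def deliver(self, event):
        # The server (or an async job, or another user's activity)
        # pushes events here; the client doesn't care about the source.
        self.channel.append(event)

    def pump(self):
        # Drain the channel, dispatching each event to a handler
        # named after its type; unknown events are handled gracefully.
        while self.channel:
            event = self.channel.popleft()
            handler = getattr(self, "on_" + event["type"], self.on_unknown)
            handler(event)

    def on_upload_started(self, event):
        self.state = "uploading"

    def on_upload_done(self, event):
        self.state = "idle"

    def on_unknown(self, event):
        pass  # event doesn't apply in this client's current context
```

Because dispatch keys off the event type rather than a pending request, a completion event from a background job is handled exactly the same way as a direct response would be.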
Queuing Systems
If you are in the market for a queuing system take a look at: