Load Balancing and Workload Distribution

Queues can be used to organise the distribution of a workload across multiple processors.
A common technique is to post messages to a queue, each containing a key that identifies a task to be performed, and have multiple worker processes pick these tasks off the queue and perform them in parallel.
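The pattern can be sketched in a few lines. This is a minimal, local illustration using Python's in-process `queue` and `threading` modules as stand-ins for a durable cloud queue and separate worker machines; the task keys and the "processing" step are placeholders.

```python
import queue
import threading

# Task keys are posted to a shared queue; a pool of worker threads pulls
# them off and processes them in parallel. In a cloud deployment the queue
# would be a durable storage queue and the workers separate processes.
task_queue = queue.Queue()
results = []
results_lock = threading.Lock()

def worker():
    while True:
        key = task_queue.get()
        if key is None:              # sentinel: no more work for this worker
            task_queue.task_done()
            break
        # "Perform" the task identified by the key (placeholder work).
        with results_lock:
            results.append(f"processed:{key}")
        task_queue.task_done()

# Producer posts task keys; consumers run in parallel.
for key in ["task-1", "task-2", "task-3", "task-4"]:
    task_queue.put(key)

workers = [threading.Thread(target=worker) for _ in range(2)]
for w in workers:
    w.start()
for _ in workers:
    task_queue.put(None)             # one sentinel per worker
for w in workers:
    w.join()
```

Because each worker blocks on `get()` until a message is available, adding more workers (or more machines, with a cloud queue) scales the consumption rate without changing the producer.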
This can be combined with other storage mechanisms to perform complex tasks. Say you were trying to render an animation. The queue could be loaded with keys, each pointing to an entity in Table storage. That entity could contain the URL of an image stored in Blob storage. A worker process would pick up a message from the queue, look up the entity in the table, and download the image from Blob storage. It could then perform compute-intensive work on the image while other worker processes picked up other images.
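The steps above can be traced in a sketch. Here in-memory dicts stand in for the three storage services, and the key, entity fields, and blob names are all illustrative; real code would use the cloud provider's SDK clients for queues, tables, and blobs instead.

```python
# Stand-ins for the three storage services (assumption: names and schema
# are hypothetical, chosen only to mirror the steps in the text).
queue_messages = ["frame-0042"]                       # queue: keys only

table_store = {                                       # Table storage: entities
    "frame-0042": {"blob_url": "frames/frame-0042.png"},
}

blob_store = {                                        # Blob storage: raw bytes
    "frames/frame-0042.png": b"\x89PNG...frame data...",
}

def process_next_message():
    """Dequeue a key, look up its entity, fetch the blob, and process it."""
    key = queue_messages.pop(0)                  # 1. pick up the message
    entity = table_store[key]                    # 2. reference the entity
    image = blob_store[entity["blob_url"]]       # 3. download the image
    # 4. compute-intensive rendering work on `image` would happen here
    return key, len(image)

key, size = process_next_message()
```

The queue carries only small keys; the bulky image data stays in blob storage until a worker actually needs it.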
In this way, cloud storage can be used to augment worker roles and perform powerful parallel operations.
Asynchronous Processing and Temporal Decoupling

It might sound like something you do with a sonic screwdriver, but temporal decoupling simply means that senders and receivers don't need to be sending and receiving messages at the same time. With messages stored on the queue, they can be processed when the receiver is ready to do so, and the sender is not held up.
This is useful for background tasks which are not critical to the main operation of a program. For example, rather than providing a slow response, a web application can queue up a thumbnail-generation job and respond to the user quickly. A game, which requires the utmost UI responsiveness, might store replay data asynchronously using queues.
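The thumbnail example can be sketched as follows. This is a local, single-process illustration of temporal decoupling; the handler and worker names are hypothetical, and `time.sleep` stands in for the slow image-resizing work.

```python
import queue
import threading
import time

thumbnail_jobs = queue.Queue()   # stands in for a durable cloud queue
thumbnails_done = []

def handle_upload(image_name):
    """Web handler: enqueue the slow work and respond immediately."""
    thumbnail_jobs.put(image_name)
    return f"202 Accepted: {image_name}"

def thumbnail_worker():
    """Background worker: drains the queue on its own schedule."""
    while True:
        name = thumbnail_jobs.get()
        if name is None:             # sentinel: shut down
            break
        time.sleep(0.01)             # stand-in for slow thumbnail generation
        thumbnails_done.append(name)

worker = threading.Thread(target=thumbnail_worker)
worker.start()

response = handle_upload("photo.jpg")  # returns before the thumbnail exists
thumbnail_jobs.put(None)
worker.join()
```

The sender and receiver never wait on each other: the handler's response is ready while the thumbnail is still being generated, which is exactly the decoupling the queue provides.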