This is a very common pattern: you want to upload a blob (e.g. an image or a video), but before making it available you need to process it in some way (e.g. encode or transform the image).
A recommended way of doing this is:

1. Your application adds a message to Azure Queue storage containing the blob URI and a maximum timeout to wait for the blob upload.
2. Your application then uploads the blob.
3. A worker role processes the message in the queue: if the blob is not present yet (it may still be uploading), it waits up to the conservative max timeout. Once the blob exists, it processes it (e.g. encodes the image and stores the encoded version), deletes the original blob, and deletes the message from the queue.
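The flow above can be sketched as follows. This is a minimal simulation using an in-memory dict and a `queue.Queue` as stand-ins for Blob storage and Queue storage, purely to illustrate the message ordering and timeout handling; real code would use the Azure storage clients instead, and all names here (`enqueue_then_upload`, `process_message`, the `.encoded` suffix) are hypothetical.

```python
import queue
import time

# In-memory stand-ins for Azure Blob storage and Queue storage.
blob_store = {}           # blob URI -> blob bytes
work_queue = queue.Queue()

def enqueue_then_upload(uri, data, max_timeout_s):
    # 1. Add a message with the blob URI and a max timeout to wait.
    work_queue.put({"uri": uri, "timeout": max_timeout_s})
    # 2. Then upload the blob itself.
    blob_store[uri] = data

def process_message(encode):
    msg = work_queue.get()
    deadline = time.time() + msg["timeout"]
    # 3. If the blob is not present yet, wait up to the max timeout.
    while msg["uri"] not in blob_store:
        if time.time() > deadline:
            work_queue.put(msg)   # upload never arrived; retry later
            return None
        time.sleep(0.01)
    # 4. Process the blob and store the result under a new URI.
    encoded_uri = msg["uri"] + ".encoded"
    blob_store[encoded_uri] = encode(blob_store[msg["uri"]])
    # 5. Delete the original blob, then delete the queue message.
    del blob_store[msg["uri"]]
    return encoded_uri
```

Note that the message is enqueued *before* the upload: if the uploader crashes between the two steps, the worker's timeout path simply requeues or expires the message, rather than the blob sitting in storage with no one ever processing it.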
This pattern is much better than listing your blobs or scanning your tables to discover new work. It is also widely used to keep a secondary replica of your data in a completely different region: applications such as Skype put a message in the queue, then insert into a table, and the queue worker is then responsible for storing the replica elsewhere. The queue pattern is one of the most common patterns across scalable, resilient applications.
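The replication variant follows the same shape. Below is a minimal sketch with in-memory dicts standing in for the primary and secondary region tables; the function names and the simple drain-loop worker are illustrative assumptions, not any real application's implementation.

```python
import queue

# Stand-ins for tables in two different regions.
primary_table = {}
secondary_table = {}
replication_queue = queue.Queue()

def write_entity(key, value):
    # Enqueue first, then insert into the primary table, mirroring the
    # blob pattern: the message survives even if the writer dies
    # between the two steps.
    replication_queue.put(key)
    primary_table[key] = value

def replication_worker():
    # Drain pending messages, copying each entity to the secondary region.
    while not replication_queue.empty():
        key = replication_queue.get()
        if key in primary_table:
            secondary_table[key] = primary_table[key]
```

The worker, not the write path, carries the cross-region copy, so a slow or unavailable secondary region never blocks the primary write.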