Backing up SQL Azure

Posted by Herve Roggero on Geeks with Blogs
Published on Tue, 21 Jun 2011 18:27:55 GMT

That's it!!! After many days and nights... and an amazing set of challenges, I just released the Enzo Backup for SQL Azure BETA product (http://www.bluesyntax.net). Clearly, that was one of the most challenging projects I have done so far.

Why???

Because building a highly redundant system, one that expects failures at all times during an operation that could take anywhere from a couple of minutes to a couple of hours, and that still ensures the operation completes at some point, was remarkably challenging. Some routines have more error trapping than actual code...

Here are a few things I had to take into account:

  • Exponential Backoff (explained in another post)
  • Dual dynamic determination of the number of rows to back up
  • Dynamic reduction of batch rows used to restore the data
  • Implementation of a flexible BULK Insert API that the tool could use
  • Implementation of a custom Storage REST API to handle automatic retries
  • Automatic data chunking based on blob sizes
  • Compression of data
  • Implementation of the Task Parallel Library at multiple levels including deserialization of Azure Table rows and backup/restore operations
  • Full or Partial Restore operations
  • Implementation of a Ghost class to serialize/deserialize data tables
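The exponential backoff mentioned at the top of the list is the backbone of most of the retry behavior. The post doesn't show the actual implementation, but the idea can be sketched as follows (the function name, delays, and attempt limit here are illustrative assumptions, not the tool's real settings):

```python
import random
import time

def with_backoff(operation, max_attempts=5, base_delay=1.0, max_delay=30.0):
    """Retry a flaky zero-argument callable with exponential backoff.

    Hypothetical sketch: doubles the wait after each failure, caps it,
    and adds jitter so parallel workers don't retry in lockstep.
    """
    for attempt in range(max_attempts):
        try:
            return operation()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of retries: surface the failure to the caller
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, delay / 2))
```

In a backup tool, `operation` would wrap a single storage or SQL call, so a transient throttling error costs a short pause rather than a failed multi-hour run.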

And that's just a partial list... I will explain what some of those mean in future blog posts. A lot of the complexity had to do with implementing a form of retry logic that depends on the resource and the operation.
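One concrete case of operation-specific retry logic is the "dynamic reduction of batch rows" from the list: when a bulk load fails, shrinking the batch and retrying is often enough to get past a throttled or timed-out insert. A minimal sketch of that idea, with a hypothetical `insert_batch` callable standing in for the actual bulk-insert API:

```python
def restore_rows(rows, insert_batch, batch_size=10_000, min_batch=100):
    """Insert rows in batches, halving the batch size after each failure.

    Illustrative sketch: smaller batches trade throughput for a better
    chance of completing under throttling, down to a floor (min_batch).
    """
    i = 0
    while i < len(rows):
        batch = rows[i:i + batch_size]
        try:
            insert_batch(batch)
            i += len(batch)  # advance only on success
        except Exception:
            if batch_size <= min_batch:
                raise  # already at the floor: give up and report
            batch_size = max(batch_size // 2, min_batch)
    return batch_size  # final batch size actually used
```

The same failed span of rows is simply retried at the smaller size, so progress is never lost, only slowed.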

 

© Geeks with Blogs or respective owner