Versioning millions of files with distributed SCM

Posted by C. Lawrence Wenham on Programmers
Published on 2011-02-27T19:53:19Z

I'm looking into the feasibility of using off-the-shelf distributed SCMs such as Git or Mercurial to manage millions of XML files. Each file would represent a commercial transaction, such as a purchase order, that would be updated perhaps 10 times during its lifecycle until it is "done" and changes no more.
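For concreteness, a single lifecycle update might look something like the sketch below, which drives the stock git command line from Python. The repository path, the one-file-per-transaction naming scheme, and the one-commit-per-update policy are all assumptions of mine, not a settled design:

    import subprocess
    from pathlib import Path

    REPO = Path("/var/txn-repo")  # hypothetical repository location

    def record_update(txn_id: str, xml_payload: bytes) -> None:
        """Overwrite the transaction's XML file and commit it as one revision."""
        path = REPO / f"{txn_id}.xml"  # assumed naming: one file per transaction
        path.write_bytes(xml_payload)
        subprocess.run(["git", "-C", str(REPO), "add", path.name], check=True)
        subprocess.run(
            ["git", "-C", str(REPO), "commit", "-m", f"update {txn_id}"],
            check=True,
        )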

And by "manage", I mean that the SCM would be used to not just version the files, but also to replicate them to other machines for redundancy and transfer of IP.

Let's suppose, for the sake of example, that a goal is to provide good performance if the system were handling the volume of orders that Amazon.com claimed to have at its peak in December 2010: about 150,000 orders per minute.
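Back-of-the-envelope, using the figures above and assuming one commit per update with updates spread evenly over each order's lifecycle:

    orders_per_minute = 150_000                      # claimed December 2010 peak
    new_orders_per_second = orders_per_minute / 60   # 2,500 orders/sec arriving

    # Each order is updated ~10 times before it is "done", so at steady state
    # the system absorbs roughly ten commits per arriving order.
    commits_per_second = new_orders_per_second * 10  # ~25,000 commits/sec overall

So, if that arithmetic holds, the system would need to sustain tens of thousands of commits per second in aggregate.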

We're expecting the system to be distributed over many servers in order to get reasonable performance. We're also planning to use solid-state drives exclusively.
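The obvious way to spread that load would be to shard transactions across many small repositories, one or more per server. A minimal sketch of deterministic shard assignment (the shard count and hash choice are placeholders):

    import hashlib

    NUM_SHARDS = 64  # placeholder; would be tuned to server count and load

    def shard_for(txn_id: str) -> int:
        """Map a transaction ID to a repository shard deterministically."""
        digest = hashlib.sha1(txn_id.encode("utf-8")).digest()
        return int.from_bytes(digest[:4], "big") % NUM_SHARDS

The hope is that keeping each repository small keeps per-commit cost manageable as the total file count grows.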

There is a reason why we don't want to use an RDBMS for primary storage, but it's a bit beyond the scope of this question.

Does anyone have first-hand experience with the performance of distributed SCMs under such a load, and if so, what strategies were used?

Open-source preferred, since the final product is to be FOSS, too.
