Is there a distributed VCS that can manage large files?
Posted by joelhardi on Stack Overflow, 2008-09-16
Is there a distributed version control system (git, bazaar, mercurial, darcs etc.) that can handle files larger than available RAM?
I need to be able to commit large binary files (e.g. datasets, source video/images, archives), but I don't need to diff them; I just need to commit them and then update when the file changes.
I last looked at this about a year ago, and none of the obvious candidates supported it, since they're all designed to diff files in memory for speed. That left me using a VCS for the code and something else ("asset management" software, or just rsync and scripts) for the large files, which gets pretty ugly when the directory structures of the two overlap.
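For concreteness, the kind of script I mean is roughly the sketch below (Python): it tracks large binaries by a streaming content hash and copies a snapshot only when the content has actually changed, so memory use stays constant no matter how big the file is. The store directory, chunk size, and file name are illustrative assumptions, not the layout of any existing tool.

    # Minimal sketch (not any particular VCS) of "commit and update without
    # diffing": identify large binaries by a streaming hash so nothing is
    # ever loaded whole into RAM. Paths and chunk size are assumptions.
    import hashlib
    import shutil
    from pathlib import Path

    CHUNK_SIZE = 8 * 1024 * 1024   # 8 MiB read buffer; keeps memory constant
    STORE = Path("asset-store")    # hypothetical content-addressed store

    def file_digest(path: Path) -> str:
        """Hash a file of arbitrary size in fixed-size chunks."""
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(CHUNK_SIZE), b""):
                h.update(chunk)
        return h.hexdigest()

    def commit(path: Path) -> str:
        """Store a snapshot of the file under its hash; skip if unchanged."""
        digest = file_digest(path)
        target = STORE / digest
        if not target.exists():            # copy only when the content changed
            STORE.mkdir(parents=True, exist_ok=True)
            shutil.copyfile(path, target)  # streamed copy, no full read into RAM
        return digest

    if __name__ == "__main__":
        print(commit(Path("raw-footage.mov")))  # hypothetical large binary

That works, but it's exactly the second, parallel system I'd rather not maintain alongside the VCS.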