Managing large binary files with git

Posted by pi on Stack Overflow, 2009-02-12

Hi there. I am looking for opinions on how to handle large binary files on which my source code (a web application) depends. We are currently discussing several alternatives:

  1. Copy the binary files by hand.
    • Pro: Not sure.
    • Contra: I am strongly against this, as it increases the likelihood of errors when setting up a new site or migrating the old one, and it adds another hurdle to clear.
  2. Manage them all with git.
    • Pro: Removes the possibility of 'forgetting' to copy an important file.
    • Contra: Bloats the repository, reduces flexibility in managing the code base, and makes checkouts, clones, etc. take quite a while.
  3. Separate repositories.
    • Pro: Checking out/cloning the source code stays as fast as ever, and the images are properly archived in their own repository (a rough sketch follows this list).
    • Contra: Loses the simplicity of having one and only one git repository for the project, and surely introduces other issues I haven't thought of.
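
A minimal sketch of what option 3 could look like, assuming the binaries live in a hypothetical second repository (the repository URLs and paths below are illustrative, not real):

    # clone the code and the assets as two independent repositories
    git clone git://example.com/myapp.git            # source code only, stays small
    git clone git://example.com/myapp-assets.git     # large binaries live here

    # one possible way to make the assets visible inside the app tree
    ln -s ../myapp-assets myapp/assets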

What are your experiences/thoughts regarding this?

Also: Does anybody have experience with multiple git repositories and managing them in one project?

Update: The files are images used by a program that generates PDFs containing them. The files will not change very often (as in years), but they are essential: the program will not work without them.

Update 2: I found a really nice screencast on using git-submodule at GitCasts.
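
In case it is useful to others, here is a minimal sketch of wiring an assets repository into the main one with git-submodule (the URL and path are illustrative, not taken from the screencast):

    # inside the main repository: record the assets repo as a submodule at ./assets
    git submodule add git://example.com/myapp-assets.git assets
    git commit -m "Add assets repository as a submodule"

    # after cloning the main repository somewhere else, fetch the submodule contents
    git submodule update --init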
