Tags: performance, git, mercurial, benchmarking, dvcs

git vs mercurial performance


Do any performance benchmarks exist?

I'm looking to create a repo and commit/push legacy code that runs several gigabytes deep.

Is either one faster, or does it have a smaller footprint, etc.?

I apologize if this is too vague...


Solution

  • Original Answer (March 2011, when GitHub was less than 3 years old)

    The performance that matters for a DVCS (which performs all operations locally anyway) is the performance of your daily tasks.

    The raw performance of basic operations isn't that relevant, provided you understand the limits of a DVCS: you cannot have one single repo into which you put everything (all projects, or all kinds of files, such as binaries).
    Some kind of module reorganization must take place to define the right number of repos, roughly one per "module" (coherent set of files).
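    For example, here is a minimal sketch of what such a per-module split could look like with Git submodules (the repository names and URLs below are purely hypothetical):

        # Hypothetical layout: a small "umbrella" repo referencing per-module repos,
        # instead of one multi-gigabyte repository holding everything.
        git init umbrella && cd umbrella

        # Each coherent file set (module) lives in its own repository and is pulled
        # in as a submodule; large binaries would typically stay out of Git entirely.
        git submodule add https://example.com/scm/module-core.git modules/core
        git submodule add https://example.com/scm/module-ui.git   modules/ui

        git commit -m "Reference per-module repositories as submodules"

        # A fresh clone of the umbrella repo, initializing only the module you need:
        git clone https://example.com/scm/umbrella.git
        cd umbrella
        git submodule update --init modules/core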


    Update 2018, seven years later: Windows support for Git is now a reality, and it aims at improving the performance/scalability of Git.

    To illustrate that, Microsoft has put its entire Windows codebase into one (giant) Git repository: see "The largest Git repo on the planet": 3.5M files, 300GB, 4,000 engineers producing 1,760 daily "lab builds" across 440 branches, in addition to thousands of pull request validation builds.
    But this is with the addition of GVFS (Git Virtual File System), which allows Git to dynamically download only the portions you need, based on what you use.
    This is not yet in native Git, although its integration began in Dec. 2017, with the implementation of narrow/partial cloning.
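    As a rough illustration of where that native work ended up, later Git releases expose partial clone and sparse checkout directly on the command line (the sketch below assumes a reasonably recent Git, roughly 2.25 or later, and a hypothetical repository URL):

        # Partial clone: omit all file contents (blobs) at clone time;
        # they are fetched on demand when actually needed.
        git clone --filter=blob:none --no-checkout https://example.com/scm/big-legacy.git
        cd big-legacy

        # Sparse checkout: only materialize the directories you actually work on.
        git sparse-checkout init --cone
        git sparse-checkout set src/module-a docs

        # Check out the default branch (assumed to be 'main' here).
        git checkout main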