{
  "commit": "a13e3d0ec87c9d12b93700375fceacdfb2c6885f",
  "tree": "3ba0da89780a3e44fedb043cc4e4e8d8558e13f4",
  "parents": [
    "52fe41ff1cd6e1f0b67d4e864e718d949e225f30"
  ],
  "author": {
    "name": "Derrick Stolee",
    "email": "dstolee@microsoft.com",
    "time": "Fri Sep 25 12:33:37 2020 +0000"
  },
  "committer": {
    "name": "Junio C Hamano",
    "email": "gitster@pobox.com",
    "time": "Fri Sep 25 10:53:05 2020 -0700"
  },
  "message": "maintenance: auto-size incremental-repack batch\n\nWhen repacking during the \u0027incremental-repack\u0027 task, we use the\n--batch-size option in \u0027git multi-pack-index repack\u0027. The initial setting\nused --batch-size\u003d0 to repack everything into a single pack-file. This is\nnot sustainable for a large repository. The amount of work required is\nalso likely to use too many system resources for a background job.\n\nUpdate the \u0027incremental-repack\u0027 task by dynamically computing a\n--batch-size option based on the current pack-file structure.\n\nThe dynamic default size is computed with this idea in mind for a client\nrepository that was cloned from a very large remote: there is likely one\n\"big\" pack-file that was created at clone time. Thus, do not try\nrepacking it as it is likely packed efficiently by the server.\n\nInstead, we select the second-largest pack-file, and create a batch size\nthat is one larger than that pack-file. If there are three or more\npack-files, then this guarantees that at least two will be combined into\na new pack-file.\n\nOf course, this means that the second-largest pack-file size is likely\nto grow over time and may eventually surpass the initially-cloned\npack-file. Recall that the pack-file batch is selected in a greedy\nmanner: the packs are considered from oldest to newest and are selected\nif they have size smaller than the batch size until the total selected\nsize is larger than the batch size. Thus, that oldest \"clone\" pack will\nbe first to repack after the new data creates a pack larger than that.\n\nWe also want to place some limits on how large these pack-files become,\nin order to bound the amount of time spent repacking. A maximum\nbatch-size of two gigabytes means that large repositories will never be\npacked into a single pack-file using this job, but also that repack is\nrather expensive. 
This is a trade-off that is valuable to have if the\nmaintenance is being run automatically or in the background. Users who\ntruly want to optimize for space and performance (and are willing to pay\nthe upfront cost of a full repack) can use the \u0027gc\u0027 task to do so.\n\nCreate a test for this two gigabyte limit by creating an EXPENSIVE test\nthat generates two pack-files of roughly 2.5 gigabytes in size, then\nperforms an incremental repack. Check that the --batch-size argument in\nthe subcommand uses the hard-coded maximum.\n\nHelped-by: Chris Torek \u003cchris.torek@gmail.com\u003e\nReported-by: Son Luong Ngoc \u003csluongng@gmail.com\u003e\nSigned-off-by: Derrick Stolee \u003cdstolee@microsoft.com\u003e\nSigned-off-by: Junio C Hamano \u003cgitster@pobox.com\u003e\n",
  "tree_diff": [
    {
      "type": "modify",
      "old_id": "5f877b097ad15e4861f935fd0f0bafce10b1c326",
      "old_mode": 33188,
      "old_path": "builtin/gc.c",
      "new_id": "8d22361fa9ae800dc28c02c9186ccfb9ce9157ff",
      "new_mode": 33188,
      "new_path": "builtin/gc.c"
    },
    {
      "type": "modify",
      "old_id": "a2db2291b0bd23f2e13ada83983ed6c9fc82099c",
      "old_mode": 33261,
      "old_path": "t/t7900-maintenance.sh",
      "new_id": "9e6ea23f35666410e181bb9cc236aa17b1e1229e",
      "new_mode": 33261,
      "new_path": "t/t7900-maintenance.sh"
    }
  ]
}
