When hosting Git repositories on Bitbucket there is a hard limit of 2 GB: once a repository exceeds it, you only have read-only access. To get back below the limit you can, for example, retroactively remove large folders or files from your commits. There are other cases, too, in which you have to rewrite Git's history against its nature, for instance when credentials have ended up in the history or node_modules has slipped onto master.
Bitbucket itself provides a detailed article on this. To play the whole thing through on a concrete case, we first create a new repository:
Then we clone the repository to an empty folder on the local machine:
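The clone step might look like the following sketch. So that it runs without network access, a local bare repository stands in for the Bitbucket remote (an assumption of the sketch; with Bitbucket the URL would be your own workspace and repository):

```shell
set -e
work=$(mktemp -d)
# Stand-in for the Bitbucket remote so the sketch runs offline; with
# Bitbucket this would be: git clone git@bitbucket.org:<workspace>/<repo>.git
git init -q --bare "$work/remote.git"
git clone -q "$work/remote.git" "$work/repo" 2>/dev/null
cd "$work/repo"
git remote -v   # "origin" now points at the stand-in remote
```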
Now we create two subfolders containing files with random content:
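Creating the folders could look like this. The file sizes are scaled down so the sketch runs quickly; in the article each folder holds roughly 0.9 GB, which a correspondingly larger `count` would reproduce (folder and file names are illustrative):

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com   # commit identity for the sketch
git config user.name demo
mkdir -p foo bar
# Random data is incompressible, so Git cannot shrink it in the pack.
# bs/count are scaled down here; count=900 per file would reproduce
# the ~0.9 GB folders from the article.
dd if=/dev/urandom of=foo/blob.bin bs=1M count=2 2>/dev/null
dd if=/dev/urandom of=bar/blob.bin bs=1M count=2 2>/dev/null
git add .
git commit -q -m "add foo and bar with random content"
```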
Now we push to master:
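The push might look as follows; again a local bare repository stands in for the Bitbucket remote so the sketch is self-contained:

```shell
set -e
work=$(mktemp -d)
git init -q --bare "$work/remote.git"   # stand-in for the Bitbucket remote
git clone -q "$work/remote.git" "$work/repo" 2>/dev/null
cd "$work/repo"
git config user.email demo@example.com && git config user.name demo
echo content > file.txt
git add file.txt
git commit -q -m "demo commit"
# The article pushes master; HEAD pushes whatever the current branch
# is called (master or main, depending on your Git defaults).
git push -q origin HEAD
```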
Now we have almost reached the hard limit of 2 GB on Bitbucket:
We can also check this locally (see "size-pack"):
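The local check is done with `git count-objects`; the "size-pack" line reports the on-disk size of the packfiles once the objects have been packed. A minimal, self-contained sketch:

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com && git config user.name demo
dd if=/dev/urandom of=blob.bin bs=1M count=2 2>/dev/null
git add . && git commit -q -m "demo"
git gc -q               # pack loose objects so size-pack is meaningful
git count-objects -vH   # "size-pack" is the packfile size on disk
```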
The task now is to retroactively remove "foo" from the repository and thereby roughly halve its size. To do this, we first edit the current HEAD and add the folder to .gitignore:
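Editing the current HEAD might look like this: the folder is removed from the index and ignored from now on (folder and file names are illustrative):

```shell
set -e
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com && git config user.name demo
mkdir foo && echo data > foo/file.txt
git add . && git commit -q -m "add foo"
# Ignore "foo" from now on and remove it from the current HEAD;
# --cached drops only the index entry, the working-tree copy stays.
printf 'foo/\n' >> .gitignore
git rm -r -q --cached foo
git add .gitignore
git commit -q -m "remove foo from HEAD and ignore it"
```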
Finally, we remove the folder from the entire history with the help of the BFG Repo-Cleaner (BFG requires a current JRE on the system):
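A BFG invocation could look like the following sketch. The jar path, workspace, and repository name are placeholders, and BFG is usually run against a fresh `--mirror` clone:

```shell
# Work on a fresh mirror clone (URL and jar path are placeholders).
git clone --mirror git@bitbucket.org:<workspace>/<repo>.git
java -jar bfg.jar --delete-folders foo <repo>.git

# BFG only rewrites the commits; the now-unreferenced blobs still have
# to be expired and pruned locally before the size drops:
cd <repo>.git
git reflog expire --expire=now --all
git gc --prune=now --aggressive
git push
```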
We now see the result locally:
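Since BFG itself is not reproduced here, the following self-contained sketch simulates its effect with `git filter-branch` (a stand-in, not the article's tool) and shows "size-pack" shrinking once the rewritten-away folder is pruned:

```shell
set -e
export FILTER_BRANCH_SQUELCH_WARNING=1
cd "$(mktemp -d)"
git init -q .
git config user.email demo@example.com && git config user.name demo
mkdir foo bar
dd if=/dev/urandom of=foo/blob.bin bs=1M count=2 2>/dev/null
echo keep > bar/keep.txt
git add . && git commit -q -m "initial"
git gc -q
before=$(git count-objects -v | awk '/^size-pack/ {print $2}')
# Stand-in for BFG: drop "foo" from every commit in the history.
git filter-branch -f --index-filter \
  'git rm -r -q --cached --ignore-unmatch foo' HEAD
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc -q --prune=now --aggressive
after=$(git count-objects -v | awk '/^size-pack/ {print $2}')
echo "size-pack before: ${before} KiB, after: ${after} KiB"
```

The key point the sketch demonstrates: rewriting the history alone does not free any disk space; only expiring the reflog and pruning with `git gc` removes the unreferenced blobs, which is exactly why the remote Bitbucket side stays large until a "git gc" runs there.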
On Bitbucket, however, the repository size has not changed yet, because the garbage collector has not run on the remote side, and Bitbucket does not execute a "git gc" on every push:
The support team confirms this as well:
So it is best to send a request directly to support@bitbucket.org asking them to run "git gc" on the repository manually. After a short time, the support team had done exactly that:
If you clone the repository "freshly" onto another machine, only 0.9 GB ends up on disk. If someone still has the 1.8 GB version locally, a "git pull" followed by a "git gc" is sufficient.