There have been several questions in the past about maximum Git repository sizes, and the answers mostly describe the limits that GitHub or GitLab impose (i.e. here and here and here), plus a mention of a bug in Git for Windows that limits individual files to 4 GB.
As far as I can tell, there are no questions that consider the maximum practical size of a Git repo independent of provider, i.e. if you are self-hosting and pushing via SSH.
So, for those people who are familiar with Git internals: what are the practical real-world limitations of Git if you ignore whatever rules hosting providers put in place?
(My particular use case is keeping track of a few thousand 5–20 MB image and audio files, but I am trying to keep the question generic.)
Based on my own personal experience, it is possible to use Git with a large number of large files; Git itself doesn't get noticeably slow processing them. The two gotchas I have experienced are purely practical limitations.
If you commit 10 GB of images at once, that 10 GB might take a long time to push, and an interrupted push leaves half-transferred data on the server that wastes space and won't disappear with `git gc`, so it has to be deleted manually on the server. Best to limit your commits to a few hundred MB at a time if you can, as in the sketch below.
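A rough sketch of both halves of that, assuming bash, that the files live under a hypothetical `media/` directory with no spaces in their names, and that the leftover data from an interrupted push shows up as `tmp_pack_*` files under `objects/pack` (that is what I've seen, but treat the exact path and naming as an assumption for your Git version):

```sh
# Client side: commit and push ~20 files at a time, so that at
# 5-20 MB per file each push stays in the few-hundred-MB range.
find media -type f | xargs -n 20 | while read -r batch; do
    git add $batch          # unquoted on purpose: one batch = many paths
    git commit -m "Add media batch"
    git push origin master
done

# Server side (bare repo): once you are sure no push is in progress,
# list and remove the orphaned temporary pack files by hand, since
# `git gc` won't clean them up.
ls -lh /srv/git/myrepo.git/objects/pack/tmp_pack_*
rm /srv/git/myrepo.git/objects/pack/tmp_pack_*
```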
Certain Git operations (a full repack during `git gc`, for example) need as much free disk space as the repository itself occupies, because the new pack file is written out in full before the old one is deleted. So a 20 GB repo can't be maintained on a server with less than an extra 20 GB of free space (40 GB total).
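Before running anything that repacks, it's worth checking that the math works out. A minimal sketch of that check (the `/srv/git/myrepo.git` path is just a placeholder):

```sh
# Size of the repository's object store, in human-readable units.
git -C /srv/git/myrepo.git count-objects -vH

# Free space on the volume that holds the repository.
df -h /srv/git/myrepo.git

# Only repack once free space comfortably exceeds the repo size:
# the new pack is written in full before the old one is deleted.
git -C /srv/git/myrepo.git gc
```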