I have a setup where we package source code into NuGet packages through TeamCity, to be deployed through Octopus Deploy. One folder we package is ~267 MB and contains 12,000 files across 339 folders; packaging it takes 5-6 minutes. Another folder is ~183 MB with only 297 files in 24 folders, and it takes about 35 seconds. Any idea why there is such a big difference in time when packaging these folders with NuGet? Are there any solutions to help with performance?
NuGet packages are really just zip archives under the hood (NuGet simply imposes a strict convention on the structure within the archive), so packaging performance will closely track the speed at which the files themselves can be compressed.
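You can verify this yourself, since any zip tool can open a .nupkg directly. A minimal sketch in Python (the package file name below is hypothetical):

```python
import zipfile

# A .nupkg is a standard zip archive; listing its entries needs no NuGet tooling.
# "MyPackage.1.0.0.nupkg" is a made-up name for illustration.
with zipfile.ZipFile("MyPackage.1.0.0.nupkg") as pkg:
    for name in pkg.namelist():
        print(name)
```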
Given two folders of roughly the same total size but very different file counts, all common archive formats compress the one with fewer files both faster and more effectively (producing a smaller resulting archive).
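To see the effect in isolation, here is a minimal benchmark sketch in Python that zips the same total payload twice: once as a handful of large files and once as thousands of small ones. The names and sizes are made up for illustration; on a typical machine the many-file run is slower and typically yields a larger archive, since each entry gets its own deflate stream and headers.

```python
import io
import time
import zipfile

def make_payload(num_files, total_bytes):
    """Split a compressible text payload into num_files equal in-memory chunks."""
    chunk = total_bytes // num_files
    blob = (b"public void Method() { return; }\n" * (chunk // 33 + 1))[:chunk]
    return [(f"src/file_{i}.cs", blob) for i in range(num_files)]

def time_zip(files):
    """Zip the given (name, data) pairs; return elapsed seconds and archive size."""
    buf = io.BytesIO()
    start = time.perf_counter()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for name, data in files:
            zf.writestr(name, data)
    return time.perf_counter() - start, buf.tell()

TOTAL = 50 * 1024 * 1024  # same 50 MB total payload in both runs

for count in (10, 10_000):
    elapsed, size = time_zip(make_payload(count, TOTAL))
    print(f"{count:>6} files: {elapsed:6.2f}s, archive {size / 2**20:.1f} MB")
```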
If you want to get into the low-level details: imagine defragmenting your hard disk. Each individual file is made contiguous, but with lots of files the data is still scattered all over the disk, so reading many small files adds seek overhead on top of the compression work itself.
Per-file operations (opening a handle, reading metadata, starting a new compression stream) cost only a tiny fraction of the time spent on a single large file, but they add up once you are dealing with many files: even ~25 ms of fixed overhead per file comes to roughly five minutes across 12,000 files.
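You can put a rough lower bound on just the archive bookkeeping with a sketch like this (in-memory only, so it deliberately excludes disk seeks, handle opening, and antivirus scanning, which usually dominate in practice):

```python
import io
import time
import zipfile

def per_entry_overhead(n=10_000):
    """Estimate the fixed cost of adding one (empty) entry to a zip archive."""
    buf = io.BytesIO()
    start = time.perf_counter()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        for i in range(n):
            zf.writestr(f"empty_{i}.txt", b"")  # zero bytes of payload
    return (time.perf_counter() - start) / n

# Pure bookkeeping floor for 12,000 entries; real per-file I/O costs sit on top.
print(f"~{per_entry_overhead() * 12_000:.2f}s for 12,000 empty entries")
```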
All in all, it is pretty much a truism in software development that "chunky" beats "chatty": one uninterrupted operation over a large chunk of data is more efficient than many small operations.