I'm working on a project that has several Azure-based applications, as well as several Windows services, etc. Needless to say, it's a bunch of individual applications that are deployed out to Azure, or elsewhere, and are all expected to work together.
We use Nuget for our underlying library project versioning. Every feature or change results in a bump to the Nuget version, a package published to our private Nuget server, and a subsequent update to every other application that needs it. This is currently a tedious manual task, but it is not even my most immediate source of frustration.
The thing I struggle with most right now is developing a feature that requires changes across the entire set of applications, from bottom to top, and having to constantly push out new Nuget packages and update them everywhere just to develop and debug.
Prior to using Nuget, we would have just added all of these projects as direct references on disk, which loses versioning but instantly lets me develop against my local changes.
Now with Nuget, I can't develop against local changes without pushing out a new package.
Is there a workflow I'm missing that would let me keep using Nuget, but still make changes and work locally without having to push and pull Nuget packages all the time?
Can I develop against local projects, but somehow still have the project dependencies know to use the Nuget packages?
Given the projects are all in the same repo, just use project references instead of package references.
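For example, in the consuming project's .csproj you would swap the PackageReference for a ProjectReference (the project/package names and relative path below are placeholders; adjust them to your repo layout):

```xml
<!-- Before: consuming the library as a package from the private Nuget feed -->
<ItemGroup>
  <PackageReference Include="MyCompany.CoreLibrary" Version="1.2.3" />
</ItemGroup>

<!-- After: referencing the library project directly within the same repo,
     so local changes are picked up immediately when you build and debug -->
<ItemGroup>
  <ProjectReference Include="..\MyCompany.CoreLibrary\MyCompany.CoreLibrary.csproj" />
</ItemGroup>
```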
When you pack a project, NuGet converts its project references into NuGet package dependencies, and each dependency version matches whatever version the referenced project has if/when it is packed itself.
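Roughly speaking, the .nuspec generated inside the package produced by `dotnet pack` will contain a dependency entry like the sketch below (the version and target framework are illustrative; the version comes from the referenced project's Version property, which defaults to 1.0.0 if never set):

```xml
<!-- Illustrative fragment of the generated .nuspec for the packed project -->
<dependencies>
  <group targetFramework="net6.0">
    <!-- The ProjectReference becomes a normal package dependency,
         using the referenced project's own package id and version -->
    <dependency id="MyCompany.CoreLibrary" version="1.2.3" />
  </group>
</dependencies>
```

So during development you build and debug against the local source directly, and the package versions only come back into play when you pack and publish.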