I'm building a console app that is designed to scale in an Azure Batch Service, but I can't seem to figure out how I can update the batch service application programmatically.
I've been looking at the API reference at https://learn.microsoft.com/en-us/rest/api/batchmanagement/ which has options to create, update, and modify applications for the batch service... but I don't see how to actually push the new binaries to the batch service, or how to tell the batch service the blob location of the new binaries. Am I missing something obvious?
Cool, I think it would help if you added more information about the specific design scenario you have. I've linked the relevant API below and detailed a few things that might help with understanding, and sorry in advance if I missed anything obvious :)
Down below, as an example, I came up with one possible scenario and answered it as a thinking-out-loud process.
With respect to the question: how are your new binaries generated, and do you manage them on the fly?
For the create, update, and delete operations on applications, please refer to the following doc: https://learn.microsoft.com/en-us/rest/api/batchmanagement/application
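To make the binary-upload part concrete: as far as I understand the Application Package flow in that API, the create call returns a blob SAS URL (properties.storageUrl) that you upload the zipped binaries to, and then you activate that version. Here's a minimal sketch in Python; the subscription/account names and the api-version are assumptions on my part, so double-check them against the doc above.

```python
# Hedged sketch: push a new application package version via the Batch Management
# REST API. All <...> values are placeholders and the api-version is an assumption.
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
BATCH_ACCOUNT = "<batch-account>"
APP_NAME = "myconsoleapp"        # hypothetical application name
VERSION = "2.0"                  # hypothetical package version
API_VERSION = "2022-10-01"       # assumption; may differ

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

base = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION_ID}"
        f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Batch"
        f"/batchAccounts/{BATCH_ACCOUNT}/applications/{APP_NAME}/versions/{VERSION}")

# 1. Create (or update) the package version; the response carries a blob SAS URL
#    (properties.storageUrl) that the binaries should be uploaded to.
create = requests.put(f"{base}?api-version={API_VERSION}", headers=headers, json={})
create.raise_for_status()
storage_url = create.json()["properties"]["storageUrl"]

# 2. Upload the zipped binaries straight to that SAS URL as a block blob.
with open("myconsoleapp.zip", "rb") as f:
    requests.put(storage_url, data=f,
                 headers={"x-ms-blob-type": "BlockBlob"}).raise_for_status()

# 3. Activate the version so the Batch service can deploy it to pool nodes.
requests.post(f"{base}/activate?api-version={API_VERSION}",
              headers=headers, json={"format": "zip"}).raise_for_status()
```

Once the version is activated, pools or tasks that reference that application by id and version pick up the new package.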
Coming back to the "more information" part (the reason I ask is to get clarity on how the app is designed to utilize resources, etc.): there is a really good overall take on resource management and related concepts here: https://learn.microsoft.com/en-us/azure/batch/batch-api-basics
More information like this should also be thought through for scaling reasons. Here's a sample scenario that comes to my mind. Say, for example:
My pool is running 20 tasks, and task 1 generates a binary that will get used in task 4. Two concepts come to mind here: first, my tasks are dependent tasks, so look into the task dependencies concept; second, I can upload files from my tasks using the output files concept, and then consume the newly uploaded files in the later task (see the sketch below).
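To sketch that flow out (in Python here just for illustration, using the azure-batch SDK): the job opts into task dependencies, task 1 uploads its binary through the output-files feature, and task 4 declares a dependency on task 1 plus a resource file pointing at the uploaded blob. All account names, SAS URLs, and file names below are made-up placeholders, and the exact model names can differ between SDK versions.

```python
# Rough sketch of "task 1 produces a binary that task 4 consumes".
from azure.batch import BatchServiceClient
from azure.batch import models as batchmodels
from azure.batch.batch_auth import SharedKeyCredentials

credentials = SharedKeyCredentials("mybatchaccount", "<account-key>")   # hypothetical
client = BatchServiceClient(
    credentials, batch_url="https://mybatchaccount.eastus.batch.azure.com")

CONTAINER_SAS_URL = "https://mystorage.blob.core.windows.net/binaries?<sas>"  # hypothetical

# The job must opt in to task dependencies.
client.job.add(batchmodels.JobAddParameter(
    id="binary-pipeline",
    pool_info=batchmodels.PoolInformation(pool_id="mypool"),
    uses_task_dependencies=True))

# Task 1 builds the binary and uploads it to blob storage via output files.
client.task.add("binary-pipeline", batchmodels.TaskAddParameter(
    id="task1",
    command_line="/bin/bash -c 'build.sh'",
    output_files=[batchmodels.OutputFile(
        file_pattern="mybinary",
        destination=batchmodels.OutputFileDestination(
            container=batchmodels.OutputFileBlobContainerDestination(
                container_url=CONTAINER_SAS_URL)),
        upload_options=batchmodels.OutputFileUploadOptions(
            upload_condition=batchmodels.OutputFileUploadCondition.task_success))]))

# Task 4 waits for task 1, downloads the uploaded binary, and runs it.
client.task.add("binary-pipeline", batchmodels.TaskAddParameter(
    id="task4",
    command_line="/bin/bash -c 'chmod +x mybinary && ./mybinary'",
    depends_on=batchmodels.TaskDependencies(task_ids=["task1"]),
    resource_files=[batchmodels.ResourceFile(
        http_url="https://mystorage.blob.core.windows.net/binaries/mybinary?<sas>",  # hypothetical blob SAS
        file_path="mybinary")]))
```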
Hope this helps. Thanks!