I am trying to figure out where the time goes in my application.
ServerA sends a request to ServerB using the following function:
protected async Task<T> SendRequest<T>(HttpRequestMessage request)
{
    using (var telemetryRequest = _telemetry.ExternalRequest("Outgoing call -> ", request.RequestUri.OriginalString, "", false)) // Creates a DependencyTelemetry
    {
        using (var response = await Client.SendAsync(request))
        {
            var data = await response.Content.ReadAsStringAsync();
            var returnObject = JsonConvert.DeserializeObject<T>(data, new JsonSerializerSettings
            {
                Error = HandleDeserializationError
            });
            return returnObject;
        }
    }
}
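For context, here is roughly what I imagine the ExternalRequest wrapper does, assuming it is built on TelemetryClient.StartOperation; the class and body below are a sketch, not our actual code:

```csharp
using System;
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.DataContracts;

public class Telemetry
{
    private readonly TelemetryClient _client;

    public Telemetry(TelemetryClient client) => _client = client;

    // Hypothetical wrapper matching the call site above.
    public IDisposable ExternalRequest(string name, string target, string data, bool success)
    {
        // StartOperation begins timing; disposing the returned operation
        // stops the clock and sends the DependencyTelemetry item.
        var operation = _client.StartOperation<DependencyTelemetry>(name + target);
        operation.Telemetry.Target = target;
        operation.Telemetry.Data = data;
        return operation;
    }
}
```

One thing worth noting if it does work like this: the recorded duration covers everything inside the using block in SendRequest, so it includes reading the response body and the JSON deserialization, not just the network call.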
The call is then picked up in ServerB using a custom middleware:
public class MyCustomkMiddleware
{
    public async Task InvokeAsync(HttpContext context)
    {
        using (var request = _telemetryClient.InCommingRequest($"Incoming request: {context.Request.Method}")) // Creates a RequestTelemetry
        {
            await _next.Invoke(context).ConfigureAwait(false);
        }
    }
}
The middleware is configured during startup:
public void Configure(IApplicationBuilder app, ContextInitializer contextInitializer)
{
    app.UseMiddleware<MyCustomkMiddleware>();
}
If you look at the log, you can see that the call takes a lot of time, but I don't really understand why. My guess is that there is high startup and shutdown overhead because of something silly we do. How can I narrow it down or find the problem?
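One way I considered narrowing it down was bracketing the stages of the outgoing call with a Stopwatch to see whether the time goes into waiting for the response headers or into downloading the body. A simplified sketch (Console.WriteLine stands in for whatever logger is actually used):

```csharp
using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

public static class RequestTimer
{
    private static readonly HttpClient Client = new HttpClient();

    public static async Task<string> TimedSend(HttpRequestMessage request)
    {
        var sw = Stopwatch.StartNew();
        using (var response = await Client.SendAsync(request))
        {
            var headersMs = sw.ElapsedMilliseconds; // time until response headers arrived
            var data = await response.Content.ReadAsStringAsync();
            var bodyMs = sw.ElapsedMilliseconds - headersMs; // time spent downloading the body
            Console.WriteLine($"headers: {headersMs} ms, body: {bodyMs} ms");
            return data;
        }
    }
}
```

If the headers phase dominates, the time is being spent on the server (or in connection setup) rather than in transferring or deserializing the payload.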
I never figured out how to debug this step by step, but the slowdown was caused by registering our services with services.AddScoped. Changing it to services.AddSingleton() greatly improved the performance! Presumably the scoped services were doing expensive work in their constructors, which ran on every request, whereas a singleton is constructed only once.
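For reference, the change amounted to something like this in ConfigureServices; IMyService and MyService are placeholder names, not our real types:

```csharp
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Before: a new MyService instance was constructed for every request,
        // so any expensive constructor work ran on each incoming call.
        // services.AddScoped<IMyService, MyService>();

        // After: a single instance is constructed once and reused for the
        // lifetime of the application.
        services.AddSingleton<IMyService, MyService>();
    }
}
```

Note that this only works if the service is safe to share across requests; a singleton must be thread-safe and must not capture per-request state.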