asp.net-core, signalr, azure-web-app-service, cloudflare

Long polling responds with 502 Bad Gateway on the ninth request


I am debugging an Azure architecture in which a blob storage account makes CORS requests to an App Service. The connection is configured to use SignalR Core's long polling transport. The CORS configuration appears to work up until the ninth negotiation request, which fails with a 502 Bad Gateway error. The DNS records route through Cloudflare, which provides DDoS protection / rate limiting.

My instinct tells me this is caused by an infinite loop during JSON serialization. After the server reboots (or whatever happens on its side), it stops adding the Access-Control-Allow-Origin header that it attached to the previous eight requests.

I need help establishing a consistent streamed connection with the tools mentioned above, which means fixing the bad gateway error.

SignalR server configuration

https://github.com/DoubleCouponDay/portfolio/blob/master/server/portfolio/Startup.fs
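For context, here is a minimal F# sketch of the kind of setup involved, not the actual contents of the linked Startup.fs: a CORS policy for the storage origin plus a hub mapped with the long polling transport. The MusicHub type, the policy name, and the origin URL are placeholders.

    open Microsoft.AspNetCore.Builder
    open Microsoft.AspNetCore.Http.Connections
    open Microsoft.AspNetCore.SignalR
    open Microsoft.Extensions.DependencyInjection

    // Placeholder hub; the real one lives in the linked repository.
    type MusicHub() =
        inherit Hub()

    type Startup() =
        member _.ConfigureServices(services: IServiceCollection) =
            services.AddCors(fun options ->
                options.AddPolicy("clientPolicy", fun policy ->
                    policy.WithOrigins("https://<storage-account>.blob.core.windows.net")
                          .AllowAnyHeader()
                          .AllowCredentials() // SignalR sends credentials with cross-origin negotiate requests
                    |> ignore))
            |> ignore
            services.AddSignalR() |> ignore

        member _.Configure(app: IApplicationBuilder) =
            app.UseRouting()
               .UseCors("clientPolicy") // CORS must sit between routing and endpoint execution
               .UseEndpoints(fun endpoints ->
                   endpoints.MapHub<MusicHub>("/music", fun options ->
                       // Restrict the hub to the long polling transport
                       options.Transports <- HttpTransportType.LongPolling)
                   |> ignore)
               |> ignore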

SignalR client configuration

https://github.com/DoubleCouponDay/portfolio/blob/master/client/src/app/services/music.service.ts



Solution

  • In general, most people get a 502 Bad Gateway error with SignalR because the server is crashing. There is no single underlying error that will match the situation of every reader of this answer, so treat a bad gateway as a very generic sign that a request crashed on the SignalR server.

    In my case, it was caused by a WindowsCryptographicException thrown while reading an X509 certificate from a file. The error did not propagate to the SignalR client or to the Azure logs, so remote debugging was necessary to detect it.

    The solution was to pass the X509KeyStorageFlags.MachineKeySet flag to my X509Certificate constructor instead of the other choices, because the server stores the certificate in the local machine store, not the user store. See the sketch below.
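
    A minimal sketch of that fix, assuming the certificate is loaded from a .pfx file (the path, password, and the X509Certificate2 type are illustrative, not the exact code from my project):

        open System.Security.Cryptography.X509Certificates

        let loadCertificate (pfxPath: string) (pfxPassword: string) =
            // MachineKeySet imports the private key into the local machine store.
            // The default behaviour uses the user profile key store, which is not
            // loaded for the App Service worker process and is what triggered the
            // WindowsCryptographicException in my case.
            new X509Certificate2(pfxPath, pfxPassword, X509KeyStorageFlags.MachineKeySet)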