We are uploading shapefile zips to our GeoServer via REST requests from our ASP.NET application. Small .zip files (~2MB) work fine, but anything bigger (we have one .zip that is ~70MB) will not. WebRequest.GetResponse() returns a 502 Bad Gateway, and reading the response stream gives this error:
"The specified CGI application encountered an error and the server terminated the process."
This is the code that creates the request and gets the response:
// Build the PUT request to the GeoServer REST endpoint
WebRequest request = WebRequest.Create(URL);
request.ContentType = "application/xml";
request.Method = "PUT";
request.Credentials = new NetworkCredential(username, pswd);

// Write the zip bytes into the request body
Stream requestStream = request.GetRequestStream();
requestStream.Write(contentBytes, 0, contentBytes.Length);
requestStream.Close();

// This is where the 502 comes back for the larger zips
WebResponse response = request.GetResponse();
The REST call is to /workspaces/{workspaceName}/datastores/{storeName}/{method}.{format}, which uploads files to the specified datastore and creates it if it doesn't exist.
contentBytes is a byte[] built by copying the uploaded HttpPostedFile zip through a MemoryStream.
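Roughly, it is produced by something like this (postedFile is a placeholder name for the HttpPostedFile, not our exact code):

// postedFile is the HttpPostedFile from the upload (placeholder name)
byte[] contentBytes;
using (var ms = new MemoryStream())
{
    postedFile.InputStream.CopyTo(ms);
    contentBytes = ms.ToArray();
}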
Like I said, this works with smaller zips. Googling the error suggests it comes from Azure or IIS; however, none of the proposed solutions fit our case (they involve altering the web.config in ways that either don't apply to our application or don't work), as this is not a .NET Core application.
This question seems to describe a similar issue, but with different error messages. Upping the timeout has not helped, either in the C# code or in the web.config.
There doesn't appear to be anything wrong with the shapefiles themselves; other shapefile viewers handle them without issue.
Our best guess ended up being that we were hitting some sort of body size limit on REST requests somewhere, without getting a helpful error message back or any error in the GeoServer logs. What we ended up doing, in short, was switching from uploading the file through the GeoServer REST API (easy and clean) to uploading it manually through FTP and then referencing the URL of the newly uploaded files in the REST request (messy and complicated):
1. In our C# code, make an FTP request (getting our FTP credentials from Azure) to create a directory in the "data" folder for the new files. This is where the importer was putting the files when we uploaded via REST.
2. Make an FTP request for each file in the .zip, uploading them to the new directory.
3. Make a REST call to import the layer (using /workspaces/{workspaceName}/datastores/{storeName}/{method}.{format}), but instead of sending the file in the body, send the URL of the .shp file that was uploaded (a rough sketch of these steps follows below).
4. If any step fails, roll back: delete the files in the FTP directory and then delete the directory itself.
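Below is a condensed sketch of that flow, not our production code. All names, paths, and credentials are placeholders, rollback is omitted, and the "external.shp" method with a file:// body is an assumption; use whatever {method}.{format} and URL form your GeoServer setup expects.

using System.Collections.Generic;
using System.IO;
using System.Net;
using System.Text;

class ShapefileUploadSketch
{
    static void UploadViaFtpThenImport(
        string ftpBase,                             // e.g. "ftp://<azure-ftp-host>/site/wwwroot/data" (placeholder)
        NetworkCredential ftpCreds,                 // FTP credentials pulled from Azure
        string newDirName,                          // directory to create for this upload
        IDictionary<string, byte[]> extractedFiles, // file name -> bytes extracted from the zip
        string geoserverRestUrl,                    // e.g. "http://host/geoserver/rest" (placeholder)
        NetworkCredential geoserverCreds,
        string workspaceName,
        string storeName,
        string shpFileName)
    {
        // 1. Create the directory under the GeoServer data folder via FTP
        string dirUrl = ftpBase + "/" + newDirName;
        var mkdir = (FtpWebRequest)WebRequest.Create(dirUrl);
        mkdir.Method = WebRequestMethods.Ftp.MakeDirectory;
        mkdir.Credentials = ftpCreds;
        using (mkdir.GetResponse()) { }

        // 2. Upload each file from the zip into that directory
        foreach (var file in extractedFiles)
        {
            var upload = (FtpWebRequest)WebRequest.Create(dirUrl + "/" + file.Key);
            upload.Method = WebRequestMethods.Ftp.UploadFile;
            upload.Credentials = ftpCreds;
            using (Stream s = upload.GetRequestStream())
            {
                s.Write(file.Value, 0, file.Value.Length);
            }
            using (upload.GetResponse()) { }
        }

        // 3. Import via REST, sending the location of the .shp instead of the file itself.
        //    "external.shp" and the file:// path are assumptions about the setup.
        WebRequest import = WebRequest.Create(
            geoserverRestUrl + "/workspaces/" + workspaceName +
            "/datastores/" + storeName + "/external.shp");
        import.Method = "PUT";
        import.ContentType = "text/plain";
        import.Credentials = geoserverCreds;
        byte[] body = Encoding.UTF8.GetBytes("file:///data/" + newDirName + "/" + shpFileName);
        using (Stream s = import.GetRequestStream())
        {
            s.Write(body, 0, body.Length);
        }
        using (import.GetResponse()) { }
    }
}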