Tags: json, sockets, asp.net-web-api, adobe-indesign, extendscript

InDesign Socket HTTP response is missing sections


I am automating Adobe InDesign to create documents using JSON data gathered from a web API with a SQL Server backend. I am using the ExtendScript Socket object to make an HTTP/1.0 call to our server. Sometimes the response received is missing about 1,700 characters from various points within the JSON string, yet when I call the same API endpoint using curl or Postman I get a complete and valid response.

The response should be about 150k characters long, and I'm reading it with conn.read(99999999). The end of the string also looks correct, so I don't believe this is a truncation problem.
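
(For reference, the usual alternative to one oversized read is to accumulate the response in chunks until the server closes the connection. The loop below is only an illustrative sketch using the same conn Socket object as in the function further down; it is not the code that produced the behavior described here.)

   // Illustrative sketch only: accumulate the response in chunks
   // until the server closes the connection.
   var response = "";
   while (conn.connected && !conn.eof) {
      var chunk = conn.read(65536);   // read up to 64 KB per call
      if (chunk.length === 0) {
         break;                       // nothing left to read
      }
      response += chunk;
   }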

The problem only seems to occur when I request a UTF-8 encoding. If I request ASCII I get a complete and valid response, but various Unicode characters are missing. If I request BINARY I get a complete and valid response, but ExtendScript treats each byte of a multi-byte UTF-8 sequence as a separate character rather than as the single Unicode character we want to display.
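
To illustrate what I mean by "individual bytes": with BINARY each byte of a multi-byte UTF-8 sequence arrives as a separate character, so something along the lines of the simplified decoder sketched below would be needed to reassemble them. (The decodeUtf8 name is mine, this is not the code I ended up using, and it skips validation of continuation bytes.)

   // Simplified sketch: decode a BINARY (one character per byte)
   // string as UTF-8. No validation of malformed sequences.
   function decodeUtf8(bytes) {
      var out = "";
      var i = 0;
      while (i < bytes.length) {
         var b = bytes.charCodeAt(i) & 0xFF;
         var cp, extra;
         if (b < 0x80)      { cp = b;        extra = 0; }  // 1-byte (ASCII)
         else if (b < 0xE0) { cp = b & 0x1F; extra = 1; }  // 2-byte sequence
         else if (b < 0xF0) { cp = b & 0x0F; extra = 2; }  // 3-byte sequence
         else               { cp = b & 0x07; extra = 3; }  // 4-byte sequence
         for (var j = 0; j < extra; j++) {
            cp = (cp << 6) | (bytes.charCodeAt(i + 1 + j) & 0x3F);
         }
         i += extra + 1;
         if (cp > 0xFFFF) {
            // represent astral code points as a UTF-16 surrogate pair
            cp -= 0x10000;
            out += String.fromCharCode(0xD800 | (cp >> 10), 0xDC00 | (cp & 0x3FF));
         } else {
            out += String.fromCharCode(cp);
         }
      }
      return out;
   }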

Here is an illustration of the behavior I'm seeing, using bogus data...

"Expected" response...

[{"Id":1, "name":"Random Name", "Text":"A bunch of text", "AnotherId": 1}]

"Actual" response...

[{"Id":1, "name":"Random Name", "Text":"A bunc": 1}]

The problem first manifested itself as a JSON2 parsing error, for obvious reasons, but the root cause seems to be that parts of the data are going missing in transit.

So far we've only seen this problem when making the call using the InDesign Socket object, and not every response exhibits this behavior.

Any help or insights you could offer would be appreciated.

Here is the function I'm using to fetch the data...

function httpRequest(url, encoding) {
   try {
      var response = "";

      var hostName = getHostFromUrl(url);
      var pathAndQuery = getPathAndQueryFromUrl(url);

      var httpGet = "GET ";
         httpGet += pathAndQuery;
         httpGet += " HTTP/1.0\r\nHost: ";
         httpGet += hostName;
         httpGet += "\r\n";

      var conn = new Socket;

      conn.timeout = 30;
      //conn.encoding = encoding || "UTF-8";
      //conn.charset = "UTF-16";

      if (conn.open(hostName + ":80", encoding || "UTF-8")) {
         // send a HTTP GET request
         conn.writeln(httpGet);

         // and read the server's response
         response = conn.read(99999999);

         conn.close();
      }

      return parseHttpResponse(response);
   }
   catch (e) {
      $.writeln(e);
      $.global.alert("There was a problem making an HTTP Request: " + e);
      return null;
   }
}
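
For context, the function is called with the full URL and an optional encoding, roughly like this (the endpoint shown is made up, and JSON.parse assumes json2.js has been loaded):

   // Hypothetical call -- the real endpoint differs.
   var json = httpRequest("http://api.example.com/documents/123", "UTF-8");
   if (json !== null) {
      var data = JSON.parse(json);   // this is where the parsing error surfaced
   }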

Solution

  • It turns out my handling of the HTTP response was too simplistic and needed extra logic to handle Unicode characters properly.

    The solution, in my case, was to use the GetURL method made available by Kris Coppieter here.