I am using com.opencsv.CSVReader to read a CSV from a URL served by my nginx web server. The CSV file content is exactly this:
0.999,1.399,1.799,2.199,2.599,2.999,3.399,3.799,4.199,4.599,4.999,5.399
The problem is that when I read the file I get no csv values:
try (InputStreamReader in = new InputStreamReader(new URL(...csv).openStream(), "UTF-8");
     CSVReader r = new CSVReader(in)) {
    List<String[]> csv = r.readAll();
    ....
}
After this point, csv contains no data (but no error is raised): csv.get(0).length is incorrectly 1, and that single element is an empty string. I do not know why. Tests that I have done to isolate the problem:
I have used a BufferedReader to see what was in the in stream before reading the CSV, and I got:
Date: Fri, 10 Aug 2018 03:11:13 GMT
Content-Type: text/plain
Content-Length: 71
Last-Modified: Wed, 12 Jul 2017 16:23:24 GMT
Connection: keep-alive
ETag: "..."
Strict-Transport-Security: max-age=31536000
X-Frame-Options: DENY
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Accept-Ranges: bytes
0.999,1.399,1.799,2.199,2.599,2.999,3.399,3.799,4.199,4.599,4.999,5.399
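For reference, the dump above came from roughly this check (a minimal sketch; the class name and URL below are placeholders, not the actual ones):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class StreamDump {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: substitute the real CSV location served by nginx.
        URL url = new URL("https://example.com/values.csv");
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(url.openStream(), StandardCharsets.UTF_8))) {
            String line;
            // Print the stream exactly as openStream() delivers it; against the
            // misbehaving server this showed the header block as well as the data.
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}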
So I am not sure if the headers were also read by the csv reader and messed with it somehow.
I have used the same code to access an external CSV instead of the one served by my nginx and it worked fine; the BufferedReader test with the external CSV did not show any headers, just the CSV values.
That is why I am also considering that it could be an nginx problem: I recently had an issue where file downloads never finished, and I had to change my previous keepalive_timeout 65 (which had been working fine) to keepalive_timeout 0. I am not sure why this happened all of a sudden.
This was very hard to debug, but the solution was to reinstall nginx because my copy was corrupted. My nginx instance was sending extra \r characters that did not interfere with curl or similar tools, but they disrupted the Java implementation.
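One way to confirm this kind of corruption is to count the \r bytes that arrive in the body; the sketch below does that (the URL is a placeholder, and a clean single-line CSV like the one above should contain at most one \r from a CRLF line terminator):

import java.io.InputStream;
import java.net.URL;

public class CarriageReturnCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: point this at the CSV served by the suspect nginx instance.
        URL url = new URL("https://example.com/values.csv");
        int crCount = 0;
        try (InputStream in = url.openStream()) {
            int b;
            while ((b = in.read()) != -1) {
                if (b == '\r') {
                    crCount++;
                }
            }
        }
        // More than one \r for this single-line file points at the server, not the parser.
        System.out.println("\\r bytes seen in body: " + crCount);
    }
}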
By the way, this probably highlighted a bug in the .openStream() implementation.
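If .openStream()'s handling of the malformed header terminators really is at fault, one way to sidestep it (an alternative sketch on my part, not something verified in the post above) is to let java.net.http.HttpClient (Java 11+) perform the transfer and hand only the body to opencsv; the URL and class name are placeholders:

import com.opencsv.CSVReader;
import java.io.StringReader;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.List;

public class HttpClientFetch {
    public static void main(String[] args) throws Exception {
        // Placeholder URL: substitute the real CSV location.
        HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/values.csv")).build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        // The HTTP client separates headers from the body, so only CSV text reaches the parser.
        try (CSVReader r = new CSVReader(new StringReader(response.body()))) {
            List<String[]> csv = r.readAll();
            System.out.println("Columns in first row: " + csv.get(0).length);
        }
    }
}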