
Can't read_xml or xmlParse from the URL, but when downloaded manually the XML is fine and can be loaded by R


I am trying to read this XML in R:

library(httr)
library(xml2)
library(magrittr)

url <- 'https://fnet.bmfbovespa.com.br/fnet/publico/downloadDocumento?id=155020&cvm=true&#toolbar=0'
xml <- url %>% httr::GET(httr::config(ssl_verifypeer = FALSE, ssl_verifyhost = FALSE)) %>% read_xml()

The output of this is:

Error in read_xml.raw(content, encoding = encoding, base_url = if (nzchar(base_url)) base_url else x$url,  : 
  Start tag expected, '<' not found [4]

When I try the getURL route from RCurl instead:

library(RCurl)

xdata <- getURL(url, ssl.verifypeer = FALSE)
doc <- XML::xmlTreeParse(xdata)

The output is:

Error: XML content does not seem to be XML:

And trying this:

h <- curl::new_handle()
curl::handle_setopt(h, ssl_verifyhost = 0, ssl_verifypeer = 0)
curl::curl_download(url = url, destfile = "file_test.html", handle = h)

This just downloads one long string into the file, the same content that xmlTreeParse could not parse as XML.

All of this seems to point to badly formed XML. However, when I download the XML manually, the resulting file is read by read_xml() with no problem, and I can view it in my browser just like any other XML.
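To confirm, I peeked at the raw response body; it is one long run of printable characters rather than a document starting with '<', which matches the parser error above (a minimal check, using the same SSL options as before):

library(httr)

resp <- GET(url, config(ssl_verifypeer = FALSE, ssl_verifyhost = FALSE))
body <- content(resp, as = "text", encoding = "UTF-8")
# An XML payload would begin with "<"; here the body is plain text instead.
substr(body, 1, 60)
startsWith(trimws(body), "<")   # FALSE, consistent with "Start tag expected, '<' not found"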

How can I work around this? This is one of many URLs in a scraper, and all of them have the same issue.


Solution

  • the website seems to serve the document base64-encoded rather than as plain XML (base64 is an encoding, not encryption), so the response body needs to be decoded before parsing:

    library(httr)
    library(xml2)
    library(magrittr)
    
    url <- 'https://fnet.bmfbovespa.com.br/fnet/publico/downloadDocumento?id=155020&cvm=true&#toolbar=0'
    
    xml <- url %>% 
      httr::GET(httr::config(ssl_verifypeer = FALSE, ssl_verifyhost = FALSE)) %>% 
      httr::content(as = "text") %>%   # the body arrives as base64 text
      jsonlite::base64_dec() %>%       # decode to a raw byte vector
      rawToChar() %>%                  # convert the bytes to an XML string
      read_xml()
    

    You might need to install jsonlite, but this seems to work for me.
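
    Since this is one of many URLs in the scraper, you can wrap the same pipeline in a small helper and apply it across the whole list (a sketch; urls below stands in for your character vector of document links):

    library(httr)
    library(xml2)
    library(magrittr)
    
    # Hypothetical helper: fetch one document and decode it to parsed XML.
    read_fnet_xml <- function(u) {
      u %>%
        httr::GET(httr::config(ssl_verifypeer = FALSE, ssl_verifyhost = FALSE)) %>%
        httr::content(as = "text") %>%
        jsonlite::base64_dec() %>%
        rawToChar() %>%
        xml2::read_xml()
    }
    
    docs <- lapply(urls, read_fnet_xml)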