http://snomedct.t3as.org/ is a web service that will analyse English clinical text and report any concepts it can detect. For example, given "I have a headache" it will identify "headache" as a Symptom.
What I would like to do is send the sentence to the web service through R, and get the resulting table back from the web page into R for further analysis.
If we take their example curl command line:
curl -s --request POST \
-H "Content-Type: application/x-www-form-urlencoded" \
--data-urlencode "The patient had a stroke." \
http://snomedct.t3as.org/snomed-coder-web/rest/v1.0/snomedctCodes
that can be translated to httr pretty easily.
The -s means "silent" (no progress meter or error messages), so we don't really have to translate that.
Each -H adds a header to the request. This particular Content-Type header can be handled better with the encode parameter to httr::POST().
The --data-urlencode parameter says to URL-encode that string and put it in the body of the request.
Finally, the URL is the resource to call.
library(httr)

# the sentence goes in the request body, mirroring curl's --data-urlencode
result <- POST("http://snomedct.t3as.org/snomed-coder-web/rest/v1.0/snomedctCodes",
               body="The patient had a stroke.",
               encode="form")
Since you don't do this regularly, you can wrap the POST call with with_verbose() to see what's going on (look that up in the httr docs).
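For example (a quick sketch, reusing the same call as above):

with_verbose(
  POST("http://snomedct.t3as.org/snomed-coder-web/rest/v1.0/snomedctCodes",
       body="The patient had a stroke.",
       encode="form")
)

That prints the underlying HTTP conversation to the console as the call happens.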
There are a ton of nuances one should technically attend to after this (like checking the HTTP status code with stop_for_status(), warn_for_status() or even just status_code()), but for simplicity let's assume the call works. (This one is their example, so it does work and returns a 200 HTTP status code, which is A Good Thing.)
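If you do want to be careful, those checks are one-liners on the result object from above:

stop_for_status(result)  # raises an R error if the HTTP status indicates failure (>= 400)
warn_for_status(result)  # same check, but only issues a warning
status_code(result)      # just returns the numeric status code, e.g. 200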
By default, that web service returns JSON, so we need to convert it to an R object. While httr has built-in parsing, I like to use the jsonlite package to process the result:
dat <- jsonlite::fromJSON(content(result, as="text"), flatten=TRUE)
The fromJSON() function takes a few parameters that are intended to help shape JSON into a reasonable R data structure (many APIs return horrible JSON and/or XML). This API would fit into the "horrible" category. The data in dat is pretty gnarly and further decoding of it would be a separate SO question.
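If you want to start poking at it anyway, a reasonable first step is just to inspect the structure (what you see will depend on what the service returned for your input):

str(dat, max.level=2)  # peek at the top couple of levels of the parsed result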