Tags: httpclient, common-lisp

Why does Dexador work for some URLs and not for others?


I am using Dexador while coding in Common Lisp with SBCL and SLIME, on a PC running NixOS as the operating system.

I do not understand this library's behavior. Its documentation provides the following example:

(dex:get "http://lisp.org/")

When I try it, I get the following error:

CL-USER> (dex:get "http://lisp.org/")

SSL verify error: 10 X509_V_ERR_CERT_HAS_EXPIRED

Paradoxically, it works for my personal blog:

CL-USER> (dex:get "http://www.pdelfino.com.br")
"<!DOCTYPE html>
<link rel=\"stylesheet\" href=\"https://use.fontawesome.com/releases/v5.8.1/css/all.css\"
      integrity=\"sha384-50oBUHEmvpQ+1lW4y57PTFmhCaXp0ML5d60M1M7uH2+nqUivzIebhndOJK28anvf\" crossorigin=\"anonymous\">
<html lang=\"pt\">

[... omitted]

It also does not work for Wikipedia:

CL-USER> (dex:get "https://en.wikipedia.org/wiki/Main_Page")

Why is this happening?

Also, is there an alternative that works for all kinds of websites (both http and https schemes)?


Solution

  • The error says CERT_HAS_EXPIRED, which means that the SSL certificate, as seen by your client, has expired. However, this might be a configuration problem on your computer (a missing or outdated CA certificate bundle).

    There are alternative HTTP client libraries, e.g. Drakma, but the SSL problem is most likely not related to the library you use (see the sketch below for a quick way to check).
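
    To confirm that the problem is the certificate store rather than the library, a quick REPL experiment can help. This is a minimal sketch, not taken from the question: it assumes Dexador's :insecure keyword argument and Drakma are available via Quicklisp, and that NixOS exposes the CA bundle through the SSL_CERT_FILE environment variable; adjust names and paths to your setup.

    ;; Load both clients (assumes Quicklisp is set up).
    (ql:quickload '(:dexador :drakma))

    ;; 1. Check which CA bundle (if any) OpenSSL is being pointed at.
    ;;    On NixOS this is commonly controlled via SSL_CERT_FILE; if it
    ;;    is NIL or points at a stale bundle, verification of otherwise
    ;;    valid certificates can fail with CERT_HAS_EXPIRED.
    (uiop:getenv "SSL_CERT_FILE")

    ;; 2. Retry the failing request with certificate verification disabled.
    ;;    If this succeeds, the problem is the certificate chain / local
    ;;    CA bundle, not Dexador itself. Use :insecure only for testing.
    (dex:get "https://en.wikipedia.org/wiki/Main_Page" :insecure t)

    ;; 3. Compare with Drakma. It sits on the same cl+ssl/OpenSSL stack,
    ;;    so it will typically fail or succeed for the same reasons.
    (drakma:http-request "https://en.wikipedia.org/wiki/Main_Page")

    If the :insecure request succeeds while the normal one fails, updating the system CA bundle (on NixOS, typically provided by the cacert package) should make verified requests work again. Note that :insecure disables certificate checking entirely, so it should not be left in production code.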