Tags: java, javadoc

Obtain and download Javadoc (JDK API documentation) to a local file for offline reading


When writing Java code, I refer extensively to the Javadoc—that is, the Java® Platform, Standard Edition & Java Development Kit Version x API Specification. I know how to read it online from the website, but I would like to download a copy to my computer so that I can read it offline when no Internet connection is available.

How can I download the Javadoc (JDK documentation) from an online site to a local file?

The online docs I am using tend to reject clients such as Eclipse, making work difficult, so I need to pull them onto my machine and attach them to my library JAR.


Solution

    1. First, make sure the site doesn't already offer a download in ZIP form or similar.

    2. Then, make sure you are actually allowed to do this (this may depend on where you live, and on any conditions stated on the website you want to pull the docs from).

    3. Then, have a look at the Wget tool. It is part of the GNU project and thus included in most Linux distributions, and it is also available for Windows and macOS.

    Something like this works for me:

    wget --no-parent --recursive --level inf --page-requisites --wait=1 \
       https://epaul.github.io/jsch-documentation/simple.javadoc/
    

    (The backslash at the end of the first line escapes the line break; you can also type the whole command on a single line without it.)

    Look up what each option does in the manual before trying this.

    If you want to do this repeatedly, look into the --mirror option. For downloading other websites, --convert-links might also be useful, but I found it is not needed for Javadocs, which usually use correct absolute and relative links.

    This downloads many copies of the same index.html file under names with appended ?... query strings (one for each FRAMES link on a page). You can remove these files after downloading by adding the --reject 'index.html\?*' option, but they will still be downloaded first (and scanned for recursive links). I have not yet found a way to avoid downloading them at all. (See this related question on Server Fault.)

    Maybe adding the right recursion level would help here (I didn't try).
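    Putting those two options together, a repeatable run might look like the sketch below (same example URL as above; look each option up in the Wget manual before adapting it to your site):

    ```shell
    # Sketch: repeatable mirror of the example Javadoc site.
    #   --mirror           shorthand for -r -N -l inf --no-remove-listing
    #   --no-parent        don't ascend above the starting directory
    #   --page-requisites  also fetch the CSS/images needed to render pages
    #   --wait=1           pause one second between requests, to be polite
    #   --reject ...       delete the duplicated index.html?... files afterwards
    wget --mirror --no-parent --page-requisites --wait=1 \
         --reject 'index.html\?*' \
         https://epaul.github.io/jsch-documentation/simple.javadoc/
    ```

    On a repeat run, --mirror's timestamping (-N) re-downloads only pages that changed on the server.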

    After downloading, you might want to zip the resulting directory to take less disk space. Use the zip tool of your choice for this.
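    For example, with tar (a sketch; the directory name is what wget creates for the example URL above, so adjust it to your own download, and swap in zip if you prefer a .zip archive):

    ```shell
    # Compress the downloaded tree into a single archive. wget names the
    # top-level output directory after the host, e.g. "epaul.github.io"
    # for the example URL above -- adjust this to your download location.
    docs_dir="epaul.github.io"
    if [ -d "$docs_dir" ]; then
        tar -czf javadoc-offline.tar.gz "$docs_dir"
    fi
    ```

    You can later read the docs straight from the unpacked archive by opening its index.html in a browser.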