I can't get the Javadoc for the Spark core library to work in Eclipse on Windows 10. I have no JRE defined under preferences. I load the Javadoc the way I typically do: right-click the JAR in Eclipse's Project Explorer -> Maven -> Download JavaDoc. See attached image. How do I fix this?
The stack trace is:
Java Model Exception: Java Model Status [Unknown javadoc format for JavaRDD {key=Lorg/apache/spark/api/java/JavaRDD<Ljava/lang/String;>;} [in JavaRDD.class [in org.apache.spark.api.java [in C:\Users\karln\.m2\repository\org\apache\spark\spark-core_2.11\2.2.1\spark-core_2.11-2.2.1.jar]]]]
at org.eclipse.jdt.internal.core.JavadocContents.getTypeDoc(JavadocContents.java:81)
at org.eclipse.jdt.internal.core.BinaryType.getAttachedJavadoc(BinaryType.java:999)
at org.eclipse.jdt.internal.ui.text.javadoc.JavadocContentAccess2.getHTMLContent(JavadocContentAccess2.java:538)
at org.eclipse.jdt.internal.ui.text.java.hover.JavadocHover.getHoverInfo(JavadocHover.java:757)
at org.eclipse.jdt.internal.ui.text.java.hover.JavadocHover.internalGetHoverInfo(JavadocHover.java:675)
at org.eclipse.jdt.internal.ui.text.java.hover.JavadocHover.getHoverInfo2(JavadocHover.java:667)
at org.eclipse.jdt.internal.ui.text.java.hover.BestMatchHover.getHoverInfo2(BestMatchHover.java:164)
at org.eclipse.jdt.internal.ui.text.java.hover.BestMatchHover.getHoverInfo2(BestMatchHover.java:130)
at org.eclipse.jdt.internal.ui.text.java.hover.JavaEditorTextHoverProxy.getHoverInfo2(JavaEditorTextHoverProxy.java:86)
at org.eclipse.jface.text.TextViewerHoverManager$4.run(TextViewerHoverManager.java:166)
And the Eclipse version:
Eclipse Java EE IDE for Web Developers.
Version: Oxygen.2 Release (4.7.2)
Build id: 20171218-0600
EDIT: Added Error Detail Screenshot.
The Spark Project Core 2.2.1 Javadoc JAR spark-core_2.11-2.2.1-javadoc.jar does not contain Javadoc, but only HTML documentation that differs in content and structure from Javadoc. Apparently, that HTML documentation was generated from the Scala source code (see spark-core_2.11-2.2.1-sources.jar).
Thanks for reporting it to Eclipse. In my view, this is not an Eclipse bug but is caused by an invalid Spark Core Javadoc JAR. Unfortunately, the Eclipse error message is somewhat misleading.
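You can also check the JAR's contents outside Eclipse. Archives produced by the javadoc tool normally contain a package-list index file (or element-list since JDK 10), which Scaladoc output does not ship. A minimal sketch of that heuristic; the local repository path is an assumption, adjust it to your machine:

```java
import java.io.File;
import java.util.jar.JarFile;

public class JavadocJarCheck {

    // Heuristic: javadoc-generated archives ship a "package-list"
    // (or "element-list" since JDK 10) index; Scaladoc output lacks both.
    static boolean looksLikeJavadoc(String jarPath) throws Exception {
        try (JarFile jar = new JarFile(jarPath)) {
            return jar.getEntry("package-list") != null
                || jar.getEntry("element-list") != null;
        }
    }

    public static void main(String[] args) throws Exception {
        // Assumed default local Maven repository location.
        String path = System.getProperty("user.home")
            + "/.m2/repository/org/apache/spark/spark-core_2.11/2.2.1/"
            + "spark-core_2.11-2.2.1-javadoc.jar";
        if (new File(path).exists()) {
            System.out.println(looksLikeJavadoc(path)
                ? "looks like real Javadoc"
                : "no package-list/element-list - likely not Javadoc");
        }
    }
}
```

If this prints the second message for the Spark javadoc JAR, that matches the "Unknown javadoc format" failure above.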
You can validate the attached Javadoc of a JAR as follows: in the Project Explorer, right-click the JAR (here spark-core_2.11-2.2.1.jar) and choose Properties; on the Javadoc Location page, click Validate.