Question: How to submit jars stored in AWS S3 to Local Spark?
I'm looking for a way to submit jars stored in S3 to a local Spark installation (not EMR). However, everything I've found while investigating this only covers submitting to an EMR Spark cluster.
The closest answer to what I'm looking for is in the link. However, that answer seems to work only with EMR. (Doesn't it? I may be wrong.)
The Spark documentation here says:
Spark uses the following URL scheme to allow different strategies for disseminating jars:
- file: - Absolute paths and file:/ URIs are served by the driver’s HTTP file server, and every executor pulls the file from the driver HTTP server.
- hdfs:, http:, https:, ftp: - these pull down files and JARs from the URI as expected
- local: - a URI starting with local:/ is expected to exist as a local file on each worker node.
So, as long as your S3 bucket is public and the jar is reachable via a URI, you can specify one of the above-mentioned schemes, for example an https: URL pointing directly at the object, as in the sketch below.
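A minimal sketch of that https: approach (the bucket name, region, key, and main class here are all hypothetical placeholders; this only works if the object is publicly readable):

```
# Hypothetical bucket/key and main class; Spark fetches http:/https: URIs
# directly, so the object must be publicly readable for this to work.
spark-submit \
  --class com.example.Main \
  --master "local[*]" \
  https://my-bucket.s3.us-east-1.amazonaws.com/jars/my-app.jar
```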
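If the bucket is private, the https: route won't help. A commonly used alternative, which is an assumption on my part rather than something covered by the quoted docs, is the s3a: scheme provided by the hadoop-aws connector:

```
# Assumes a Spark build on Hadoop 3.x; pick the hadoop-aws version that
# matches your Hadoop version. Bucket, key, and credentials are placeholders.
spark-submit \
  --class com.example.Main \
  --master "local[*]" \
  --packages org.apache.hadoop:hadoop-aws:3.3.4 \
  --conf spark.hadoop.fs.s3a.access.key=YOUR_ACCESS_KEY \
  --conf spark.hadoop.fs.s3a.secret.key=YOUR_SECRET_KEY \
  s3a://my-bucket/jars/my-app.jar
```

Depending on the Spark version, --packages may not be resolved early enough for spark-submit itself to fetch the application jar; if you see an error like "No FileSystem for scheme: s3a", the usual fallback is to put the hadoop-aws jar (and its matching AWS SDK bundle) into $SPARK_HOME/jars instead.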