I am quite new to Apache Spark. I want to build a RESTful service that receives a large file and processes it with Apache Spark in near real time at the back end. I want to implement this in Java.
How can I package this application for distribution? Say I have a server (e.g. Tomcat): should I pack Spark inside the web service?
How can I run the Spark cluster programmatically, like a service that is always up or brought up and down on demand?
Thanks
Use Akka HTTP to implement the REST services. A complete working example of how to integrate Akka and Spark is the killrweather project.
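To illustrate the overall pattern (a REST endpoint that accepts a file upload and hands it off to a back-end processor), here is a minimal, dependency-free sketch using the JDK's built-in `com.sun.net.httpserver.HttpServer`. It is not Akka HTTP and it does not start Spark; the `processUpload` method is a hypothetical stand-in for the point where you would submit the data to Spark (for example via `SparkLauncher` or a long-running `SparkSession`):

```java
import com.sun.net.httpserver.HttpExchange;
import com.sun.net.httpserver.HttpServer;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class FileIngestServer {

    // Hypothetical back-end hook: in a real deployment this would hand the
    // uploaded bytes to Spark instead of just counting lines locally.
    static long processUpload(InputStream body) throws IOException {
        long lines = 0;
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(body, StandardCharsets.UTF_8))) {
            while (r.readLine() != null) {
                lines++;
            }
        }
        return lines;
    }

    // Starts an HTTP server with a single POST endpoint at /ingest.
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        server.createContext("/ingest", (HttpExchange ex) -> {
            long lines = processUpload(ex.getRequestBody());
            byte[] resp = ("processed " + lines + " lines\n")
                    .getBytes(StandardCharsets.UTF_8);
            ex.sendResponseHeaders(200, resp.length);
            ex.getResponseBody().write(resp);
            ex.close();
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws IOException {
        start(8080);
        System.out.println("listening on http://localhost:8080/ingest");
    }
}
```

The same shape carries over to Akka HTTP: a route accepts the upload, and the handler forwards the data to Spark. Keeping the web tier and the Spark job in separate processes (rather than packing Spark inside the servlet container) is the pattern killrweather follows.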