I have written a Flink job that reads data from Kafka and writes it to HDFS files in ORC format for Hive (it uses 20 executors). I need to run a simple job every hour that adds a partition to the Hive table. Is it possible to run this simple partition-addition job on the job manager?
No. The job manager plays only a coordinating/supervisory role; you must have at least one task manager to actually execute a job.
However, you can run a simple job on a "mini-cluster" that runs entirely in the same JVM as the client/application. For the DataStream API, what you want is a LocalStreamEnvironment, created by:
final StreamExecutionEnvironment env = StreamExecutionEnvironment.createLocalEnvironment();
or a LocalEnvironment, if you are using the DataSet (batch) API:
ExecutionEnvironment env = ExecutionEnvironment.createLocalEnvironment();
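To make that concrete, here is a minimal sketch of such an hourly job running on the in-JVM mini-cluster. The class name, the partition spec, and the idea of issuing an ALTER TABLE ... ADD PARTITION statement via Hive JDBC are all illustrative assumptions, not your actual job:

import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class AddPartitionLocally {
    public static void main(String[] args) throws Exception {
        // Mini-cluster inside this JVM: no separate job/task manager deployment needed.
        final StreamExecutionEnvironment env =
                StreamExecutionEnvironment.createLocalEnvironment();

        env.fromElements("dt=2019-01-01/hr=00")  // hypothetical partition spec
           .map(spec -> {
               // Assumption: this is where you would issue
               // "ALTER TABLE my_table ADD PARTITION (...)" against the
               // Hive metastore, e.g. over Hive JDBC. Sketched only.
               System.out.println("Would add partition: " + spec);
               return spec;
           })
           .returns(Types.STRING)  // help Flink's type extraction with the lambda
           .print();

        env.execute("hourly-add-partition");
    }
}

Note that this still runs the job's tasks in a local, in-process task executor, not on your cluster's job manager, and scheduling it every hour (e.g. via cron) is up to you.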