We need to pass 4 parameters from AWS Lambda to an AWS Glue job while triggering it.
response = client.start_job_run(
    JobName='my_test_Job',
    Arguments={
        '--yr_partition_val': '2017',
        '--mon_partition_val': '05',
        '--date_partition_val': '25',
        '--hour_partition_val': '07'})  # argument values must be strings
The Glue job needs to read these 4 parameters in its PySpark code to proceed further.
I have tried the following in the Glue script to pick up the parameters:
import sys
from awsglue.utils import getResolvedOptions
args = getResolvedOptions(sys.argv,
                          ['JOB_NAME',
                           'yr_partition_val',
                           'mon_partition_val',
                           'date_partition_val',
                           'hour_partition_val'])
but got this error:
self.error(_('argument %s is required') % name)
awsglue.utils.GlueArgumentError: argument --JobName is required
Can someone help me out?
AWS says '--JOB_NAME' is internal to Glue and should not be set. Also, the argument names are case-sensitive.
When calling from the Glue API, Name='job_name_value' needs to be specified as the first argument; when calling from the Lambda API (boto3), JobName='job_name_value' needs to be specified as the first argument.
See the example below:
import os
import boto3

gl = boto3.client('glue')  # Glue client used to start the job run

# Partition values are passed as strings
current_year_full = '2019'
current_month = '01'
current_day = '21'
current_hour = '01'
int_bucket_name = 'datascience-ca-input'
glue_job_name = os.getenv("job_name")  # Glue job name read from the Lambda environment

response = gl.start_job_run(
    JobName=glue_job_name,
    Arguments={
        '--intermediate_bucket_name': int_bucket_name,
        '--year_partition_value': current_year_full,
        '--month_partition_value': current_month,
        '--date_partition_value': current_day,
        '--hour_partition_value': current_hour})
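On the Glue side, the job can then read these values with getResolvedOptions, using the same names as the Arguments keys, without the leading -- and matching case exactly. A minimal sketch of the corresponding Glue script, assuming the argument names from the example above:

import sys
from awsglue.utils import getResolvedOptions

# Names must match the Arguments keys exactly (case-sensitive), without the '--' prefix.
# 'JOB_NAME' is supplied by Glue itself and is not passed in Arguments.
args = getResolvedOptions(sys.argv,
                          ['JOB_NAME',
                           'intermediate_bucket_name',
                           'year_partition_value',
                           'month_partition_value',
                           'date_partition_value',
                           'hour_partition_value'])

# All values arrive as strings, e.g. '2019', '01'
print(args['year_partition_value'], args['hour_partition_value'])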