I am creating a batch prediction from a pipeline. According to the documentation, the sync=False
argument of Model.batch_predict
submits the batch prediction asynchronously, meaning (in my understanding) that the pipeline should not wait until the batch prediction completes. Instead, the pipeline step keeps polling the batch prediction's status and completes only after the batch prediction itself completes.
Sample code:
from google.cloud import aiplatform

# model is an aiplatform.Model obtained earlier in the pipeline step
batch_prediction_job = model.batch_predict(
    gcs_source=gcs_source,
    gcs_destination_prefix=gcs_destination,
    machine_type='n1-standard-4',
    instances_format='csv',
    sync=False,
)
Pipeline logs:
According to the answer on the python-aiplatform GitHub, asynchronous submission of the batch prediction does not mean that the pipeline will not wait for its completion. It only means that more code can be executed after the batch prediction has been submitted; the pipeline will still wait for the batch prediction to complete.
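To illustrate that answer, here is a minimal sketch of the pattern, assuming the gcs_source, gcs_destination and model variables from the snippet above; the explicit wait() call is one way for the caller to block on the job, and a pipeline component ends up blocking in a similar fashion before it finishes:

# Sketch: sync=False lets other code run between submission and completion.
from google.cloud import aiplatform

batch_prediction_job = model.batch_predict(
    gcs_source=gcs_source,
    gcs_destination_prefix=gcs_destination,
    machine_type='n1-standard-4',
    instances_format='csv',
    sync=False,  # returns immediately with a job object instead of blocking here
)

# ... any other work can happen here while the batch prediction job runs ...

batch_prediction_job.wait()        # blocks until the batch prediction completes
print(batch_prediction_job.state)  # e.g. JOB_STATE_SUCCEEDED

So sync=False changes where the waiting happens (you decide when to block), not whether the pipeline step ultimately waits for the batch prediction job.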