I'm using Python's simple-salesforce package to execute a bulk upload. I'm seeing some inconsistent response errors that I believe can be resolved by changing the 'concurrencyMode' to 'Serial'.
I don't see that option in the documentation. Does anyone know if it's possible to update the source code to add that parameter to the request? I tried updating the headers in api.py and bulk.py with no luck.
Thanks
simple-salesforce's bulk methods use Salesforce Bulk API 1.0 by POSTing to https://<salesforce_instance>/services/async/<api_version>/job. In bulk.py, the job is created like so:
```python
def _create_job(self, operation, object_name, external_id_field=None):
    """ Create a bulk job

    Arguments:

    * operation -- Bulk operation to be performed by job
    * object_name -- SF object
    * external_id_field -- unique identifier field for upsert operations
    """
    payload = {
        'operation': operation,
        'object': object_name,
        'contentType': 'JSON'
    }
```
This corresponds to the following jobInfo request (shown here in its XML form):
```xml
<jobInfo
    xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>...</operation>
  <object>...</object>
  <contentType>JSON</contentType>
</jobInfo>
```
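For reference, the raw request behind this job creation can be sketched with the standard library alone. This is only an illustration of the wire format, not simple-salesforce's own code; the instance host, API version, and session id below are placeholder assumptions you would supply from your own org and login:

```python
import json
import urllib.request

def build_job_request(instance, api_version, session_id, operation, object_name):
    """Build (but do not send) a Bulk API 1.0 job-creation request."""
    url = f"https://{instance}/services/async/{api_version}/job"
    payload = {
        'operation': operation,
        'object': object_name,
        'contentType': 'JSON',
    }
    headers = {
        "X-SFDC-Session": session_id,       # Bulk API 1.0 auth header
        "Content-Type": "application/json",
    }
    return urllib.request.Request(url, data=json.dumps(payload).encode(),
                                  headers=headers, method="POST")

# Placeholder values (assumptions, not from the original post):
req = build_job_request("na1.salesforce.com", "52.0", "<session id>",
                        "insert", "Contact")
print(req.full_url)
```

Note that Bulk API 1.0 authenticates with the X-SFDC-Session header rather than the Authorization header used by the REST API.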
To explicitly request a serial job, you need to add a concurrencyMode element to the request. The jobInfo fragment should be:
```xml
<jobInfo
    xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <operation>...</operation>
  <object>...</object>
  <concurrencyMode>Serial</concurrencyMode>
  <contentType>JSON</contentType>
</jobInfo>
```
Change _create_job to add this extra element to the payload:
```python
def _create_job(self, operation, object_name, external_id_field=None):
    """ Create a serial bulk job

    Arguments:

    * operation -- Bulk operation to be performed by job
    * object_name -- SF object
    * external_id_field -- unique identifier field for upsert operations
    """
    payload = {
        'operation': operation,
        'object': object_name,
        'concurrencyMode': 'Serial',
        'contentType': 'JSON'
    }
```
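As a quick sanity check, you can confirm the modified payload serializes with the new field in place. A minimal sketch mirroring the edited payload above (the 'insert' and 'Contact' values are only illustrative):

```python
import json

def build_job_payload(operation, object_name):
    """Mirror the payload built by the edited _create_job above."""
    return {
        'operation': operation,
        'object': object_name,
        'concurrencyMode': 'Serial',   # one batch processed at a time
        'contentType': 'JSON',
    }

payload = build_job_payload('insert', 'Contact')
print(json.dumps(payload))
```

With concurrencyMode set to Serial, Salesforce processes the job's batches one at a time, which is the usual remedy for lock-contention errors during parallel bulk loads.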