Python Simple-Salesforce change 'concurrencyMode'
I'm performing a bulk upload using Python's simple-salesforce package. I'm seeing some intermittent response errors that I believe could be resolved by changing 'concurrencyMode' to 'Serial'.
I don't see that option in the documentation. Does anyone know whether the source code can be updated to add that parameter to the request? I've tried updating the headers in api.py and bulk.py without success.
Thanks
simple-salesforce's bulk methods use the Salesforce Bulk API 1.0 by POSTing to https://<salesforce_instance>/services/async/<api_version>/job. In bulk.py, the job is created like this:
def _create_job(self, operation, object_name, external_id_field=None):
    """ Create a bulk job

    Arguments:

    * operation -- Bulk operation to be performed by job
    * object_name -- SF object
    * external_id_field -- unique identifier field for upsert operations
    """
    payload = {
        'operation': operation,
        'object': object_name,
        'contentType': 'JSON'
    }
This produces the following XML payload:
<jobInfo
xmlns="http://www.force.com/2009/06/asyncapi/dataload">
<operation>...</operation>
<object>...</object>
<contentType>JSON</contentType>
</jobInfo>
To explicitly request a serial job, you need to add a concurrencyMode element to the request. The jobInfo fragment should be:
<jobInfo
xmlns="http://www.force.com/2009/06/asyncapi/dataload">
<operation>...</operation>
<object>...</object>
<concurrencyMode>Serial</concurrencyMode>
<contentType>JSON</contentType>
</jobInfo>
Change _create_job to include this extra element:
def _create_job(self, operation, object_name, external_id_field=None):
    """ Create a serial bulk job

    Arguments:

    * operation -- Bulk operation to be performed by job
    * object_name -- SF object
    * external_id_field -- unique identifier field for upsert operations
    """
    payload = {
        'operation': operation,
        'object': object_name,
        'concurrencyMode': 'Serial',
        'contentType': 'JSON'
    }