Salesforce Data Import Problem Due To Batch Size

Korhan Mulcar posted this 3 weeks ago

Hi,

We set up an import task that loads a CSV file from an FTP server into Salesforce. It runs, but it fails with an Apex CPU time limit error because it tries to import all the records at once, which we don't want. We need to divide the import into batches, just like in Data Loader. If you are familiar with Salesforce Data Loader, you can define a batch size there so that Salesforce runs post-commands and post-processing smoothly. I have several processes that run after a lead is created, so when Skyvia pushes all the data at once, those processes hit the CPU limit and the import errors out. I need to be able to define a batch size for import tasks. This is a crucial feature for us; please help me with it.

Regards.

Mariia Zaharova posted this 3 weeks ago

Hi!

Skyvia uses Bulk API to load data to Salesforce. Please refer to the API Calls for Loading Data to Salesforce section here.

When using the Bulk API, the maximum size of a batch is 5,000 records. When using the SOAP API for loading data, the maximum batch size per API call is 200 records.

Unfortunately, there is no way to change batch size in Skyvia.
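As a possible workaround outside Skyvia, you could pre-split the CSV on the FTP side so that each file stays small enough for your post-create processes to handle. Here is a minimal Python sketch; the file name `leads.csv` and the batch size of 200 are assumptions you would adjust to your setup:

```python
import csv

def split_csv(source, batch_size):
    """Split one CSV into numbered part files, repeating the header in each.

    Returns the number of part files written.
    """
    with open(source, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # keep the header row for every part file
        chunk, count = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == batch_size:
                count += 1
                _write_part(source, count, header, chunk)
                chunk = []
        if chunk:  # flush the final, possibly short, batch
            count += 1
            _write_part(source, count, header, chunk)
    return count

def _write_part(source, index, header, rows):
    # e.g. leads.csv -> leads_part001.csv, leads_part002.csv, ...
    name = source.replace(".csv", f"_part{index:03d}.csv")
    with open(name, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)

# Example: split into batches of 200, the SOAP API per-call limit.
# split_csv("leads.csv", 200)
```

Each part file could then be imported as a separate task run, so no single load exceeds what your Apex triggers can process within the CPU limit.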


Best regards,

Mariia
