Salesforce Data Import Problem Due To Batch Size

Korhan Mulcar posted this 01 March 2019

Hi,

We set up an import task that loads a CSV file from an FTP server into Salesforce. The task runs, but it fails with Apex CPU time limit errors because it tries to import all the records at once, which we don't want. We need to divide the import into batches, just like in Data Loader. If you are familiar with Salesforce Data Loader, you can define a batch size there so that Salesforce runs post-commands and post-processes smoothly. I have several processes that run after a lead is created, so when Skyvia pushes all the data at once, it errors out. I need to be able to define a batch size for import tasks. This is a crucial feature; please help me with it.

Regards.

Mariia Zaharova posted this 01 March 2019

Hi!

Skyvia uses the Bulk API to load data to Salesforce. Please refer to the API Calls for Loading Data to Salesforce section here.

When using the Bulk API, the maximum size of a batch is 5000 records. When using the SOAP API for loading data, the maximum batch size per API call is 200 records.

Unfortunately, there is no way to change batch size in Skyvia.

 

Best regards,

Mariia
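Editor's note: since the batch size cannot be configured in Skyvia itself, one generic workaround (not a Skyvia feature, just a pre-processing step on the FTP side) is to split the source CSV into smaller files before the import task picks them up, so each run stays under the Apex CPU limit. A minimal Python sketch, assuming the CSV has a single header row; the function name and batch size are illustrative:

```python
import csv
import io

def split_csv(text, batch_size):
    """Split CSV text into smaller CSV strings of at most batch_size
    data rows each, repeating the header row in every chunk."""
    rows = list(csv.reader(io.StringIO(text)))
    header, data = rows[0], rows[1:]
    chunks = []
    for i in range(0, len(data), batch_size):
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(header)           # each chunk keeps the header
        writer.writerows(data[i:i + batch_size])
        chunks.append(buf.getvalue())
    return chunks
```

Each resulting chunk can then be uploaded to the FTP server as its own file and imported as a separate task run, e.g. in batches of 50 as requested above.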

Korhan Mulcar posted this 25 March 2019

Hi,

You misunderstood. We are importing data from a CSV file on FTP, and Skyvia imports all the data at once. We need to be able to divide this into batches. When it imports everything at once, it causes Apex CPU errors and process time limit errors. We need a configuration step where I can set a batch size such as 20 or 50, so that Skyvia imports the data in batches of that size.

Thanks

Mariia Zaharova posted this 25 March 2019

It is not possible to change the batch size in Skyvia. We will definitely inform you if this feature is implemented.

 

Best regards,

Mariia
