Salesforce backup performance

  • 30 Views
  • Last Post 06 September 2018
IT-Kleve Spectro posted this 28 August 2018

We are experiencing poor backup performance. After 17 hours the Salesforce backup job is still running with no progress indication, and the schedule is again listed as disabled. We need to back up 35 GB, and the idea was to run this as a daily backup. Since there is no option for incremental/differential backups (see feature request), the idea of using Skyvia for daily backups does not work out. Are there any options to speed up the data transfer?

Mariia Zaharova posted this 29 August 2018

The time a backup takes depends on many factors: the volume of data to back up, the backed-up data source, the load on that data source, etc. Skyvia works via the Salesforce API and uses the SOAP API for data retrieval. Usually one API call retrieves 2000 records. If an entity has a Long Text column, one API call retrieves only 200 records. The Attachment entity is processed in a special way: one API call is used for each Attachment record. Your backup package includes many tables and backs up a large amount of data on each run, so it can take a long time to complete.
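For a rough feel of what those batch sizes mean, here is a minimal sketch (in Python) that estimates the API call count from the per-call record counts quoted above; the table names and row counts are made-up examples, not your actual data:

```python
import math

def estimate_api_calls(entity, record_count, has_long_text=False):
    """Rough SOAP API call estimate per the batch sizes described above:
    2000 records/call normally, 200 if the entity has a Long Text column,
    and 1 call per record for the Attachment entity."""
    if entity == "Attachment":
        return record_count
    batch = 200 if has_long_text else 2000
    return math.ceil(record_count / batch)

# Hypothetical table sizes, for illustration only
tables = [
    ("Account", 50_000, False),
    ("Case", 120_000, True),       # has a Long Text field
    ("Attachment", 8_000, False),  # one call per record
]
total = sum(estimate_api_calls(e, n, lt) for e, n, lt in tables)
print(total)  # 25 + 600 + 8000 = 8625
```

As the Attachment line shows, even a modest number of attachments can dominate the call count.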

As we can see, the last run completed successfully. At the moment, there is no way to speed up your backup. However, we will definitely contact you if we have any suggestions or solutions for your case.

Best regards,

Mariia

IT-Kleve Spectro posted this 03 September 2018

Do you have other customers doing Salesforce backups with Skyvia? If so, how do they handle this? Our Salesforce org with about 100 users is not what I would consider a large SF instance. Is there anything that can be changed on the Salesforce end to speed up processing of API requests? If I start a full backup directly in Salesforce, it completes within 30-60 minutes.

Can you see how many API calls were used during a complete backup? With that I could check whether we are running into the daily API request limit. That is one reason I can think of that would explain the time it takes to complete.

Mariia Zaharova posted this 05 September 2018

We have enabled Bulk Query option for your backup package. This option should speed up the process of executing the package. Please try running this package and notify us about the results.

As for API calls, we explained above how Skyvia works via the Salesforce API and how many API calls are used for data retrieval. You can check your API usage directly in your Salesforce account.
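One way to monitor this programmatically: Salesforce's REST API reports daily API usage in the `Sforce-Limit-Info` response header, in the form `api-usage=used/limit`. Below is a small sketch of parsing that header; the numbers are invented for illustration:

```python
def parse_api_usage(header_value):
    """Parse a Salesforce Sforce-Limit-Info header value,
    e.g. 'api-usage=18/15000', into (calls_used, daily_limit)."""
    entries = dict(item.split("=") for item in header_value.split(";"))
    used, limit = (int(x) for x in entries["api-usage"].strip().split("/"))
    return used, limit

# Made-up header value for illustration
used, limit = parse_api_usage("api-usage=8625/15000")
print(f"{used} of {limit} daily API requests used")
```

Comparing this figure before and after a full backup run would show how many calls the backup itself consumed.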

 

Looking forward to your reply with the results.

IT-Kleve Spectro posted this 05 September 2018

Thanks Mariia! I will let you know the results of the next backup.

IT-Kleve Spectro posted this 05 September 2018

I triggered a manual backup. Instead of 26 hours, it now completed within 65 minutes, so enabling Bulk Query had a huge impact on backup performance. However, 34 tables were now skipped due to an error. In the past we always had 18 skipped tables. I need to check what happened with the additional tables.

IT-Kleve Spectro posted this 05 September 2018

For the additional skipped tables I get the following message: Entity '<table_name>' is not supported by the Bulk API. According to SF support, only tables accessible via the SOAP API are enabled for use through the Bulk API.

So I assume it would make sense to create two jobs, each with its own connection. One will use Bulk Query, and the other job (connection) will take care of the remaining tables that do not support the Bulk API.

Can you confirm that this would be a good approach? Thanks for your help!
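The two-job split described above amounts to a simple partition of the table list, driven by the "not supported by the Bulk API" errors collected from the backup log. A minimal sketch; the object names and the unsupported set here are hypothetical examples:

```python
def split_packages(tables, bulk_unsupported):
    """Partition tables into a Bulk-Query package and a non-bulk package,
    based on the 'not supported by the Bulk API' errors from the log."""
    bulk = [t for t in tables if t not in bulk_unsupported]
    no_bulk = [t for t in tables if t in bulk_unsupported]
    return bulk, no_bulk

# Hypothetical object names, for illustration only
tables = ["Account", "Contact", "CaseStatus", "AcceptedEventRelation"]
unsupported = {"CaseStatus", "AcceptedEventRelation"}  # from the skipped-table errors
bulk_pkg, nobulk_pkg = split_packages(tables, unsupported)
print(bulk_pkg)    # ['Account', 'Contact']
print(nobulk_pkg)  # ['CaseStatus', 'AcceptedEventRelation']
```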

Mariia Zaharova posted this 05 September 2018

Yes, we have also checked the logs of your current backup. These tables do not support the Bulk API. So, yes, we confirm that having two backup packages would be a good approach. You can exclude such tables from this package and create an additional one. The new package will not use Bulk Query, because we have enabled it only for the package "SPECTRO Salesforce Backup - bulk" #53037.

Best regards,

Mariia

IT-Kleve Spectro posted this 05 September 2018

The backup of attachments was skipped due to an "Internal Server error: Retried more than 15 times" when running in bulk mode. Was this a coincidence, or do we need to add this table to the non-bulk job?

Mariia Zaharova posted this 06 September 2018

When a bulk query is processed, Salesforce attempts to execute the query within the standard 2-minute timeout limit. If the attempt fails, Skyvia retries the data retrieval several times. Most likely, you have a lot of data in the Attachment table, and thus the attempts to read this data fail. Also, as I can see, you have included this table in the last run of the non-bulk package, and that package is still running. NOTE: the Attachment entity is processed in a special way - one API call is used for each Attachment record.

Thus, most likely, the Attachment table is the main reason for the performance issues in non-bulk mode.
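The retry behaviour described above (a query times out, is retried, and the job gives up with a "Retried more than 15 times" error) can be sketched roughly as follows. The retry count matches the error message, but the exact Skyvia logic and any backoff delay are assumptions:

```python
import time

def query_with_retries(run_query, max_retries=15, delay_seconds=0):
    """Call run_query until it succeeds, retrying on timeout up to
    max_retries times before giving up (mirroring the error above)."""
    last_error = None
    for _attempt in range(max_retries):
        try:
            return run_query()
        except TimeoutError as exc:  # e.g. a long-running bulk query timing out
            last_error = exc
            time.sleep(delay_seconds)
    raise RuntimeError(f"Retried more than {max_retries} times") from last_error
```

With a very large Attachment table, every attempt can hit the timeout, so the loop exhausts all retries and the table is skipped with exactly this kind of error.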

 

Best regards, 

Mariia
