(400) Bad Request Error

  • 19 Views
  • Last Post 08 July 2020
0 votes
Ibha Gupta posted this 30 June 2020

Hi,

I am running upsert and update mappings in Skyvia. My first question is: does Skyvia support running multiple packages at a time? Second question: the same mapping ran fine yesterday, but today it is failing with a (400) Bad Request error.

Package #39975757

Package #39976521

 

Regards,

Ibha Gupta

 

0 votes
Mariia Zaharova posted this 30 June 2020

 Hello Ibha,

 

Skyvia supports running several packages at the same time.

We have studied the log files of your packages. Salesforce returns the error: "ApiBatchItems Limit exceeded."

 

You can find your batch statistics in the Salesforce admin panel. For more information, please refer to https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_concepts_limits.htm
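As an aside, the daily allowances can also be inspected programmatically via Salesforce's REST limits resource (GET /services/data/vXX.0/limits). Below is a minimal sketch of reading such a response; the payload is a made-up example, and the exact key names (e.g. DailyBulkApiBatches) and numbers depend on your org and API version:

```python
# Hypothetical example payload from the Salesforce REST "limits" resource.
# Key names and numbers are illustrative, not from a real org.
limits = {
    "DailyApiRequests": {"Max": 15000, "Remaining": 3200},
    "DailyBulkApiBatches": {"Max": 10000, "Remaining": 0},
}

def remaining(limits, key):
    """Return (remaining, max) for one limit entry, or None if it is absent."""
    entry = limits.get(key)
    if entry is None:
        return None
    return entry["Remaining"], entry["Max"]

rem, cap = remaining(limits, "DailyBulkApiBatches")
if rem == 0:
    print(f"Bulk batch limit exhausted (0 of {cap} left); retry after the reset.")
```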

 

Thus, you will be able to run your package the next day, when the limit is reset.

 

Please also check https://docs.skyvia.com/connectors/cloud-sources/Salesforce_API_and_API_Calls.html

 

Best regards,

Mariia

0 votes
Ibha Gupta posted this 02 July 2020

Thanks for the support and the answers to my questions. The data loading worked after 24 hours. This limit can be checked under the Bulk Data API in Salesforce: it allows 10,000 API calls per day (24 hours) and shows the API calls consumed per day.

0 votes
Ibha Gupta posted this 03 July 2020

Again, I got this issue with one of my files, which had only 28k records; it consumed the whole limit for the day and stopped midway after reaching the 10,000 API call limit.

I checked the API calls made via Skyvia, and there are no long text fields or many lookups (only one lookup in the mapping), so how can it consume the 10,000 API call limit?

Could you please check whether something is wrong in job #40133155? I cannot work with this limitation, and my file also runs very slowly. I am reading the file from an SFTP server, but I don't think the SFTP location affects speed or API call limits.

Please advise; I have many more files like this to upload to Salesforce.

 

Regards,

Ibha Gupta

1 vote
Mariia Zaharova posted this 06 July 2020

 Hello Ibha,

 

We have checked your package #100397. This package uses one lookup mapping, with Account as the lookup table. Please try using a cached lookup: when editing the lookup mapping, click Options and select the "Use cache" checkbox in the lookup options. For more information, please refer to https://docs.skyvia.com/data-integration/common-package-features/mapping/lookup-mapping-target-lookup-and-source-lookup.html

 

In this case, when the package is run, the data from the lookup table is selected into a cache, and the lookup is performed over the cached data. This reduces both the number of API calls and the time needed to perform the task.
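The effect of the cache can be illustrated with a toy sketch (the names here are hypothetical, and this is not Skyvia's actual implementation): without a cache, each source row triggers its own lookup query; with a cache, the lookup table is fetched once and rows are resolved locally.

```python
# Toy illustration of cached vs. uncached lookups; "query" stands in for a
# Salesforce API call. Names and data are invented for illustration only.
calls = {"n": 0}

ACCOUNTS = {"Acme": "001A", "Globex": "001B"}  # lookup table: name -> Id

def query_account_id(name):
    calls["n"] += 1                 # one simulated API call per invocation
    return ACCOUNTS.get(name)

rows = ["Acme", "Globex", "Acme", "Acme", "Globex"]

# Uncached: one lookup query per source row -> len(rows) calls.
calls["n"] = 0
ids = [query_account_id(r) for r in rows]
uncached_calls = calls["n"]

# Cached: fetch the whole lookup table once, then resolve rows locally.
calls["n"] = 0
calls["n"] += 1                     # single query to load the cache
cache = dict(ACCOUNTS)
ids = [cache.get(r) for r in rows]
cached_calls = calls["n"]

print(uncached_calls, cached_calls)
```

For 26,000 source rows the same idea scales: an uncached lookup costs on the order of one call per row, while a cached one costs roughly one fetch of the lookup table.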

 

 

Best regards,

Mariia

0 votes
Ibha Gupta posted this 07 July 2020

I made the changes for the cached lookup, but it still takes the same time and exhausts the Salesforce batch limit, even for 26,000 records.

 

Regards,

Ibha Gupta

0 votes
Mariia Zaharova posted this 08 July 2020

Hello Ibha,

 

In this case, you also need to pay attention to other Skyvia packages, or other tools and apps, that could be consuming Salesforce API calls.

Best regards,

Mariia
