BigQuery DML Limitations

OWOX Analytics posted this 21 May 2018

Hi Support Team,

I have some questions about Salesforce to BigQuery import.

An import task of this type is scheduled and runs approximately 8 times per day. Sometimes it processes only a few entries, and sometimes it processes a large set of data per table.

As I can see in the BigQuery Query History, the data is imported into BigQuery row by row.

If the task processes a few thousand entries, many of them fail with an error:

Errors: dml_per_table.long: Quota exceeded: Your table exceeded quota for UPDATE, DELETE or MERGE queries per table. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors (error code: quotaExceeded)

My Query History looks similar to this:

Do you handle this limitation in the BigQuery import task? If so, can you please explain why there are multiple similar errors within a single import task?

I will be glad to share more details with you if needed.

Best regards, Viacheslav

Mariia Zaharova posted this 22 May 2018

Hello Viacheslav,

In fact, Skyvia uses bulk load for importing data to Google BigQuery. Please specify the package (its number and name) with which this behaviour occurs, and we will investigate it in more detail.

 

Best regards,

Mariia

OWOX Analytics posted this 22 May 2018

Hello Mariia,

 

Thank you for the response.

Package number - 36004

Package Name - Export to GBQ (All Tables WO Leads, Contacts)

Task Id - 12449549

 

I will be waiting for updates.

Mariia Zaharova posted this 23 May 2018

Thank you for the reply. We have checked this package: only Insert operations are used in it, and thus Skyvia uses bulk load, not update statements.

In this case, the data is divided into batches and loaded as CSV files to a bucket in Google Cloud Storage. Then BigQuery load jobs are started. BigQuery provides a load job type that allows loading CSV data into BigQuery: https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs#configuration.load . These jobs load the data from the CSV files in Google Cloud Storage into the BigQuery table, and afterwards all the temporary CSV files are deleted. This process happens on the BigQuery side, not in Skyvia.
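For illustration, starting such a load job with the google-cloud-bigquery Python client looks roughly like the sketch below (the project, dataset, table, and bucket names are hypothetical placeholders, not the ones Skyvia actually uses internally):

from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical names for illustration only.
table_id = "my-project.salesforce.Account"
gcs_uri = "gs://my-staging-bucket/batch_0001.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the CSV header row
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# A load job is not a DML statement, so it does not count against the
# per-table UPDATE/DELETE/MERGE quota.
load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
load_job.result()  # wait for the load job to complete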

Possibly the BigQuery jobs perform an update operation for some reason, or perhaps you have another package that is configured to perform Update operations on this table.

 

OWOX Analytics posted this 23 May 2018

Mariia, thank you for the answer.

You wrote: "We have checked this package: only Insert operations are used in it, and thus Skyvia uses bulk load, not update statements."

For now, you're right: I use the "Updated records > Insert" setting because it runs the import task without errors.

When the described errors occurred, the task's configuration was "All > Update". The reason for this configuration is to take all existing records and update them all to their current state. Now I cannot use this configuration because I'm not sure that I would get all records imported correctly.

Mariia Zaharova posted this 5 weeks ago

Thank you for these details. In the case of the Update operation, single update commands are used (1 row = 1 UPDATE statement), and thus these BigQuery limitations may be encountered:

https://cloud.google.com/bigquery/quotas#data_manipulation_language_statements

This is an expected behaviour.
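To illustrate the difference, the sketch below contrasts the row-by-row pattern (which triggers the quotaExceeded error once a few thousand rows are changed) with applying the same changes through a single MERGE from a staging table. The table and column names are hypothetical, and the MERGE variant is a general BigQuery workaround rather than a description of how Skyvia works internally:

from google.cloud import bigquery

client = bigquery.Client()
target = "my-project.salesforce.Account"       # hypothetical target table
staging = "my-project.salesforce.Account_stg"  # hypothetical staging table

def update_row_by_row(rows):
    # One UPDATE statement per record: a few thousand changed rows
    # quickly hit the per-table UPDATE/DELETE/MERGE quota.
    for row in rows:
        client.query(
            f"UPDATE `{target}` SET Name = @name WHERE Id = @id",
            job_config=bigquery.QueryJobConfig(query_parameters=[
                bigquery.ScalarQueryParameter("name", "STRING", row["Name"]),
                bigquery.ScalarQueryParameter("id", "STRING", row["Id"]),
            ]),
        ).result()

def merge_from_staging():
    # Bulk-load the changed rows into the staging table first, then apply
    # them all with one MERGE, which counts as a single DML statement.
    client.query(f"""
        MERGE `{target}` AS t
        USING `{staging}` AS s
        ON t.Id = s.Id
        WHEN MATCHED THEN UPDATE SET t.Name = s.Name
        WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (s.Id, s.Name)
    """).result()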

OWOX Analytics posted this 5 weeks ago

I agree that this is the expected behavior, but only for those who have the appropriate technical background.

You should inform your users about the possibility of exceeding these limits and getting errors when they choose Update or Delete in their tasks.

Once you exceed the limit, you have to wait until the next day to update the table. I think it should be the user's decision whether or not to use such a configuration.

Thanks for your support!

Mariia Zaharova posted this 5 weeks ago

Thank you for your reply and suggestion. We will consider expanding the documentation on this case.

Best regards, 

Mariia
