Hi Support Team,
I have some questions about the Salesforce to BigQuery import.
This type of import task is scheduled and runs approximately 8 times per day. Sometimes it processes only a few entries, but sometimes it processes a large set of data per table.
As I can see in the BigQuery Query History, the data is imported into BigQuery row by row.
When the task processes a few thousand entries, many of them fail with the following error:
Errors: dml_per_table.long: Quota exceeded: Your table exceeded quota for UPDATE, DELETE or MERGE queries per table. For more information, see https://cloud.google.com/bigquery/troubleshooting-errors (error code: quotaExceeded)
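For context, here is roughly how I would expect the changed rows to be combined into a single MERGE per table, instead of one UPDATE/MERGE per row that counts against the per-table DML quota. This is only a hypothetical sketch to illustrate what I mean; the table and column names are made up:

```python
# Hypothetical sketch: batch all changed rows into ONE MERGE statement per
# table (e.g. after loading them into a staging table), rather than issuing
# one UPDATE/DELETE/MERGE per row, which hits BigQuery's per-table DML quota.
# All table and column names below are invented for illustration.

def build_batched_merge(table, staging_table, key, columns):
    """Build a single MERGE statement that upserts every staged row at once."""
    assignments = ", ".join(f"T.{c} = S.{c}" for c in columns)
    col_list = ", ".join([key] + columns)
    src_list = ", ".join(f"S.{c}" for c in [key] + columns)
    return (
        f"MERGE `{table}` T "
        f"USING `{staging_table}` S ON T.{key} = S.{key} "
        f"WHEN MATCHED THEN UPDATE SET {assignments} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({src_list})"
    )

# One statement covers thousands of changed rows, so only one DML operation
# is charged against the per-table quota per import run.
sql = build_batched_merge(
    table="project.dataset.accounts",
    staging_table="project.dataset.accounts_staging",
    key="sf_id",
    columns=["name", "amount"],
)
print(sql)
```

With a pattern like this, a run touching a few thousand entries would issue one DML statement per table rather than thousands, which is why the row-by-row behavior I see in Query History surprised me.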
My Query History looks similar to this:
Does the BigQuery import task take this limitation into account? If so, can you please explain why multiple similar errors occur within a single import task?
I will be glad to share more details if needed.
Best regards, Viacheslav