Recent Topics
Google BigQuery Doesn't Support JSON, ARRAY or STRUCT types
When working with ARRAY or STRUCT types in BigQuery, you cannot select any columns of these types for mapping. Having JSON-type columns crashes the task creation wizard, stopping you from being able to create a mapping.
WooCommerce / impossible to get product VARIATIONS sales
Hi, everything's fine with my Skyvia / WooCommerce connection, except the fact that it looks impossible to get product variation sales. I can get product total sales, and I can get some product variation information, but I can't get product variation sales,
No support for Affinity API field that is an array of objects
We are trying to export data from the Affinity organizations API, docs here: https://api-docs.affinity.co/#the-organization-resource The list_entreies field is an array of objects according to the documentation, but Skyvia shows this as a TEXT field in the
Completed integration / import shows as running, then fails 30 minutes after data was successfully imported
Seeing odd behavior in the UI during an import / upsert from MySQL to SQL Server: I initiated a manual run to upsert 25K rows from MySQL to SQL Server. I see the correct data being pulled into SQL Server and all rows transferred within a few minutes. During this time no activity is visible
Handling of Null or blanks during import
It appears that Skyvia is replacing content when a null or blank field is imported. Is there a setting that will allow us to ignore null values during imports, leaving any existing values in the destination unchanged?
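Until such a setting is confirmed, a common database-side workaround is to land the imported file in a staging table and apply the update with `COALESCE`, so NULL inputs leave existing values unchanged. A minimal sketch with sqlite3 and a hypothetical `contacts` table (the table and columns are illustrative, not Skyvia objects):

```python
import sqlite3

def update_ignoring_nulls(conn, contact_id, email=None, phone=None):
    """Update a contact, but keep the existing value wherever the incoming field is NULL."""
    conn.execute(
        "UPDATE contacts SET "
        "email = COALESCE(:email, email), "
        "phone = COALESCE(:phone, phone) "
        "WHERE id = :id",
        {"id": contact_id, "email": email, "phone": phone},
    )
```

Because `COALESCE(:email, email)` falls back to the column's current value when the parameter is NULL, a blank source field never overwrites populated destination data.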
'Trusted Devices' IPs listing Cloudflare instead of actual source
Hello, there's something wrong with how 'Trusted Devices' are being logged here: https://app.skyvia.com/#/profile/two-factor-authentication Since at least June 1st, 2023, 100% of the IP address access logs against my account are listing Cloudflare proxy
Append timestamp not working as expected
I have created an export integration and checked the "Append timestamp" checkbox. The resulting file does not have either a date or timestamp added. Here is the integration package I am using https://app.skyvia.com/#/138012/packages/232230 Here is a screenshot
Lookup
Hi, we have an integration wherein we are doing a lookup on a target that has 717188 rows. The data flow goes into a hung state for hours and we cancel it after a few hours. Is there any other alternative? Run Id 155117268.
I want direct references to imported data
Good afternoon, I want to be able to import data from NetSuite into my Jira Service Management environment and make said imported data referenceable via a custom field within JSM for the end user. If I could get some assistance with accomplishing this
dataflow doesn't work run id 152711724
dataflow doesn't work run id 152711724
WordPress backup integration. Not a real backup
Hello, our WordPress backup should come out to a compressed size of around 1.6 gigabytes. Once I managed to get the WordPress connection working and then configured a Skyvia backup process against it, the resulting data comes out to 3.2 megabytes. So either
Skyvia Backup NetSuite core 2023.2 SuiteTalk endpoint version 2016_2 is no longer supported
Does anyone know of any options or recommendations to resolve the "Skyvia Backup NetSuite core 2023.2 SuiteTalk endpoint version 2016_2 is no longer supported" issue described below? We are experiencing an issue with NetSuite backup using the NetSuite
Updating data in HubSpot
Trying to update data in HubSpot from SQL Server (so I want to take the data in HubSpot and update it from what I have in SQL Server), but I'm not sure which integration to use. Which integration would update the existing records with new data without
Has anybody successfully integrated a MySQL database to a Shopify Store and do bidirectional update?
I did a replication of my Shopify store with 4 demo products to a MySQL database. Then I modified some of the ProductVariants table data on the MySQL database and tried to synchronize it back to the Shopify store. I cannot get past the following failure
Undecipherable error
What is the explanation for the error in this execution? Run Id 152661201, quoteObject error: Timeouts are not supported on this stream.
Data flow "Integration failed: Disposable provider cannot be reused."
My last 3 runs of a simple import integration have all failed with the message "Integration failed: Disposable provider cannot be reused.". The data integration is valid and I have tested the connection to my source and target. We have not exceeded our
Import Failed - No Logs
Can you please provide information/logs about the failed execution with run ID: 136447256. No data is provided. The package has been running for more than 24 hours, and it is really critical for us to ensure that we have all the data transferred from
Error connecting to data source using ODBC
Hi, I'm trying to connect Looker Data Studio to a database called Ocient using the ODBC driver, but I'm getting this error; any idea what this means? An invalid connection string attribute was specified (failed to decrypt cipher text)
Dataflow not working
Why did Run Id 150581815 run for 127 hours? The tool needs to have an automatic cancellation mechanism and better log detail.
Data Flow has been running for 56 hours
Why did it run for more than 56 hours? https://app.skyvia.com/#/76069/packages/218908/debug run ID 146352365
Errors while trying out Skyvia backup
Hi, I'm trying out the Skyvia Mailchimp backup to make sure it works well before using it to back up our whole Mailchimp (around 1.3 million contacts). I'm having some trouble: the backups have some failed items and I was wondering: - how can we have more
What is the detailed explanation for this error?
What is the detailed explanation for this error? https://app.skyvia.com/#/76069/packages/219228/debug 150831150 Error: Sql error: Generic SQL error. CRM ErrorCode: -2147204784, Sql ErrorCode: -2146232060, Sql Number: 0
Zoho Projects automation - Insert task failures on specific fields
I have been trying out the new automation tools, finding them pretty cool but having some headaches with the Zoho Projects connector. To be honest I think the issue is Zoho's absolute mess of a schema (and lack of consistency therein). I have an action
Exporting logs as .CSV
I am having issues exporting logs from the log preview window. I am seeing title.error files in my downloads instead of usable title.csv files. Thank you, Noah
Slow query
Slow query. Why is the query below taking so long? This slows down the Data Flow. When looking at the package you will see that the speed is faster select sim.opportunityid,sim.csh_origem,sim.csh_mktcanal,sim.csh_term,sim.csh_mktcampanha,sim.csh_urlcampanha
Data flow SFTP connection fails while normal import works
I've run into a situation where I would prefer to use a data flow, but for some reason when I try to run the SFTP file through data flows I get the error 'an established connection was aborted by the server'. I know that the connection is working, however
Amplitude query running for long periods of time with no output
Hi, I have written a query to pull data from Amplitude, but the task runs for a long time and produces no output - here is the url of the export task: https://app.skyvia.com/#/151008/packages/228237 Does the query in the task look okay in your opinion?
Run Id 150406607
Skyvia should automatically cancel a run that has been running for more than 2 hours and restart it automatically. I no longer have any answers to give users about delays in loads.
Can I create a custom inbound webhook to receive POST data?
We use this platform for client survey data: https://formcrafts.com There's no integration available in Skyvia, but Formcrafts offers the ability to pipe survey responses into an external API using a webhook connection. How to Configure Webhooks (formcrafts.com)
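If Skyvia has no built-in inbound webhook for this, one workaround is to host a tiny relay yourself that accepts the Formcrafts POST and stages the data somewhere a Skyvia connector can read it (a database table, a CSV drop, etc.). A minimal sketch of such a receiver using only Python's standard library (the handler and the echo response are assumptions for illustration, not a Skyvia feature):

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class WebhookHandler(BaseHTTPRequestHandler):
    """Accepts POST requests with a JSON body and acknowledges them."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        # A real relay would persist `payload` to a staging table or file
        # that a Skyvia package can poll; here we just echo the keys back.
        body = json.dumps({"received": sorted(payload)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *_args):
        pass  # keep request logging quiet

def start_webhook_server(port=0):
    """Start the receiver on a daemon thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), WebhookHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Point the Formcrafts webhook URL at wherever this relay is hosted, then schedule a Skyvia import against the staged data.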
What is the cause of the error?
What is the cause of the error? It's a new package created automatically by Skyvia; how can it give an error? https://app.skyvia.com/#/76069/packages/227362/debug
Data Flow schedule does not work
A Data Flow schedule of the type that runs every 2 hours within a time window does not work. I had set the data flow below to run between 10PM and 2AM every 2 hours... but it didn't start. Now I changed it to between 8PM and 4AM to see if it runs. https:/
Why did it run successfully, but a short time later it doesn't run anymore?
Why did it run successfully, but a short time later it doesn't run anymore? Run Id 148445645, Date Jul 27, 2023 8:24:40 AM, State: Failed. Run Id 148442409, Date Jul 27, 2023 7:52:03 AM, State: Succeeded.
Enable logs for Skyvia users to see the issue
Why did 148452273 run successfully, but 148456582 immediately failed? Enable logs for Skyvia users to see the issue. https://app.skyvia.com/#/76069/packages/166398/debug
Bug: package from Dynamics to Snowflake
Integration packages are not running, and it is not possible to see the error in the log. Even though the message "Click on quantity for preview and export errors" appears, there is no way to click it.
File Mask for Yesterday, not today
Can I use the file mask to import a CSV that was produced yesterday? The data source I'm using (Toast POS) exports CSV files at 4am for the day prior in a folder structure like this: RestaurantID / yyyyDDmm / TimeEntries.csv So if I run this task at 4:30am
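If the mask syntax can't shift the date, the previous day's path can be computed in a small pre-processing script instead. A sketch (the `yyyyDDmm` ordering from the post is kept configurable, since it may be a typo for `yyyyMMdd`; swap the `strftime` codes to match the real folder names):

```python
from datetime import date, timedelta

def yesterday_folder(restaurant_id, layout="%Y%m%d", today=None):
    """Build the path to the previous day's TimeEntries export.

    `layout` defaults to yyyyMMdd; pass "%Y%d%m" if the folders
    really use the yyyyDDmm ordering shown in the post.
    """
    day = (today or date.today()) - timedelta(days=1)
    return f"{restaurant_id}/{day.strftime(layout)}/TimeEntries.csv"
```

Running this at 4:30am yields the folder the 4am export just wrote, regardless of month or year boundaries, because `timedelta` handles the rollover.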
Four jobs are not working AND canceling does not work
It is the end of the workday and support is not available... 4 jobs are hanging, and the Cancel button does not end the tasks. 2 jobs should run hourly and they are fairly small; 1 job should run every 4 hours; 1 job should run twice a day. Our business has a problem,
Building a connection to a SQL Virtual Machine
Trying to build a connection to a SQL Virtual Machine for SQL Server. However, I am having some trouble with setting it up. How can I go about doing this?
Data not inserting in bulk to the target table.
I've been trying to insert all the data in a table from MySQL to Salesforce, but it inserts only the last row. ''' SELECT * FROM countries where id > :MaxId ''' This is the query running against the source table
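One way to sanity-check the source side of such an incremental query is to reproduce the keyset pattern locally (a standalone sketch against an in-memory sqlite3 table named `countries`, mirroring the query in the post; Skyvia's own handling of the `:MaxId` variable is not reproduced here). The key point is that every row above the watermark should be returned, and the watermark should only advance to the last id of each fetched batch:

```python
import sqlite3

def fetch_incremental(conn, max_id, batch_size=100):
    """Return all rows with id > max_id, plus the new watermark (keyset pagination)."""
    rows, watermark = [], max_id
    while True:
        batch = conn.execute(
            "SELECT id, name FROM countries WHERE id > :MaxId "
            "ORDER BY id LIMIT :n",
            {"MaxId": watermark, "n": batch_size},
        ).fetchall()
        if not batch:
            return rows, watermark
        rows.extend(batch)
        watermark = batch[-1][0]  # advance only after keeping the whole batch
```

If the local pattern returns every expected row, the "only the last row inserted" symptom points at the target-side mapping or operation type rather than the source query.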
Importing to existing contact records in Salesforce NPSP
Hello, I'm attempting to import opportunities into existing contact records in Salesforce NPSP. I know the contacts have existing accounts, but my lookup is not finding the records. I am currently using a Target lookup that searches the First Name, Last
Zoho Books replication ongoing fails on 2 tables
Hi guys, I've had 2 tables within the Zoho Books connection continually failing for several weeks now. These are: Contacts, Purchase Orders. The error received is: Integration failed: For security reasons you have been blocked for some time as you have