Transport Level error when loading 343230 records into Salesforce

  • 1.8K Views
  • Last Post 09 December 2017
Michael Kumaraswamy posted this 16 November 2017

I am running a package to import 343,230 records into Salesforce. After inserting about 50,000 records, it fails with the following error:

 

A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)

The source is a SQL Server table (hosted on AWS).

Is there a limitation on the number of records that can be loaded in a single run?

Mariia Zaharova posted this 16 November 2017

Could you please tell us the number of the package with which this error occurs?

Looking forward to your reply.

 

Michael Kumaraswamy posted this 16 November 2017

Mariia Zaharova posted this 16 November 2017

Is there a limitation on the number of records that can be loaded in a single run?

No, we do not have such limitations. This error usually occurs when the connection to the server is broken (i.e., network-related issues, an unstable connection, etc.).

 

You can try increasing the Connection Timeout and Command Timeout parameters of your SQL Server connection (you can find them on the Advanced tab).
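For context, these two parameters map onto the standard login and query timeouts of any SQL Server client. A minimal sketch of the same idea when connecting directly with pyodbc; the driver name, server, database, and credentials below are placeholder assumptions, not the poster's actual values or Skyvia internals:

```python
# Sketch: a SQL Server connection string with an explicit login timeout.
# All connection details here are placeholders.
def build_conn_str(server, database, user, password, connect_timeout=600):
    """Assemble an ODBC connection string for SQL Server with an
    explicit login ("Connection") timeout in seconds."""
    parts = {
        "Driver": "{ODBC Driver 17 for SQL Server}",
        "Server": server,
        "Database": database,
        "Uid": user,
        "Pwd": password,
        "Connection Timeout": str(connect_timeout),
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

conn_str = build_conn_str("myserver.rds.amazonaws.com", "SourceDb",
                          "etl_user", "secret")

# Against a live server one would then do (commented out here):
# import pyodbc
# cnxn = pyodbc.connect(conn_str)
# cnxn.timeout = 3000  # per-query ("Command") timeout in seconds; 0 = unlimited
```

The distinction matters for this thread: the connection timeout only bounds how long the login handshake may take, while the command timeout bounds each query, which is the one that matters for a long-running extract.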

 

Michael Kumaraswamy posted this 17 November 2017

Changed the Connection and Command Timeout parameters in Skyvia to 600 and 3000. Still getting the same error, and this time even fewer records (13,733) were inserted.

Michael Kumaraswamy posted this 29 November 2017

Hi. Is there any other way to fix this issue? Is there a way for you to see where the connection timeout occurs?

Mariia Zaharova posted this 30 November 2017

As we said above, most likely this error is not related to Skyvia and usually occurs when the connection to the server is broken (i.e., network-related issues, an unstable connection, etc.): https://blogs.msdn.microsoft.com/spike/2009/04/16/a-transport-level-error-has-occurred-when-sending-the-request-to-the-server-provider-tcp-provider-error-0-an-existing-connection-was-forcibly-closed-by-the-remote-host/

 

However, we will investigate this issue more thoroughly and contact you if any news or solutions become available.

Michael Kumaraswamy posted this 01 December 2017

We have checked our logs (AWS and Salesforce), and they do not show any connection loss.

Would it be possible to see from the log which connection is getting dropped?

Mariia Zaharova posted this 04 December 2017

It seems you have deleted the package with this error; at the moment, we cannot find any package with this error under your account. Please re-create the package and run it, and contact us as soon as the error is reproduced.

Michael Kumaraswamy posted this 06 December 2017

Here is the task ID that gave the error: 9557533 (the package will be under lonnie.gray@roberthalf.com).

This time we limited the total number of records to just 39,795. After it errored out, we checked SFDC, and it had 24,168 records.

Mariia Zaharova posted this 06 December 2017

Thank you for the reply. This error occurs while retrieving data from the SQL Server database. Please try one more thing: set the Command Timeout parameter of your SQL Server connection to 0. After these changes, run the package several times and notify us about the results.

Michael Kumaraswamy posted this 06 December 2017

When you say the error occurs while retrieving data from SQL, you are saying that there is no connection issue from Skyvia (it is not terminating the connection) and that the issue is at the SQL Server end. Is my understanding correct?

Changed the Command Timeout parameter for SQL Server to 0 as advised. We also split up the tasks to limit the number of records. The 1st task, with 39,795 records, ran without an issue. The 2nd task, with 48,022 records, ran for 3 minutes and failed with the transport-level error. Task ID = 9569046.

 

Michael Kumaraswamy posted this 08 December 2017

Worked with the DBA and ran a trace while executing the query; it shows no issues with the SQL Server connection.

Skyvia was able to connect and retrieve 345,000 records, so the issue is when writing data into Salesforce. Is there some kind of parameter that you can increase to allow more records to be inserted per job?

The bulk job report in Salesforce shows every job inserting 539 records. Is this something that Skyvia controls, i.e., the number of records submitted for insertion into Salesforce?

Mariia Zaharova posted this 08 December 2017

Here is the stack trace of the error that occurs in your package:

A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)

System.Data.SqlClient.SqlException (0x80131904): A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.) ---> System.ComponentModel.Win32Exception (0x80004005): An existing connection was forcibly closed by the remote host
   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
   at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
   at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
   at System.Data.SqlClient.TdsParserStateObject.TryReadByteArray(Byte[] buff, Int32 offset, Int32 len, Int32& totalRead)
   at System.Data.SqlClient.TdsParserStateObject.TryReadString(Int32 length, String& value)
   at System.Data.SqlClient.TdsParser.TryReadSqlStringValue(SqlBuffer value, Byte type, Int32 length, Encoding encoding, Boolean isPlp, TdsParserStateObject stateObj)
   at System.Data.SqlClient.TdsParser.TryReadSqlValue(SqlBuffer value, SqlMetaDataPriv md, Int32 length, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlDataReader.TryReadColumnData()
   at System.Data.SqlClient.SqlDataReader.TryReadColumnInternal(Int32 i, Boolean readHeaderOnly)
   at System.Data.SqlClient.SqlDataReader.TryReadColumn(Int32 i, Boolean setTimeout, Boolean allowPartiallyReadColumn)
   at System.Data.SqlClient.SqlDataReader.GetValueInternal(Int32 i)
   at System.Data.SqlClient.SqlDataReader.GetValue(Int32 i)

 

After splitting the package in two, one package still has a lot of data to retrieve, or some column contains huge data (e.g., a BLOB or CLOB).

We recommend that you try the following:

- split the failed package into several smaller ones in order to reduce the data retrieved per package;

- if the previous step doesn't help, try to find the record with huge data in it and exclude this record from the package via filter conditions.
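The first suggestion can be sketched as computing contiguous primary-key ranges, so that one large package becomes several smaller ones, each filtered to a slice of rows. The Id column name and the range boundaries below are illustrative assumptions, not anything from the actual package:

```python
def split_key_ranges(min_id, max_id, parts):
    """Divide the inclusive key range [min_id, max_id] into `parts`
    contiguous sub-ranges, one per package."""
    total = max_id - min_id + 1
    size = -(-total // parts)  # ceiling division
    ranges = []
    start = min_id
    while start <= max_id:
        end = min(start + size - 1, max_id)
        ranges.append((start, end))
        start = end + 1
    return ranges

# Example: split 343,230 row ids into 8 packages, each getting a
# filter condition such as "Id BETWEEN lo AND hi".
for lo, hi in split_key_ranges(1, 343230, 8):
    print(f"WHERE Id BETWEEN {lo} AND {hi}")
```

Filtering on an indexed key keeps each package's query cheap; splitting by row offset instead would force the server to rescan skipped rows on every run.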

 

Is there some kind of parameter that you can increase to allow more records to be inserted per job?

The bulk job report in Salesforce shows every job inserting 539 records. Is this something that Skyvia controls?

Unfortunately, it is not possible to change the batch size in Skyvia.

Michael Kumaraswamy posted this 09 December 2017

Thank you. We have created multiple tasks to insert the data.
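The resolution above, splitting the load into multiple tasks, is the same idea as batching records client-side. A minimal sketch, assuming one were scripting the load directly against the Salesforce Bulk API rather than through Skyvia (the 10,000-record batch size is that API's documented per-batch record limit, not a Skyvia setting):

```python
def batched(records, size=10_000):
    """Yield successive fixed-size batches from a list of records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

# Example with the 39,795-record task from this thread:
batches = list(batched(list(range(39795)), size=10_000))
# 39,795 records -> 4 batches: 10,000 + 10,000 + 10,000 + 9,795
```

Each batch would then be submitted as a separate job or batch, so one dropped batch fails in isolation instead of aborting the entire 343,230-record load.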
