Hi there,
I use ASP.NET to create an SSIS package that imports data from a .dbf file. When I import data from a big file (216,173 KB), my package throws this exception:
An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.".
The "input "OLE DB Destination Input" (71)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (71)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (58) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "WorkThread0" has exited with error code 0xC0209029.
Thread "SourceThread0" has exited with error code 0xC0047038.
An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.".
The "input "OLE DB Destination Input" (77)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (77)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (64) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Thread "WorkThread0" has exited with error code 0xC0209029.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "SourceThread0" has exited with error code 0xC0047038
Does anybody know what the problem is?
Thanks.
Sounds like the connection to your destination database timed out and was closed. Most likely the SQL Server instance severed the connection. Are you performing a large sort or an aggregate in the data flow? You might first try writing to a raw file instead of the database table, and then create a second data flow that uses the raw file as the source and loads straight into the database table.
|||
Thanks.
My structure is: my data source files (*.dbf files) are on another machine, and some of them are very big (216,173 KB, 283,845 KB, and they will get bigger as time goes on). I want to import this data into my new SQL 2005 server, which is empty now. I create a table before importing the data (that part works now). In my source component I set "oSrcInstance.SetComponentProperty("SqlCommand", "SELECT * FROM [" + sourceTable + "]");". There is no aggregate or other operation between source and destination; I simply pump all the data from source to destination. I can do this with my SQL 2000 DTS (also an ASP.NET project).
So it is as simple as Source (*.dbf) --> Destination (SQL 2005).
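For reference, a minimal sketch of how the source side of that flow could be set up programmatically. The Jet connection string, the folder path, and the AccessMode value are assumptions for illustration; only the SqlCommand line comes from the actual package:

// Assumed OLE DB connection string for a folder of .dbf files (Jet provider, dBASE format)
string dbfConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\DbfFiles;Extended Properties=dBASE IV;";

// Point the OLE DB source at one .dbf table per run
oSrcInstance.SetComponentProperty("AccessMode", 2); // 2 = SQL command
oSrcInstance.SetComponentProperty("SqlCommand", "SELECT * FROM [" + sourceTable + "]");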
Thanks.
|||
Are you creating the new table through an Execute SQL task? Is it using the same connection manager as your data flow task? I am trying to verify that the package is connecting successfully. Also, how long does the package run before you receive this error? Does it occur if you try a smaller file?
You might try setting the OLE DB Destination data access mode property to one of the fast load options, and specify Rows Per Batch as 10,000 or 20,000, to see if you get different results.
|||
No, I don't use an Execute SQL task to create the table; I just use a SqlConnection to create the table and then close and dispose the connection.
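Roughly like the sketch below, where the connection string and the column list are placeholders rather than the real code:

// requires: using System.Data.SqlClient;
// Create the destination table up front, then close and dispose the connection
using (SqlConnection conn = new SqlConnection(destinationConnectionString)) // placeholder connection string
{
    conn.Open();
    string createSql = "CREATE TABLE [dbo].[" + destinationTable + "] (Col1 INT, Col2 NVARCHAR(50))"; // illustrative columns
    using (SqlCommand cmd = new SqlCommand(createSql, conn))
    {
        cmd.ExecuteNonQuery();
    }
} // connection closed and disposed here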
Actually, I am running the package now. It looks like it is running slower and slower; I can see the row count increasing in the SQL 2005 database.
I am using an OLE DB Destination:
oDestInstace.SetComponentProperty("CommandTimeout", 0);
oDestInstace.SetComponentProperty("OpenRowset", "[dbo].[" + destinationTable+"]");
oDestInstace.SetComponentProperty("OpenRowsetVariable", null);
// oDestInstace.SetComponentProperty("SqlCommand", null);
oDestInstace.SetComponentProperty("DefaultCodePage", 1252);
oDestInstace.SetComponentProperty("AlwaysUseDefaultCodePage", false);
oDestInstace.SetComponentProperty("AccessMode", 0);
oDestInstace.SetComponentProperty("FastLoadKeepIdentity", false);
oDestInstace.SetComponentProperty("FastLoadKeepNulls", false);
// oDestInstace.SetComponentProperty("FastLoadOptions", null);
oDestInstace.SetComponentProperty("FastLoadMaxInsertCommitSize", 0);
The exception happens after the package has been running for 30 minutes or more. It doesn't occur with small files.
Thanks.
|||
Try setting FastLoadOptions to TABLOCK, CHECK_CONSTRAINTS, ROWS_PER_BATCH=1000 and FastLoadMaxInsertCommitSize to 1000.
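In terms of the SetComponentProperty calls shown above, that advice would translate to something like the following sketch. The AccessMode line is an assumption on my part: the FastLoad* properties only take effect when the destination uses a fast load access mode (3 = OpenRowset Using FastLoad):

oDestInstace.SetComponentProperty("AccessMode", 3); // 3 = OpenRowset Using FastLoad (assumed change from 0)
oDestInstace.SetComponentProperty("FastLoadOptions", "TABLOCK,CHECK_CONSTRAINTS,ROWS_PER_BATCH = 1000");
oDestInstace.SetComponentProperty("FastLoadMaxInsertCommitSize", 1000);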
|||
Yes, you are right. I did that and it works perfectly, apart from FastLoadMaxInsertCommitSize = 0.
Thanks.