
Monday, March 19, 2012

Importing Excel 2007 and/or DBF files into SQL Server 2005

Greetings,

I'm having a tough time importing some of my legacy database into SQL Server.
I have a number of dBASE IV files I need to get into SQL. I have tried building an SSIS package with either a FoxPro OLE DB connection or a Jet 4.0 one; neither works because of inconsistencies in the data format in my tables (e.g. date fields, etc.).

I have tried saving the .dbf files as Excel 2007 workbooks, to take advantage of the larger row limit that comes with '07. The problem is that, for some reason, you can't use the Import/Export wizard with 2007 files, and I haven't been able to create a package with the Access 12 OLE DB provider the way I have read it should be done.

I have to get some crucial data out of that old system and into the new one, and I can't seem to import it properly.

Any hints on what I should do? (Maybe I'm doing something awfully wrong.)

Thank you for taking the time to answer my question,
Val

If you have SP2 installed, you should be able to load data from the Excel 2007 format using the Import/Export wizard. Do not use the Excel connection; use the new OLE DB provider for Office 2007 instead. You will need to set the connection's extended properties to "Excel 12.0".
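
For example, here are the same settings expressed as an OLE DB connection string from C# (just a sketch, not tested; the workbook path is a placeholder, and the provider is the one installed by the 2007 Office System Driver):

using System.Data.OleDb;

class ExcelConnectionTest
{
    static void Main()
    {
        // "Excel 12.0" in Extended Properties tells the provider to expect
        // the 2007 format; HDR=YES treats the first row as column names.
        string connStr =
            "Provider=Microsoft.ACE.OLEDB.12.0;" +
            @"Data Source=C:\legacy\export.xlsx;" +
            "Extended Properties=\"Excel 12.0;HDR=YES\";";

        using (OleDbConnection conn = new OleDbConnection(connStr))
        {
            conn.Open(); // fails here if the provider is missing or the path is wrong
            System.Console.WriteLine("Connected to the workbook.");
        }
    }
}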

Let me know if you need more assistance.

Thanks.

|||

Try looking at this: http://msdn2.microsoft.com/en-us/library/aa337084.aspx. You'll also need to configure your connection manually. Set up a Jet OLE DB connection and point it to the folder containing the dBASE files. Click the "All" button and change the "Extended Properties" to "DBASE IV".
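
The equivalent connection from C#, as a rough sketch (the folder and table name here are placeholders; with this provider, Data Source is the folder, and each .dbf file in it shows up as a table):

using System.Data.OleDb;

class DbfConnectionTest
{
    static void Main()
    {
        string connStr =
            "Provider=Microsoft.Jet.OLEDB.4.0;" +
            @"Data Source=C:\legacy\dbf;" +      // the folder, not an individual file
            "Extended Properties=dBASE IV;";

        using (OleDbConnection conn = new OleDbConnection(connStr))
        using (OleDbCommand cmd = new OleDbCommand("SELECT * FROM [CUSTOMERS]", conn)) // CUSTOMERS.dbf
        {
            conn.Open();
            using (OleDbDataReader reader = cmd.ExecuteReader())
            {
                int rows = 0;
                while (reader.Read()) rows++;
                System.Console.WriteLine(rows + " rows read.");
            }
        }
    }
}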

Friday, March 9, 2012

Importing data into a view

Hi all,
I am trying to import data from dbf files into a SQL view using DTS. I see that only bulk copy allows me to import data into a view. I can't use that, because I need to choose which rows and columns to import while I am importing them. What is the easiest way to import data into a view?
Thanks in advance...

|||

If you are very new to SQL Server, then I recommend importing the data into a dummy table in SQL Server and applying your filter clauses after the data is in. If you want to do a lot of transformation, you should use DTS instead. If you are a newbie, just import the data into SQL Server without a filter clause and filter it once it has been imported.
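
A minimal sketch of that approach in C#, with hypothetical table and column names: DTS loads everything into the dummy (staging) table, then a plain SQL statement filters the rows and columns into the view's base table.

using System.Data.SqlClient;

class FilterAfterImport
{
    static void Main()
    {
        // Placeholder connection string; point it at your own server and database.
        string connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(
            @"INSERT INTO dbo.Customers (Id, Name)   -- base table behind the view
              SELECT Id, Name                        -- only the columns you need
              FROM dbo.Staging_Customers             -- dummy table loaded by DTS
              WHERE Country = 'US';                  -- only the rows you need", conn))
        {
            conn.Open();
            int copied = cmd.ExecuteNonQuery();
            System.Console.WriteLine(copied + " rows copied into the base table.");
        }
    }
}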
"Nikhil Patel" wrote:

> Hi all,
> I am trying to import data from dbf files into a sql view using DTS. I
> see that only buck copy allows me to import data into a view. I can't use
> this because I need to choose which rows and columns I need to import when
I
> am importing them. What is the easiest way to import data into view?
> Thanks in advcance...
>
>

Friday, February 24, 2012

Importing big data with exception.

Hi, there;

I use ASP.NET to create an SSIS package to import data from a .dbf file. When I import data from a big file (216,173 KB), my package throws this exception:

An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.".
The "input "OLE DB Destination Input" (71)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (71)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (58) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "WorkThread0" has exited with error code 0xC0209029.
Thread "SourceThread0" has exited with error code 0xC0047038.

An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.".
The "input "OLE DB Destination Input" (77)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (77)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (64) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Thread "WorkThread0" has exited with error code 0xC0209029.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "SourceThread0" has exited with error code 0xC0047038

Does anybody know what the problem is?

Thanks.

Sounds like the connection to your destination database timed out and was closed. Most likely the SQL Server instance severed the connection. Are you performing a large sort or an aggregate in the data flow? You might first try going to a raw file instead of the database table, and then create a second data flow that uses the raw file as the source straight into the database table.

|||

Thanks.

My structure is: my data source files (*.dbf) are on another machine, and some of them are very big (216,173 KB, 283,845 KB, and some will be even bigger as time goes on). I want to import this data into my new SQL 2005 server, which is empty right now. I create a table before importing the data (that part works). In my source component I set oSrcInstance.SetComponentProperty("SqlCommand", "SELECT * FROM [" + sourceTable + "]"); there is no aggregate operation between source and destination, I just pump all the data from source to destination. I can do the same thing with my SQL 2000 DTS (also as an ASP.NET project).
It is as simple as Source (*.dbf) --> Destination (SQL 2005).

Thanks.
|||

Are you creating the new table through an Execute SQL task? Is it using the same connection manager as your data flow task? I'm trying to verify that the package is connecting successfully. Also, how long does the package run before you receive this error? Does it occur if you try a smaller file?

You might try setting the OLE DB Destination's data access mode to one of the fast load options and specifying Rows Per Batch as 10,000 or 20,000, to see if you get different results.

|||

No, I don't use an Execute SQL task to create the table; I just use a SqlConnection to create the table and then close and dispose the connection.
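
Roughly like this (a sketch; connectionString is assumed, the column list is just a placeholder, and destinationTable is the same variable used in the destination code below):

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "CREATE TABLE [dbo].[" + destinationTable + "] (Id INT, Name NVARCHAR(100))", conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}   // the connection is closed and disposed when the using block ends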

Actually, I am running the package now. It looks like it is running slower and slower, and I can see the row count increasing in the SQL 2005 database.

I am using an OLE DB Destination, configured like this:

oDestInstace.SetComponentProperty("CommandTimeout", 0);

oDestInstace.SetComponentProperty("OpenRowset", "[dbo].[" + destinationTable+"]");

oDestInstace.SetComponentProperty("OpenRowsetVariable", null);

// oDestInstace.SetComponentProperty("SqlCommand", null);

oDestInstace.SetComponentProperty("DefaultCodePage", 1252);

oDestInstace.SetComponentProperty("AlwaysUseDefaultCodePage", false);

oDestInstace.SetComponentProperty("AccessMode", 0);

oDestInstace.SetComponentProperty("FastLoadKeepIdentity", false);

oDestInstace.SetComponentProperty("FastLoadKeepNulls", false);

// oDestInstace.SetComponentProperty("FastLoadOptions", null);

oDestInstace.SetComponentProperty("FastLoadMaxInsertCommitSize", 0);

The exception happens after the package runs for 30 minutes or more. It doesn't occur for small files.

Thanks

|||

Try setting FastLoadOptions to "TABLOCK,CHECK_CONSTRAINTS,ROWS_PER_BATCH=1000" and FastLoadMaxInsertCommitSize to 1000.
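
Against the destination instance from your earlier snippet, that would look something like this (a sketch; note that the fast-load properties only take effect when AccessMode is the fast-load variant of OpenRowset, which is value 3):

oDestInstace.SetComponentProperty("AccessMode", 3); // OpenRowset using fast load
oDestInstace.SetComponentProperty("FastLoadOptions", "TABLOCK,CHECK_CONSTRAINTS,ROWS_PER_BATCH=1000");
oDestInstace.SetComponentProperty("FastLoadMaxInsertCommitSize", 1000); // commit every 1,000 rows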

|||

Yes, you are right. I did that and it works perfectly, apart from FastLoadMaxInsertCommitSize, which I kept at 0.

Thanks.