
Wednesday, March 28, 2012

Importing text file removes decimal separator

Hi,

I'm trying to import a semicolon-separated text file into a SQL database. The text file has a field that contains decimal numbers, with a comma as the decimal separator (15,35). When I use a DTS package to create a destination table and import all rows, the field is created as a float field, and the decimal comma is dropped, so the number in SQL becomes 1535. If I change the decimal separator to a period (.) it works OK, but I need it to work with a comma as the decimal separator. In the DTS package the field from the text file is recognized as varchar(8000). Any ideas?

Ingar

You have to choose an appropriate locale that treats the comma as a decimal delimiter. There is a drop-down control for choosing locales on the first page of the Flat File Connection Manager UI.

Thanks.

|||

Hi, thanks for your answer. I see that I didn't specify which SQL version I'm using: SQL Server 2000, not 2005. I know the theme of this forum is SSIS, but it also says "transforms/data flow" and I thought DTS packages from SQL 2000 were included here. I apologize if I have misunderstood this, but English is not my first language.

So, as far as I know, the Flat File Connection Manager is not available in SQL 2000.

But thanks anyway.

Ingar
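On SQL 2000 one workaround, sketched below with hypothetical table and column names, is to let DTS load the field as varchar and convert it afterwards:

```sql
-- Stage the value as text, then swap the decimal comma for a period
-- before casting; ImportStage, AmountText and AmountFloat are placeholder names.
UPDATE ImportStage
SET AmountFloat = CAST(REPLACE(AmountText, ',', '.') AS float)
WHERE AmountText IS NOT NULL;
```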

Importing text file into database

Hello Everyone,

I would like to import a text file which contains one string (a large integer) per line not separated by commas or anything else except a carriage return. Does anyone know of an easy way to store this in a database file? I'm open to suggestions if there is more than one way to save this kind of information within a database. I have SQL server 2005 developer edition if that helps in any way. I'm also starting to learn about Linq so if there is some other way you would store this information for that purpose I would love to hear about that as well. C# code is preferable, but I can use the automatic translators if that's all you have. By the way, I'm a newbie to this subject (if you couldn't tell). Thanks in advance.

Robert

check out

- BCP

- Bulk Insert
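For a file with one large integer per line, the BULK INSERT route might look like this (file path and table name are assumptions; if the integers can exceed bigint's range, stage them as varchar instead):

```sql
CREATE TABLE BigNumbers (Value bigint NOT NULL);

-- One value per line, each line ended by CR/LF; no field delimiter is needed
-- because there is only one column.
BULK INSERT BigNumbers
FROM 'C:\numbers.txt'
WITH (ROWTERMINATOR = '\r\n');
```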

|||

Your answer looks promising after reviewing the information about it. I'll try it out, and if it works I'll close this post. Thanks for your suggestion.

Robert

|||

Hello Everyone,

I did try this, and I was amazed to find out that you can import plain text files straight into Excel or Access. Obviously, with an Access database in hand, using it with SQL Server 2005 should be straightforward. Thanks for your help.

Robert

Friday, March 23, 2012

Importing Null Date Fields

I'm using SQL Server Express and am trying to import a CSV file. The CSV file contains a string field (named DAS) that represents a date. This field can be null.

I've tried using the DTS Wizard to import this CSV file and convert the DAS field to a date, which works great until it hits a record with a NULL DAS field. It then throws a conversion error.

Still using the DTS Wizard, I've changed the data type of the DAS field in the source file to [DT_DATE]; it works fine, but all the null dates are converted to 12/30/1899.

Is there a way (DTS Wizard or something else) to import these CSV files with null date fields and keep them as null in the SQL Server table?

Thanks for any help,

Jon

Hi,

I read your post and I can advise the following:

1. Once you have the date 12/30/1899 instead of NULL values in SQL Server, you can easily sort those records, mark them, and put in NULL values with copy and paste. It is easy and quick.

2. You could write a program instead of using the DTS Wizard (I don't know that wizard). The program would read from your CSV file and write to SQL Server, and from the program you would have full control over all fields. I personally prefer to write a program when I have an unusual case.

I hope this advice solves the problem. But if you still can't get NULL values, let me know.

Regards,

Hristo Markov

|||Off the top of my head, I'm thinking there is an option to "Keep Nulls" inside the wizard. I haven't run it and I'm not on my machine w/ SSIS installed so I can't verify it. If you do see such an option, that is how you tell SSIS to preserve the NULLs.

If this isn't an option, then you'll have to go one step deeper and either do custom transformations or write an EXECUTE SQL TASK that will go through and update any record of 12/30/1899 to be NULL.
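That clean-up pass might look like the following, run as an Execute SQL task or ad-hoc query (the table name is an assumption; DAS comes from the thread):

```sql
-- Turn the sentinel date that DT_DATE produced for blank fields back into NULL.
UPDATE ImportedTable
SET DAS = NULL
WHERE DAS = '1899-12-30';
```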

Wednesday, March 21, 2012

importing from excel

I need to copy the data in two spreadsheets into two
tables in "SQL Server 2000". Below are the details:

One spreadsheet contains the data that needs to be added
to an existing table in SQL server. All field names in the
spreadsheet match the DB table column names.
Another spreadsheet contains the data that needs to be
copied to a new DB table. The table currently does NOT
exist in the DB.

I'm not sure how to accomplish this. Any help would be
appreciated.

|||

Use DTS.
If it's one-time work, you can just query the Excel tables and insert the data into the existing table,
or use Import and Export Data.

|||

They want to append the data, but this is the problem I'm having now...

The first row of the spreadsheet matches the column headings in the table, but when I go to import the spreadsheet, it shows me THREE separate tables or views on the Select Source Tables/Views screen. I don't know why this is if I'm importing from ONE spreadsheet... unless the spreadsheet is formatted incorrectly. I went ahead and selected the first one, set my destination table, and when I run it I get an error: Violation of PRIMARY KEY constraint 'PK_Faultresolutions'. Cannot insert duplicate key in FaultResolutions.

Maybe this is something simple... but can anyone help?
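One quick check, assuming the spreadsheet data is staged in a table first (the names below are hypothetical), is to list the keys that occur more than once before attempting the insert:

```sql
-- Any key returned here would violate the primary key on the destination table.
SELECT KeyColumn, COUNT(*) AS Occurrences
FROM ExcelStaging
GROUP BY KeyColumn
HAVING COUNT(*) > 1;
```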

Importing from dBase

I'm trying to import a dBase database to SQL Server, but it contains Swedish characters (å, ä, ö), and these are replaced with other characters.

I've tried importing from dBase 5, III, and IV, without luck. I don't know how to solve this problem. Is there any way to replace characters using T-SQL, or do I need to change some settings?
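If the garbled values are consistent, the T-SQL replace approach asked about could look like the sketch below; the table name and the specific wrong characters are pure assumptions, since the actual substitutions depend on the code page used during import:

```sql
-- Hypothetical mapping of mangled characters back to å, ä, ö.
-- Inspect the imported data first to find the real substitutions.
UPDATE ImportedTable
SET Name = REPLACE(REPLACE(REPLACE(Name, 'Õ', 'å'), 'õ', 'ä'), '÷', 'ö');
```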

I hope someone can help me with this...

|||

Have you tried using a Swedish collation on SQL Server?
It will also depend on how you are doing the import. It may be losing the data on the extract from dBase or on the import into SQL Server.

|||

Thanks for your reply!

I tried changing the collation using this:

alter database TEST COLLATE Finnish_Swedish_CS_AS

It didn't work. I got an error message saying:

"Incorrect syntax near 'Finnish_Swedish_CS_AS'."

I've used the syntax described in Books Online.

*confused*

|||

I just learned that collation can only be changed in SQL Server 2000, so I'll solve it another way. Thanks anyway.
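For reference, the statement the poster tried is valid syntax from SQL Server 2000 onward (the syntax error suggests an older server), and a collation can also be applied per expression without altering the database. The table and column names below are hypothetical:

```sql
-- Database-level change (SQL Server 2000+; requires exclusive access):
ALTER DATABASE TEST COLLATE Finnish_Swedish_CS_AS;

-- Per-query alternative that leaves the database collation alone:
SELECT Name
FROM Customers
ORDER BY Name COLLATE Finnish_Swedish_CS_AS;
```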

Monday, March 19, 2012

Importing fixed field length files

Hi,
I'm importing data from a file and saving it into a table, nothing complex.
The file contains fixed-length fields, and the table columns are integers (default
NULL). When a value in the file is NULL, it is simply a series of spaces
indicating no value.
When EM parses the spaces, instead of inserting NULL into the table (because
it didn't find a digit), it complains about incompatible types.
Is the parsing engine really so stupid that it cannot convert spaces to a
single NULL when it knows it should be looking for numbers?!
How can I make this work?
Joe
Does no one know how to do the conversion?
"JoeB" <joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote in message
news:e$f0MfchEHA.3944@.tk2msftngp13.phx.gbl...
> Hi,
> Im importing data from a file, and saving it into a table, nothing
> complex.
> The file contains fixed length fields, and the table is integers ( default
> NULL ). When a value in the file is NULL, it is simply a series of spaces
> indicating no value.
> When EM parses the spaces, instead of inserting NULL into the table
> (because it didn't find a digit), it complains about incompatable types?
> Is the parsing engine really so stupid that it cannot convert spaces to a
> single NULL when it knows it should be looking for numbers?!
>
> How can i make this work?
>
> Jeo
>
>
|||Hiya,
There are two ways I can think of doing this: the quick and dirty way and
the complicated clever way.
1. Import the integer into a text field and then use an update
query to populate the integer field. You can then use a combination
of TRIM and CASE statements to get the values you want. It'll work,
but it's not ideal.
2. I assume you're using DTS to import the data. In that case
you can change the transformation for this column from a simple copy
column to an ActiveX script. You can then write some VB script to
change the value to NULL if it finds all spaces.
If it were me, it would depend on whether this is a one-off or a
regular job. For a one-off I would do option 1; if I'm going to have
to do this regularly then I would take the time and do option 2.
Darious
On Tue, 24 Aug 2004 11:12:19 +0100, "JoeB"
<joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote:

>Does no-one know how to do the conversion?
>
>"JoeB" <joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote in message
>news:e$f0MfchEHA.3944@.tk2msftngp13.phx.gbl...
>
|||Cheers, I'm going to try option one as I cannot write VB code.
Thanks.
Joe
"Darious" <news@.areyoukidding.com> wrote in message
news:q2dni0haq1pemlcbvmnpbmg37js7q5a5em@.4ax.com...
> Hiya,
> There are two ways I can think of doing this, the quick dirty way and
> the complicated clever way,
> 1. Import the integer into a text field and then use an update
> query to populate the interger field. You can then use a combination
> of trim and case statements to get the values you want. It'll work,
> but not ideal.
> 2. I assume you're using DTS to import the data. In which case
> you can change the transformation the this column from a simple copy
> column to a an activeX script. You can then create some VB script to
> change the value to null if it finds all spaces.
> If it was me then it would depend on whether this is a one off or a
> regular job. For a one off I would do option 1, if I'm going to have
> to do this regularly then I would take the time and do option 2.
> Darious
> On Tue, 24 Aug 2004 11:12:19 +0100, "JoeB"
> <joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote:
>
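Option 1 from Darious's reply can be sketched in T-SQL (the staging table and column names are hypothetical):

```sql
-- The fixed-width field was imported into a varchar column (RawValue).
-- All-spaces values trim down to '' and NULLIF turns them into NULL before the cast.
UPDATE Staging
SET IntValue = CAST(NULLIF(LTRIM(RTRIM(RawValue)), '') AS int);
```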

Importing Excel Data

Here is the scenario: I have an excel spreadsheet that contains 182 columns, and I need to move this data into a semi-normalized database for reporting. The SQL Server database schema has 11 tables. Some of the tables are going to use identity columns for their PK, other tables are using a value that comes from this spreadsheet for their PK values.

Anyway, I have never done a DTS package of any significance before, and know I most likely need to write some VBScript to handle sticking data into the proper data tables, etc.

I am just hoping someone can point me at a good resource, give me an alternative means of doing this (this is a process that will need to happen whenever a new Excel spreadsheet is dropped into a folder or on a schedule, either one). I would love to write some C# code to handle these things, but a DTS package would probably be the best, I just don't know where to start.

Thanks,

|||

Here's a generic Microsoft KB article:

http://support.microsoft.com/default.aspx?scid=kb;en-us;Q321686

|||

Darrell, thank you for taking the time to respond; I really appreciate it. I will have a look. Thanks!

Monday, March 12, 2012

Importing Dates using derived column transformation

Hi All

I've got a flatfile which contains a column SALEDATE with this data

20070802

''

20070804

''

20070805

20070901

I've got a table with a column SALEDATE datetime.

I use a derived column with this expression

SALEDATE <add as new column> (DT_DBDATE)(SUBSTRING(SaleDate,1,4) + "-" + SUBSTRING(SaleDate,5,2) + "-" + SUBSTRING(SaleDate,7,2)) database date [DT_DBDATE]

This expression fails whenever it comes across a blank field.

How can I evaluate my expression to insert a null if it comes across a blank field?

Thank You

SaleDate == "" || ISNULL(SaleDate) ? NULL(DT_DBDATE) : (DT_DBDATE)(SUBSTRING(SaleDate,1,4) + "-" + SUBSTRING(SaleDate,5,2) + "-" + SUBSTRING(SaleDate,7,2))

|||

Phil, thank you for replying.

There are some records that don't have anything at all for SALEDATE:

20070801

''

20070802

20070901

20070902

The package fails at row 627, when it comes across the empty field.

How should i modify the expression?

|||The package doesn't fail because of the expression -- check your logs -- it likely fails because of an invalid row (missing delimiter?). SSIS expects the metadata to be the same across the whole file.

|||

You're right.

627 is the last row in the flat file. It fails on that row.

I checked the file; the last record contains the date 20070831.

We get flat files in ragged-right format.

There are no text qualifiers and Header row delimiter is set to {CR}{LF}

Here is the error log:

[Transform Columns [1]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "component "Transform Columns" (1)" failed because error code 0xC0049063 occurred, and the error row disposition on "output column "SALEDATE" (4334)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

[DTS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Transform Columns" (1) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.

[DTS.Pipeline] Error: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0209029. There may be error messages posted before this with more information on why the thread has exited.

I'll try different options and see what I can come up with.

Thanks for your help.

|||Sounds like a short row that is missing its row terminator.|||

I can't seem to get any rows into a table.

Still getting the same error.

I tried changing the (DT_DBDATE) to (DT_STR) but it turns red.

OnSaleDate == "" || ISNULL(OnSaleDate) ? NULL(DT_DBDATE) : (DT_DBDATE)(SUBSTRING(OnSaleDate,1,4) + "/" + SUBSTRING(OnSaleDate,5,2) + "/" + SUBSTRING(OnSaleDate,7,2))

Did anyone come across this problem?

|||Please share all of your errors.

There's nothing wrong with that expression, syntactically.|||

I had the same problem, and I fixed it by cleaning the garbage data in the column that caused the error. For example, I expected a number and got a "-" character.

Good luck

|||

Here is how I solved it:

LEN(TRIM(OnSaleDate)) > 1 ? (DT_DBDATE)(SUBSTRING(OnSaleDate,1,4) + "-" + SUBSTRING(OnSaleDate,5,2) + "-" + SUBSTRING(OnSaleDate,7,2)) : NULL(DT_DATE)


importing data to new database

hi,

I have two databases (DT1 and DT2). Both contain one table, customers, with almost identical structure; the only difference is that firstname and lastname are char(30) in DT1 and varchar(50) in DT2. I've created both tables already. I tried using SQL Server 2005 Management Studio to do an 'import data' and it worked: all the information in DT1 was copied to DT2.

Now I have to repeat the process for many databases, and it would be too tedious to do it one by one. My question is: is there a script I can use so that I could make a small application to automate the process?

thanks in advance.

bob

hi bob,

You can save the script generated by the wizard as an SSIS or DTS package.

You can then automate it from SSIS or DTS.

regards,

|||

In addition to Joyea's suggestion of SSIS, which is very extensible, you can also use three-part notation to access the data from another database and write a stored procedure for this:

Select * From Database.Schema.Objectname
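Applied to the databases in this thread, that could look like the following (the dbo schema and the exact column list are assumptions):

```sql
-- Copy rows across databases on the same server using three-part names;
-- the char(30)/varchar(50) difference is handled by implicit conversion.
INSERT INTO DT2.dbo.customers (firstname, lastname)
SELECT firstname, lastname
FROM DT1.dbo.customers;
```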

HTH, Jens Suessmeyer.

http://www.sqlserver2005.de

|||

thanks for your suggestions.

Bobby

Friday, March 9, 2012

Importing Data from XML file to tables.

hi All ,

I have two tables, Table1([ColA], [ColB]) and Table2([Col1], [Col2]), in my DB.

And I have a single XML file which contains the data for the two tables separately, as <Table1> and <Table2> elements.

Now I have to import this data from the XML into both tables in one shot. Please suggest a possible way to get this done.

Thanks in advance for the help.

You can try this proc:

CREATE PROCEDURE dbo.SaveXML
(
    @XMLDoc text
)
AS
DECLARE @idoc int
DECLARE @SQLStatement as nvarchar(4000)
SET NOCOUNT ON

EXEC sp_xml_preparedocument @idoc OUTPUT, @XMLDoc

DECLARE @cTableName varchar(50)
DECLARE @cXMLNode varchar(50)

DECLARE TableNamesCursor CURSOR FOR
SELECT TableName
FROM (
    SELECT 'Table1' AS TableName UNION ALL
    SELECT 'Table2' AS TableName
) TableNames

OPEN TableNamesCursor

FETCH NEXT FROM TableNamesCursor
INTO @cTableName

SET @SQLStatement = ''

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cXMLNode = '/' + @cTableName

    SET @SQLStatement = @SQLStatement + N' SELECT * INTO #tmp' + @cTableName + ' FROM OPENXML(@idoc,''' + @cXMLNode + ''', 2)
    WITH ' + @cTableName + ';
    INSERT INTO ' + @cTableName + ' SELECT * FROM #tmp' + @cTableName + ''

    FETCH NEXT FROM TableNamesCursor
    INTO @cTableName
END

CLOSE TableNamesCursor
DEALLOCATE TableNamesCursor

EXECUTE sp_executesql @SQLStatement, N'@XMLDoc text,@idoc int', @XMLDoc, @idoc

EXEC sp_xml_removedocument @idoc

SET NOCOUNT OFF
GO

|||

Thanks for the quick reply, Rhamille.

In your proc you are passing the XML as a parameter.

But my problem is that I have the XML file saved at c:\MyXML.xml.

How can I retrieve the required XML contents from this saved file?

Also, the table names in the DB are Table1 and Table2, and the format of the XML file is something like this:

<Report .......... >
  <table1>
    <Data>
      <Record col1="A" col2="B" />
      <Record col1="C" col2="D" />
      <Record col1="E" col2="F" />
    </Data>
  </table1>
  <table2>
    <Data>
      <Record col1="A" col2="B" />
      <Record col1="C" col2="D" />
      <Record col1="E" col2="F" />
    </Data>
  </table2>
</Report>

Please suggest the necessary changes to the SP so that this can be achieved.

|||

Here it is... (I used your XML as the sample XML file.)

Code Snippet

Create Table #Content
(
    Lines Nvarchar(4000)
);

Insert Into #Content
Exec master..xp_cmdshell 'Type C:\data.xml' --Change the XML File Path

Declare @xml as NVarchar(4000);
Declare @idoc as Int;

Set @xml = ''
Select @xml = @xml + Ltrim(Rtrim(Lines)) From #Content Where Lines is NOT NULL
Select @xml

EXEC sp_xml_preparedocument @idoc OUTPUT, @xml

--First table
Select *
From OpenXML(@idoc, '/Report/table1/Data/Record', 1)
With
(
    col1 varchar(100),
    col2 varchar(100)
);

--Second table
Select *
From OpenXML(@idoc, '/Report/table2/Data/Record', 1)
With
(
    col1 varchar(100),
    col2 varchar(100)
);

EXEC sp_xml_removedocument @idoc

|||Here's the updated SP from my previous post:

CREATE PROCEDURE dbo.SaveXML
(
    @xmlPath nvarchar(100)
)
AS

DECLARE @idoc int
DECLARE @SQLStatement as nvarchar(4000)
DECLARE @XMLDoc nvarchar(4000)

SET NOCOUNT ON

CREATE TABLE #XMLTemp (rowID int IDENTITY, xmlRow nvarchar(4000))
DECLARE @cmd nvarchar(100)
SET @cmd = N'TYPE ' + CONVERT(nvarchar, @xmlPath)
INSERT #XMLTemp EXEC master.dbo.xp_cmdshell @cmd
SELECT @XMLDoc =
    CASE WHEN rowID = 1 THEN
        ISNULL(RTRIM(xmlRow), '')
    ELSE
        @XMLDoc + ISNULL(RTRIM(xmlRow), '')
    END
FROM #XMLTemp
WHERE xmlRow IS NOT NULL
ORDER BY rowID ASC

DROP TABLE #XMLTemp

EXEC sp_xml_preparedocument @idoc OUTPUT, @XMLDoc

DECLARE @cTableName varchar(50)
DECLARE @cXMLNode varchar(50)

DECLARE TableNamesCursor CURSOR FOR
SELECT DestinationTables
FROM (
    SELECT 'Table1' AS DestinationTables UNION ALL
    SELECT 'Table2' AS DestinationTables
) XMLDumpTables

OPEN TableNamesCursor

FETCH NEXT FROM TableNamesCursor
INTO @cTableName

SET @SQLStatement = ''

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cXMLNode = '/Report/' + @cTableName + '/Data/Record'

    SET @SQLStatement = @SQLStatement + N' SELECT * INTO #tmp' + @cTableName + ' FROM OPENXML(@idoc,''' + @cXMLNode + ''', 1)
    WITH ' + @cTableName + ';
    INSERT INTO ' + @cTableName + ' SELECT * FROM #tmp' + @cTableName + ''

    FETCH NEXT FROM TableNamesCursor
    INTO @cTableName
END

CLOSE TableNamesCursor
DEALLOCATE TableNamesCursor

EXECUTE sp_executesql @SQLStatement, N'@XMLDoc text,@idoc int', @XMLDoc, @idoc

EXEC sp_xml_removedocument @idoc

SET NOCOUNT OFF
GO

-- To use: EXEC dbo.SaveXML 'C:\data.xml'

|||

If you're using SQL Server 2005

Code Snippet

CREATE PROCEDURE dbo.LoadXML
AS
SET NOCOUNT ON

DECLARE @xdoc xml

SELECT @xdoc = bulkcolumn
FROM OPENROWSET(BULK 'c:\myxml.xml', SINGLE_BLOB) AS xdata --Your xml input file

INSERT INTO Table1
SELECT t1.row.value('@col1', 'char') AS ColA, --Change 'char' to your datatype for each column
       t1.row.value('@col2', 'char') AS ColB
FROM @xdoc.nodes('/Report/table1/Data/Record') AS t1(row)

INSERT INTO Table2
SELECT t2.row.value('@col1', 'char') AS Col1,
       t2.row.value('@col2', 'char') AS Col2
FROM @xdoc.nodes('/Report/table2/Data/Record') AS t2(row)

|||

Thanks Rhamille and Manivannan (Mani) for your guidance.

Mission accomplished!

|||Thanks DaleJ for a new idea... I'll try this one too.

Wednesday, March 7, 2012

Importing Data from Excel to update existing fields

I have an Excel file that contains column A with names of components and products, followed by column B, which has each item's quantity on hand. I want to import that data to our website's SQL database, which has a products table with a column, Pf_ID, that has only product names (not component names), and In_Stock, which contains outdated information that I want updated from column B of the Excel file.

I think I've figured out how to use DTS to update the two fields, but I'm afraid that when everything runs, new entries will be created with component information. Is it possible to specify that only rows where Pf_ID matches some row in column A are touched, using that row's column B to update In_Stock? I may have made this more confusing than it needs to be, but I don't have much experience with EM or Excel.

I'm also considering writing a macro that matches Pf_IDs in an exported Excel file of the products table, pulls the rows with current quantity information into a new Excel file, and imports that into the website's database.

Please help, this is getting really confusing.I would bcp the data into a staging table, then use sql to do whatever you have to do.

But that's just me

MOO|||I'm not sure what bcp stands for... I've only seen DTS. I think your idea of using a staging table to query data from sounds good, but I want this to be as simple a process as possible: something that relatively anyone with instructions could run routinely by clicking a few buttons.|||Use DTS to bring the Excel data into a staging table
(a table created just to hold data in the middle of a complex import),
then use an UPDATE statement against the staged data to update the destination data
(use the Transact-SQL task).

For a more straightforward (but more advanced) method, you might consider using the Data Driven Query task, which would let you eliminate the staging table.

Once you get this running you can set it up as a batch file or script, or even call it from a stored procedure, so anyone can run it.|||I've called tech support and checked permissions on the product table, but neither of us can seem to get this query to work when importing data from an Excel workbook, Peach$. Is there something wrong with my query?

Here's what the support tech said when he tried:

It doesn't appear to be working running as SA either. I get an error on line 3, near ')', and then the db is read-only.

I cleared active connections to the DB and even stopped access, but it still appears to be locked by something, or perhaps the table is protected somehow?

--Raj

Here is the actual query:

update ASI.dbo.Product
set ASI.dbo.Product.In_Stock = (select `Peach$`.`Quantity On Hand`
    from `Peach$`
    where ASI.dbo.Product.Pf_ID = `Peach$`.`Item ID`)
where exists
    (select `Peach$`.`Quantity On Hand`
     from `Peach$`
     where ASI.dbo.Product.Pf_ID = `Peach$`.`Item ID`);

Sorry for the lag time with a reply. I hope this gets read now.
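
A note on the query above, in case it helps: T-SQL doesn't use backticks to quote identifiers (that's MySQL/Jet syntax), which could explain an error near ')'. Square brackets are the T-SQL form. Here is a sketch of the same update, assuming the Peach$ worksheet is read via OPENROWSET with the Jet provider; the workbook path C:\Peach.xls is just a placeholder:

```sql
-- bracket-quoted rewrite of the backtick query (untested sketch)
UPDATE p
SET In_Stock = x.[Quantity On Hand]
FROM ASI.dbo.Product AS p
JOIN OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Peach.xls',
                'SELECT * FROM [Peach$]') AS x
    ON p.Pf_ID = x.[Item ID]
```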

Friday, February 24, 2012

Importing CSV data into SQL database using DTS

I would like to import CSV data into a SQL database using DTS. The
SQL database already contains the information, but the CSV file
contains updated information.

data.csv
Computer Name | Serial Number | Processors | Processor Description | Total Memory
computer 168  | 123456789     | 4          | Intel 3.06 GHz        | 3048 MB

SQL Table
ComputerName | SerialNumber | ProcNum | ProcessorType | Memory
Computer 168 | 123456789    | 2       | Intel 2.8 GHz | 2048 MB
As you can see, the memory, processor, and other data were updated in
the CSV, so I need my import to:
- Find the computer name in the SQL database table (even though the field
names are different), and then update the SerialNumber, ProcNum,
ProcessorType, and Memory information.
- Make sure that if the asset name can't be found, a new row is created
with the new information. For example, if computer 218 is not in the SQL
database, the import should add it.
Hi Wildatom
"Wildatom" wrote:

> I would like to import CSV data into a SQL data base using DTS. This
> SQL database already contains the information however the CSV file
> contains updated information.
> data.csv
> Computer Name | Serial Number | Processors | Processor Description |
> Total Memory
> computer 168 123456789 4 Intel 3.06 Ghz
> 3048 MB
> SQL Table
> ComputerName | SerialNumber | ProcNum | ProcessorType | Memory
> Computer 168 123456789 2 Intel 2.8 ghz
> 2048 MB
>
> As you can see the memory and processor and other data were updated in
> the CSV which I need my import to:
> - Find the computer name in the SQL database table (even though the field
> names are different), and then update the SerialNumber, ProcNum,
> ProcessorType, Memory information.
You need something called a staging table to load this data into. You can
then update the existing records with the values from the staging table when
they match on a (primary) key value ( ComputerName )
UPDATE l
SET SerialNumber = s.SerialNumber,
    ProcNum = s.ProcNum,
    ProcessorType = s.ProcessorType,
    Memory = s.Memory
FROM livetable l
JOIN stagingtable s ON s.ComputerName = l.ComputerName
WHERE l.SerialNumber <> s.SerialNumber
OR l.ProcNum <> s.ProcNum
OR l.ProcessorType <> s.ProcessorType
OR l.Memory <> s.Memory

> - I also want to be sure if the asset name can't be found then it
> should create a new row with the new information. Example if computer
> 218 is not in the SQL database it will populate the database with the
> information.
>
Once you have updated the values you can insert anything that is not already
in the live table
INSERT livetable ( ComputerName, SerialNumber, ProcNum, ProcessorType,
Memory )
SELECT s.ComputerName, s.SerialNumber, s.ProcNum, s.ProcessorType, s.Memory
FROM stagingtable s
LEFT JOIN livetable l ON s.ComputerName = l.ComputerName
WHERE l.ComputerName IS NULL
Alternatively you could use NOT EXISTS!
John
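
John mentions NOT EXISTS as an alternative for the insert step; a sketch of that form, using the same livetable and stagingtable names as above:

```sql
-- insert only staged rows whose ComputerName is absent from the live table
INSERT livetable ( ComputerName, SerialNumber, ProcNum, ProcessorType, Memory )
SELECT s.ComputerName, s.SerialNumber, s.ProcNum, s.ProcessorType, s.Memory
FROM stagingtable s
WHERE NOT EXISTS (
    SELECT 1
    FROM livetable l
    WHERE l.ComputerName = s.ComputerName
)
```

Either form gives the same result here; NOT EXISTS is often considered the clearer statement of intent, while the LEFT JOIN ... IS NULL version lets you inspect the unmatched rows with a plain SELECT first.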
