Friday, February 24, 2012

importing data

how to import data into sqlserver2000 from oracle 7? please
give me a solution
You can still use DTS to bring it across using ODBC.
"rejurajeev" <rejurajeev@.yahoo.co.uk> wrote in message
news:132ca01c4125c$4532e0f0$a401280a@.phx
.gbl...
> how to import data into sqlserver2000 from oracle 7.please
> give me a solution
|||This will give you all that you need to migrate from Oracle to SQL Server.
Chapter 7 - Migrating Oracle Databases to SQL Server 2000
http://www.microsoft.com/resources/.../reskit/en-us/part2/c0761.mspx
Also, if you are interested in knowing the differences between SQL and
Oracle, visit this link.
http://www.microsoft.com/technet/pr...y/sqlorpro.mspx
Sanchan [MSFT]
sanchans@.online.microsoft.com
This posting is provided "AS IS" with no warranties, and confers no rights.

Importing data

Hi Guys
Can someone give me some directions on importing data from another server
e.g.
INSERT INTO SQL2005.MyDB1.dbo.PostCode --this is SQL 2005
( Suburb, PostCode, State)
SELECT Suburb, PostCode, State
FROM SQL2000.MyDB2.dbo.PostCode --this is SQL 2000
Thanks heaps...Books Online not too helpful
The easiest way I've seen is to use the Import wizard. In the SQL Server
Management Studio Object Explorer, right-click on the target (or source)
database and select All Tasks-->Import Data (or Export Data).
Hope this helps.
Dan Guzman
SQL Server MVP
"Harry Strybos" <harry_NOSPAM@.ffapaysmart.com.au> wrote in message
news:3wIRg.14671$b6.160428@.nasal.pacific.net.au...
> Hi Guys
> Can someone give me some directions on importing data from another server
> e.g.
> INSERT INTO SQL2005.MyDB1.dbo.PostCode --this is SQL 2005
> ( Suburb, PostCode, State)
> SELECT Suburb, PostCode, State
> FROM SQL2000.MyDB2.dbo.PostCode --this is SQL 2000
> Thanks heaps...BooksOnLine not too helpful
>|||Harry
Have you created a linked server to SQL Server 2000 ?
"Harry Strybos" <harry_NOSPAM@.ffapaysmart.com.au> wrote in message
news:3wIRg.14671$b6.160428@.nasal.pacific.net.au...
> Hi Guys
> Can someone give me some directions on importing data from another server
> e.g.
> INSERT INTO SQL2005.MyDB1.dbo.PostCode --this is SQL 2005
> ( Suburb, PostCode, State)
> SELECT Suburb, PostCode, State
> FROM SQL2000.MyDB2.dbo.PostCode --this is SQL 2000
> Thanks heaps...BooksOnLine not too helpful
>
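For reference, a minimal sketch of the linked-server route Uri is hinting at (the linked server name SQL2000 and the login mapping are assumptions for the example, not taken from the thread):

-- Run on the SQL 2005 instance: register the SQL 2000 server as a linked server
EXEC sp_addlinkedserver
    @server = N'SQL2000',          -- network name of the remote server, used in four-part queries
    @srvproduct = N'SQL Server'    -- for another SQL Server instance the product name is enough

-- Map the current login to the remote server (self-mapping shown here)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'SQL2000',
    @useself = N'true'

-- The four-part name in Harry's query then works from the 2005 side
INSERT INTO MyDB1.dbo.PostCode (Suburb, PostCode, State)
SELECT Suburb, PostCode, State
FROM SQL2000.MyDB2.dbo.PostCode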


Importing data

Hello everybody,
I need some help with importing data from txt to sql.
I have done that with BULK INSERT, but I ran into a limitation:
- I can't suppress (skip) any field in the source or destination.
Can anyone help me?
Thanks
Leandro Loureiro dos Santos
Leandro,
> I need some help with importing data from txt to sql.
> I have done that with BULK INSERT, but i had some limitations:
> -i cant supress any field in source or destination.
You can do this with a bcp format file:
BULK INSERT ...
WITH (FORMATFILE = 'format_file_path')
Linda
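As a concrete illustration of Linda's suggestion (a sketch only; the three-field layout, names and paths below are invented, not taken from Leandro's post), a non-XML format file suppresses a source field by mapping it to server column 0:

8.0
3
1   SQLCHAR   0   100   "\t"     1   col_a    SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   100   "\t"     0   skipped  ""
3   SQLCHAR   0   100   "\r\n"   2   col_b    SQL_Latin1_General_CP1_CI_AS

BULK INSERT mydb.dbo.mytable
FROM 'c:\data\myfile.txt'
WITH (FORMATFILE = 'c:\data\myfile.fmt')

The middle field is read from the file but discarded because its server column order is 0.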

Importing data

I think this is the proper forum for this problem.

I have tried to import data from Excel 2003 into SQL Server. I have tried with 2005, 2000 and MSDE. I couldn't get this to work, so I imported a table with 1946 rows into Access and then into SQL. After getting past the field problems, the import engine finally successfully completed, telling me it imported all 1946 rows. But the first time the table only had 1 row, and the second time only 2 rows.

What's the problem? Either the SQL import didn't import all the rows and only thinks it did, or the data went somewhere else. Can anyone shed any light on this problem? I am desperate at this point and can't figure out how to get the data from A to B to C.

Thank you for any help.

http://support.microsoft.com/Default.aspx?kbid=321686
|||

First of all, I did not appreciate the above comment. I don't know if this person is just trying to be clever or what.

Even though I have been programming, and have been in this business, for a long time, I consider myself to be a beginner when it comes to VB 2005 and SQL Server 2005, and therefore there may have been a reported bug in 2005 that I'm not aware of, or a more technical problem with importing data.

My workaround for this was to import from either Excel or Access and let SQL Server create a new table, and then write an Insert SQL command to transfer the data, and this worked just fine. However, I would still like the answer to the problem if someone knows what it is.

|||

this -> "Read the 9th amendment. . . you moron!"

that's to the moron that is acting as our president. . . not you. sorry for the confusion!

the link is how to directly import into SQL server from excel using openrowset
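For anyone following the link later, the OPENROWSET technique from that KB article looks roughly like this (sheet name, path and target table are placeholders, not from this thread):

SELECT *
INTO dbo.ImportedFromExcel
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\data\book1.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]')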

|||

Thank you for the clarification and the information.

|||

you gotta love the internet! total loss of context!

cheers!

importing data

we are trying to import data from a flat file using an uptick (`) as a column separator and {CR/LF} as a record terminator. There is a variable number of columns for each record. The initial record in the flat file has 3 columns. Upon processing this record, the import sets all records to 3 columns and does not read the column separators past the second column (even though there may be up to 7 columns in the record).

This method worked ok in DTS2000 and it works with Excel. Any suggestions?

Thank You

You'll need to set up as many columns as you can have in the data. It's tricky because SSIS doesn't handle variable columns very well without implementing some workarounds. (Read the entire row in as one field, then split the field up later using substrings, or other solutions.)

Take a look at this forum using the search feature. This has been discussed quite a bit recently.
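A minimal T-SQL sketch of the read-the-whole-row workaround (staging table, file path and column names are invented for the example): load each line into one wide column, then carve off the uptick-delimited fields.

CREATE TABLE dbo.RawLines (line varchar(8000))

-- Assuming the file contains no tab characters, the default field terminator
-- never fires, so each whole line lands in the single column
BULK INSERT dbo.RawLines
FROM 'c:\data\feed.txt'
WITH (ROWTERMINATOR = '\n')

-- First field: everything up to the first uptick (rows with no uptick return the whole line)
SELECT LEFT(line, CHARINDEX('`', line + '`') - 1) AS col1
FROM dbo.RawLines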

|||You may try DataDefractor. It handles variable columns quite nicely.

Regards,
Ivan

Ivan Peev | http://www.cozyroc.com/



Importing CVS to SQLServer 2005 in a web application

Importing CVS to SQLServer 2005 in a web application:

I need to import CSV file to a table. This is a .Net 2.0 web application running in the WebServer which has .Net 2.0 installed. SQL Server 2005 is a database in another machine. What are my options? Would SSIS work in WebServer if no database installed in that machine.

Thanks

>>.Importing CVS to SQLServer 2005 in a web application: << Do you mean CSV?

>>Would SSIS work in WebServer if no database installed in that machine.<< It would if the SQL client tools and SQL Server Management Studio are installed on the web server and you can log on to the web server.

>>What are my options? << You haven't specified if this is a one-off import or a regular job. One way of handling it would be to convert the csv to either a series of inserts to the table or calls to a stored procedure to do the job.
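A minimal sketch of the stored-procedure route (table, procedure and parameter names are invented for the example); the web application would call this once per CSV row:

CREATE PROCEDURE dbo.ImportCsvRow
    @Col1 varchar(100),
    @Col2 varchar(100)
AS
BEGIN
    SET NOCOUNT ON
    INSERT INTO dbo.ImportTarget (Col1, Col2)
    VALUES (@Col1, @Col2)
END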

Importing cube using XMLA

Hi,

We're trying to import cubes from SSAS 2005 into Oracle BI EE (Siebel Analytics) using XMLA.

I tried to use the XML for Analysis SDK 1.1 with the file msxisapi.dll, but it uses msolap.asp on the AS server and I don't get any response from it (per the log).

In this forum I got a reply to use msmdpump.dll in order to create HTTP access to SSAS, but it won't work with Siebel (shows an empty list of databases) because I must use the XMLA protocol.

Moreover, I don't think it's a security issue, because I have managed to access the cube via Excel.

I would be grateful for some help.

> In this forum I got a reply to use msmdpump.dll in order to create HTTP access to SSAS, but it won't work with Siebel (shows an empty list of databases) because I must use the XMLA protocol.

AS2005 always uses the XMLA protocol when you connect over HTTP (i.e. through msmdpump.dll). You may want to pursue this issue with Oracle (Siebel) to understand why you are getting an empty list of databases.

|||Hi. I'm experiencing the same problem that you described; do you have any solution?
Thanks.
mreyes@.itsolutions.com.gt

Importing csv to sql database

Hi All,

Here is my code
Dim strCon As String = System.Configuration.ConfigurationSettings.AppSettings("OwnerTrader")
con = New SqlConnection(strCon)
con.Open()
Dim strselect As String = ""
Try
    strselect = "INSERT INTO tbl_CSV(CSV_TIME,CSV_SIZE,CSV_LOCATION,CSV_COUNTRY,CSV_LAT,CSV_LON,CSV_COMMENTS)"
    strselect = strselect & " SELECT * FROM OPENROWSET('MSDASQL','Driver={Microsoft Text Driver (*.txt; *.csv)};DEFAULTDIR=C:\UploadFiles;Extensions=CSV;','SELECT * FROM TestNew.csv')"
    cmd = New SqlCommand(strselect, con)
    cmd.ExecuteNonQuery()
Finally
    con.Close()   ' always release the connection
End Try

i have defined connection string in web.config file and my csv file is inside C:\UploadFiles with the name TestNew.csv file.Can u please check it out the code?is it correct or not.I am getting this error

"SQL Server blocked access to STATEMENT 'OpenRowset/OpenDatasource' of component 'Ad Hoc Distributed Queries' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ad Hoc Distributed Queries' by using sp_configure. For more information about enabling 'Ad Hoc Distributed Queries', see "Surface Area Configuration" in SQL Server Books Online."

Regards

I've moved this to the SQL forum because this is not a FAQ. And the message you got isn't an error. It's simply a statement of what happened, why and how to fix it. Follow the instructions and you'll get past this message.

Jeff
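For completeness, the sp_configure sequence the message is pointing at (run it as a sysadmin):

sp_configure 'show advanced options', 1
RECONFIGURE
GO
sp_configure 'Ad Hoc Distributed Queries', 1
RECONFIGURE
GO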

Importing CSV files with SqlBulkCopy

I am trying to import a CSV file into an SQL Server table with the OleDbDataReader and SqlBulkCopy objects, like this:

using (OleDbConnection dconn = new OleDbConnection(
    "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\\mystuff\\;Extended Properties=\"text;HDR=No;FMT=Delimited\""))
{
    using (OleDbCommand dcmd = new OleDbCommand("select * from mytable.csv", dconn))
    {
        try
        {
            dconn.Open();
            using (OleDbDataReader dreader = dcmd.ExecuteReader())
            {
                try
                {
                    using (SqlConnection dconn2 = new SqlConnection(
                        @"data source=MyDBServer;initial catalog=MyDB;user id=mydbid;password=mydbpwd"))
                    {
                        using (SqlBulkCopy bc = new SqlBulkCopy(dconn2))
                        {
                            try
                            {
                                dconn2.Open();
                                bc.DestinationTableName = "dbo.mytable";
                                bc.WriteToServer(dreader);
                            }
                            finally
                            {
                                dconn2.Close();
                            }
                        }
                    }
                }
                finally
                {
                    dreader.Close();
                }
            }
        }
        finally
        {
            dconn.Close();
        }
    }
}

A couple of the columns for the destination table use a bit datatype. The CSV file uses the strings "1" and "0" to represent these.

When I run this code, it throws this exception:

Unhandled Exception: System.InvalidOperationException: The given value of type String from the data source cannot be converted to type bit of the specified target column. --> System.FormatException: Failed to convert parameter value from a String to a Boolean. --> System.FormatException: String was not recognized as a valid Boolean.
at System.Boolean.Parse(String value)
at System.String.System.IConvertible.ToBoolean(IFormatProvider provider)
at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
-- End of inner exception stack trace --
at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
-- End of inner exception stack trace --
at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
at MyClass.Main()

It appears not to accept "1" and "0" as valid strings to convert to booleans. The System.Convert.ToBoolean method appears to work the same way.

Is there any way to change this behavior? I discovered if you change the "1" to "true" and "0" to "false" in the CSV file it will accept them.

Hello my friend,

I believe this is due to the way that Microsoft Access uses numbers to represent bits. I think -1 is Yes and 0 is No. Try it with -1 and 0 instead of 1 and 0.

Kind regards

Scotty

|||I created a schema.ini file for the table and specified bit columns for it. Then it converted without any problems.
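A sketch of that schema.ini fix (file and column names are placeholders): a Col entry typed as Bit makes the text driver hand SqlBulkCopy a boolean instead of a string. The schema.ini file sits in the same folder as the CSV.

[mytable.csv]
Format=CSVDelimited
ColNameHeader=False
Col1=id Long
Col2=name Text
Col3=active Bit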

Importing CSV files

Hello,

I need to import a bunch of .csv files. The problem I am having is the
"non data" information in the files creating bogus rows and column
definitions. Here is an example of the csv file.

CBOT - End-of-Day Futures Bulk Download 2001.
2 Year U.S. Treasury Notes Futures

Date       Symbol   Month Code   Year Code   Open
20010103   ZT       H            2001        102.09375
20010104   ZT       H            2001        102.03125
20010105   ZT       H            2001        102.28125

In this case, there are bogues rows created with the text at the
beginning of the file, and also the column names get placed into a row
as well. My question is; how do you import the file and strip out the
"non-data" data? so that only the actual data gets inserted into the db?

Thanks,

TGru

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!
Write a dotnet or vb exe to loop thru your csv file and place valid
rows in a separate file, then run bcp using this new file.
|||Use BCP or BULK INSERT with the FIRSTROW (-F) option.

--
David Portas
SQL Server MVP
--|||tgru (tgru@.devdex.com) writes:
> I need to import a bunch of .csv files. The problem I am having is the
> "non data" information in the files creating bogus rows and column
> definitions. Here is an example of the csv file.
> CBOT - End-of-Day Futures Bulk Download 2001.
> 2 Year U.S. Treasury Notes Futures
> Date Symbol Month Code Year Code Open
> 20010103 ZT H 2001 102.09375
> 20010104 ZT H 2001 102.03125
> 20010105 ZT H 2001 102.28125

That doesn't look like a CSV file to me...

> In this case, there are bogues rows created with the text at the
> beginning of the file, and also the column names get placed into a row
> as well. My question is; how do you import the file and strip out the
> "non-data" data? so that only the actual data gets inserted into the db?

Was it only the column headers, you could use -F 2 with BCP to specify
that the BCP is to start with the second record. (I believe -F is the
option. Double-check with Books Online.) But the introducing text is
more difficult to handle. Assuming that there are no tabs in the text,
BCP may consider the row as an error. Then again, you can control how
many errors BCP may accept, so if you only can make sure that you get
in sync, it may be possible.

However, hharry's suggestion that you write a program that strips the
header info, is probably a safer way.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
|||"tgru" <tgru@.devdex.com> wrote in message news:421f97a6$1_2@.127.0.0.1...
> Hello,
> I need to import a bunch of .csv files. The problem I am having is the
> "non data" information in the files creating bogus rows and column
> definitions. Here is an example of the csv file.
> CBOT - End-of-Day Futures Bulk Download 2001.
> 2 Year U.S. Treasury Notes Futures
> Date Symbol Month Code Year Code Open
> 20010103 ZT H 2001 102.09375
> 20010104 ZT H 2001 102.03125
> 20010105 ZT H 2001 102.28125
>
> In this case, there are bogues rows created with the text at the
> beginning of the file, and also the column names get placed into a row
> as well. My question is; how do you import the file and strip out the
> "non-data" data? so that only the actual data gets inserted into the db?
> Thanks,
> TGru
> *** Sent via Developersdex http://www.developersdex.com ***
> Don't just participate in USENET...get rewarded for it!

Take a look at AWK (I use the GNU version GAWK).

This is an old, old text stream (file) processing utility that uses regular
expressions to let you filter and/or reformat data.

You could have it create a new import file that ignores blank lines and
headers (i.e. lines starting with "CBOT").
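Tying the suggestions together, a minimal sketch of the FIRSTROW approach (table name and path are placeholders; this assumes the real data starts on the fourth line and is tab-delimited):

BULK INSERT dbo.Futures
FROM 'c:\data\cbot.txt'
WITH (
    FIRSTROW = 4,              -- skip the two description lines and the column-header line
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
)

As Erland notes, this only helps if the introductory lines don't throw the column parsing out of sync first; pre-processing the file remains the safer route.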

Importing CSV file with URL using DTS?

I need to be able to create a DTS package that imports a CSV file which is located at a URL, i.e. HTTP://www.url.com/csv/thefile.xls. I tried copying the URL and pasting it in the file location in the SQL wizard but I got an error message.

Please point me anywhere that I can find some help in doing this. If there is a better way please let me know.

Thanks
I would post this in the SQL Server Integration Services forum instead of this one. Also, to get a quicker response, please detail the exact steps you took to hit the problem, as well as the exact error message you got.

Importing csv file to SQL Server

Can someone please help me.
I need to import a csv file to sql server and I know that the column delimiter is \t
and the newline delimiter is \n but I don't know what the rowterminator is or the fieldterminator. How can I import the file into an empty table in an existing database.
Any suggestions would be greatly appreciated.
Use DTS to import the file; it can handle it all, including creating a new empty table from the file.
You can do it directly in Enterprise Manager, select the database and use right click > All Tasks > Import Data
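If you'd rather script it than use DTS, those two delimiters map directly onto BULK INSERT options (table and path are placeholders):

BULK INSERT dbo.MyTable
FROM 'c:\data\myfile.csv'
WITH (
    FIELDTERMINATOR = '\t',   -- the column delimiter
    ROWTERMINATOR = '\n'      -- the newline (row) delimiter
)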

Importing csv File into SqlExpress .MDF file.

Hi,
I'm trying to use BCP to import a .csv file into a .mdf file.
At the commandline, I'm using the following command:
bcp commercedb.mdf.crmc_products in test.csv -T
I'm getting errors (below) telling me that I'm not successfully
connecting to the database.
Any help would be appreciated. If you can help me, please understand
I'm new to this, so limited geek speak would be most appreciated.
ERRORS:
SQLState = 08001, NativeError = 53
Error = [Microsoft][SQL Native Client]Named Pipes Provider: Could not open a connection to SQL Server [53].
SQLState = HYT00, NativeError = 0
Error = [Microsoft][SQL Native Client]Login timeout expired
SQLState = 08001, NativeError = 53
Error = [Microsoft][SQL Native Client]An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.
Data access libraries do not directly access (MDF) SQL Server database files
the same way they can an (MDB) MS Access file. All data access must go
through the SQL Server service itself.
The .mdf database file must first be attached to SQL Server.
http://msdn2.microsoft.com/en-us/library/ms165673.aspx
Once done, you bulk copy specifying the server and database name:
http://msdn2.microsoft.com/en-us/library/ms162802.aspx
<tom.herz@.gmail.com> wrote in message
news:1138982708.766097.92140@.f14g2000cwb.googlegroups.com...
> Hi,
> I'm trying to use BCP to import a .csv file into a .mdf file.
> At the commandline, I'm using the following command:
> bcp commercedb.mdf.crmc_products in test.csv -T
> I'm getting errors (below) telling me that I'm not successfully
> connecting to the database.
> Any help would be appreciated. If you can help me, please understand
> I'm new to this so limited g speak would be most appreciated.
> ERRORS:
> SQLState = 08001, NativeError = 53
> Error = [Microsoft][SQL Native Client]Named Pipes Provider: Could not
> open a con
> nection to SQL Server [53].
> SQLState = HYT00, NativeError = 0
> Error = [Microsoft][SQL Native Client]Login timeout expired
> SQLState = 08001, NativeError = 53
> Error = [Microsoft][SQL Native Client]An error has occurred while
> establishing a
> connection to the server. When connecting to SQL Server 2005, this
> failure may
> be caused by the fact that under the default settings SQL Server does
> not allow
> remote connections.
>|||Hi,
Thanks. This is what I needed. I hadn't attached the database.
I haven't been successful yet, but that is a lack of experience with
all the switches and BCP overall.
Once I got the database attached, things started to gel.
I really appreciate your time, JT.
Have a good day.
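For anyone landing here later, a minimal sketch of the attach-then-bcp sequence JT describes (the path, instance name and comma terminator are assumptions based on the post, not verified):

-- Attach the .mdf to the instance first
CREATE DATABASE commercedb
ON (FILENAME = 'C:\data\commercedb.mdf')
FOR ATTACH

Then bulk copy against the attached database from the command line:

bcp commercedb.dbo.crmc_products in test.csv -S .\SQLEXPRESS -T -c -t,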

Importing CSV file

I'm a newbie to SSIS, so there is probably an easy solution:

I'm importing from a csv file into a db table, but would like to remove the quotation marks from around the data before inserting into the table.

Are there any suggestions of how to remove the "data" marks? Would a Foreach Loop container work for this?

Set the text qualifier in the Flat File connection manager to "

Importing CSV data into SQL database using DTS

I would like to import CSV data into a SQL data base using DTS. This
SQL database already contains the information however the CSV file
contains updated information.
data.csv
Computer Name | Serial Number | Processors | Processor Description | Total Memory
computer 168  | 123456789     | 4          | Intel 3.06 Ghz        | 3048 MB
SQL Table
ComputerName | SerialNumber | ProcNum | ProcessorType | Memory
Computer 168 | 123456789    | 2       | Intel 2.8 ghz | 2048 MB
As you can see the memory and processor and other data were updated in
the CSV which I need my import to:
- Find the computer name in the SQL Database table (even though the field
names are different), and then update the SerialNumber, ProcNum,
ProcessorType, Memory information.
- I also want to be sure if the asset name can't be found then it
should create a new row with the new information. Example if computer
218 is not in the SQL database it will populate the database with the
information.
Hi Wildatom
"Wildatom" wrote:

> I would like to import CSV data into a SQL data base using DTS. This
> SQL database already contains the information however the CSV file
> contains updated information.
> data.csv
> Computer Name | Serial Number | Processors | Processor Description |
> Total Memory
> computer 168 123456789 4 Intel 3.06 Ghz
> 3048 MB
> SQL Table
> ComputerName | SerialNumber | ProcNum | ProcessorType | Memory
> Computer 168 123456789 2 Intel 2.8 ghz
> 2048 MB
>
> As you can see the memory and processor and other data were updated in
> the CSV which I need my import to:
> - Find the computer name in the SQL Database table (even know the field
> names are different), and then update the SerialNumber, ProcNum,
> ProcessorType, Memory information.
You need something called a staging table to load this data into. You can
then update the existing records with the values from the staging table when
they match on a (primary) key value ( ComputerName )
UPDATE l
SET SerialNumber = s.SerialNumber,
    ProcNum = s.ProcNum,
    ProcessorType = s.ProcessorType,
    Memory = s.Memory
FROM livetable l
JOIN stagingtable s ON s.ComputerName = l.ComputerName
WHERE l.SerialNumber <> s.SerialNumber
OR l.ProcNum <> s.ProcNum
OR l.ProcessorType <> s.ProcessorType
OR l.Memory <> s.Memory

> - I also want to be sure if the asset name can't be found then it
> should create a new row with the new information. Example if computer
> 218 is not in the SQL database it will populate the database with the
> information.
>
Once you have updated the values you can insert anything that is not already
in the live table
INSERT livetable ( ComputerName, SerialNumber, ProcNum, ProcessorType,
Memory )
SELECT s.ComputerName, s.SerialNumber, s.ProcNum, s.ProcessorType, s.Memory
FROM stagingtable s
LEFT JOIN livetable l ON s.ComputerName = l.ComputerName
WHERE l.ComputerName IS NULL
Alternatively you could use NOT EXISTS!
John
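For completeness, the NOT EXISTS variant John mentions would look like this (same hypothetical table names):

INSERT livetable ( ComputerName, SerialNumber, ProcNum, ProcessorType, Memory )
SELECT s.ComputerName, s.SerialNumber, s.ProcNum, s.ProcessorType, s.Memory
FROM stagingtable s
WHERE NOT EXISTS ( SELECT 1 FROM livetable l WHERE l.ComputerName = s.ComputerName )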


Importing CSV Data into a SQL Server 2005 Database

I need to import csv data into a SQL Server 2005 database using SQL Server Management Studio Express. I can't find any menu options for accomplishing this. Is this a limitation of the Express edition, or am I missing something when I try to find this feature?

Thanks for any help provided.

You are missing DTS, which comes with Service Pack 1; the thread below covers how to get it. Also, a CSV file is comma delimited, so SQL Server sees null values, meaning you have to import into a temp table before the destination if you have a primary key defined on the destination table. Hope this helps.

http://forums.asp.net/thread/1407898.aspx

|||

Caddre: Thanks for the info. I obtained Service Pack 1 and located DTS. As I run the wizard, it looks like it is set up primarily to transfer data between databases or tables, rather than importing from csv. The only way that I see for importing from csv is to write a sql query. Is there a more direct way using the Import/Export Wizard? Or will I have to write a SQL query?

|||

That cannot be; it is a smaller version of a billion-dollar ETL (extraction, transformation and loading) tool, and even the previous version can do it, so the link below shows you the steps with the 2000 version. It is not complicated; just choose a temp table first, or a table without a primary key. Hope this helps.

http://www.sqldts.com/default.aspx?276,4

|||

The screen capture images from the link you provided look very different from what I am seeing. I did download my version as described in the other thread and the filename/path matched. I captured some screen images that show what I am seeing. Here is the URL:

http://www.vagarden.com/importwizard.html

I can see how the program shown that you linked to would do the job for me, if I can get it.

Thanks for your help.

|||

I have been searching the Internet for a downloadable version of DTS without any luck.

Since it was not provided with the Express version of SQL Server 2005, I wonder if it is part of the full (Pro) version?

If not, does anyone know where DTS can be obtained?

If not, is it possible to write a sql query that would insert data to a MS SQL Server table from a csv file? If so, what might the query look like?

Thanks for any help that can be provided.

|||

Hi sorry,

There is a single query way with the OPENROWSET function but I was hoping to get you the DTS info so I forgot about this function. Try the link below for details. Hope this helps.

http://www.databasejournal.com/features/mssql/article.php/3584751
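The single-query approach from that article looks roughly like this (a sketch; the path and table are placeholders, and 'Ad Hoc Distributed Queries' must be enabled as discussed earlier on this page):

INSERT INTO dbo.MyTable
SELECT *
FROM OPENROWSET('MSDASQL',
                'Driver={Microsoft Text Driver (*.txt; *.csv)};DEFAULTDIR=C:\data;Extensions=CSV;',
                'SELECT * FROM mydata.csv')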

|||

Caddre, thanks for the info. This should work for my project.

Importing Crystal Reports into Reporting Services

I see that you can import Access reports into Reporting Services. Is that
an add-in or something that will allow me to import Crystal Reports reports
into Reporting Services?
Any help is greatly appreciated.
Not out of the box, but there are companies that develop tools for the same.
Google and you will find some tools.
Amarnath
"Greg Smith" wrote:
> I see that you can import Access reports into Reporting Services. Is that
> an add-in or something that will allow me to import Crystal Reports reports
> into Reporting Services?
> Any help is greatly appreciated.
>
>

Importing Comma Delimited Data.

Hello,
I am trying to import files (hundreds) with Comma Delimited Data; because
some columns within the data contain commas themselves, those fields have
been double quoted.
Small example.
1234, hello John, mega
2345, " Hello Andrew, Marie", tall
I have tried to use Bulk insert with a Formatfile in xml and, suggested by
Uri Dimant, a formatted file.
But with the bulk insert, the field with the comma breaks into two separate
fields, and this sometimes generates
an error because the fields are not in sync.
With the formatted file the lines without quotes do not work correctly. I
tried to read the help files on this, but could not find a solution to the
problem.
In total it's about millions of rows, in hundreds of files, for a number of
tables.
Can anybody give advice on how to solve this, or where to look in the help
system?
Thanks for your time and attention,
Ben Brugman
> Can anybody give advise how to solve this, or where to look in the help
> system?
SSIS will better handle enclosed fields. You can use the import data wizard
from SQL Server Management Studio to generate the package. Right-click the
database in object explorer and select Tasks-->Import Data.
Hope this helps.
Dan Guzman
SQL Server MVP
"ben brugman" <ben@.niethier.nl> wrote in message
news:32e3c$473aef81$53557893$24038@.cache100.multikabel.net...
> Hello,
> I am trying to import files (hundreds) with Comma Delimited Data, because
> some columns within the data contain comma's themselves those fields have
> been double qouted.
> Small example.
> 1234, hello John, mega
> 2345, " Hello Andrew, Marie", tall
> I have tried to use Bulk insert with a Formatfile in xml and suggested by
> Uri Dimant a formatted file.
> But with the bulk insert, the field with the comma breaks into two
> sepperate fields and this generetes
> sometimes an error because the fields are not in sync.
> With the formatted file the lines without quotes do not work correctly. I
> tried to read the help files on this, but could not find a solution to the
> problem.
> In total it's about millions of rows, in hundreds of files, for a number
> off tables.
> Can anybody give advise how to solve this, or where to look in the help
> system?
> Thanks for your time and attention,
> Ben Brugman
>|||Thanks Dan,
At the moment the target system does not have SSIS installed.
(I will ask for SSIS to be installed on the target system.)
Two questions:
Is there another solution, because I should think that normal import
procedures like bulk insert should be able to handle Comma delimited files?
(To my knowledge my situation is fairly common).
Because I already tried two different methods, which I could not get to
work, I am a bit apprehensive to try a third method; is SSIS going to work
for this kind of situation, or is this something I just have to try?
This is actually my first project really working with SQLserver 2005. With
the previous version I normally succeeded with importing everything I wanted
to import, so at the moment I am feeling a bit stupid. Spending time on the
methods I could not get to work is rather frustrating,
Thanks for your attention and advice.
Ben Brugman
"Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> schreef in bericht
news:%2369Uz5rJIHA.280@.TK2MSFTNGP03.phx.gbl...
> SSIS will better handle enclosed fields. You can use the import data
> wizard from SQL Server Management Studio to generate the package.
> Right-click the database in object explorer and select Tasks-->Import
> Data.
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "ben brugman" <ben@.niethier.nl> wrote in message
> news:32e3c$473aef81$53557893$24038@.cache100.multikabel.net...
>|||> Is there another solution, because I should think that normal import
> procedures like bulk insert should be able to handle Comma delimited
> files?
BCP and BULK INSERT do a good job of parsing files as long as field
terminators do not occur in the data. For more advanced parsing, SSIS is a
better choice.

> Because I allready tried two differend methods, which I could not get to
> work I am a bit apprihencive to try a third method, is SSIS going to work
> for this kind of situations, or is this something I just have to try?
I understand your frustration. However, I can tell you that I've
successfully used SSIS to import many CSV files in a loop using a ForEach
task that executes a DataFlow for each file. There is some learning curve
if you've never used SSIS before, so you might find an example helpful. An
example of this technique is at http://www.sqlis.com/55.aspx. Just be sure to
specify quotes (") as the text qualifier for the flat file connection of the
data flow task.
Hope this helps.
Dan Guzman
SQL Server MVP
"ben brugman" <ben@.niethier.nl> wrote in message
news:8b4e8$473af645$53557893$23893@.cache120.multikabel.net...
> Thanks Dan,
> At the moment the target system does not have SSIS installed.
> (I'll will ask for SSIS to be installed on the target system.)
> Two questions:
> Is there another solution, because I should think that normal import
> procedures like bulk insert should be able to handle Comma delimited
> files?
> (To my knowledge my situation is fairly common).
> Because I allready tried two differend methods, which I could not get to
> work I am a bit apprihencive to try a third method, is SSIS going to work
> for this kind of situations, or is this something I just have to try?
> This is actually my first project really working with SQLserver 2005. With
> the previous version I normally succeeded with importing everything I
> wanted to import, so at the moment I am feeling a bit stupid. Spending
> time on the methods I could not get to work is rather frustrating,
> Thanks for your attention and advise.
> Ben Brugman
>
> "Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> schreef in bericht
> news:%2369Uz5rJIHA.280@.TK2MSFTNGP03.phx.gbl...
>|||BULK INSERT and BCP are designed for speed not versatility in handling
different formats. If you don't want to try SSIS, another solution I often
use is to pre-process the files into an even simpler format before feeding
them to BCP/BULK INSERT.
Linchi
"ben brugman" wrote:

> Thanks Dan,
> At the moment the target system does not have SSIS installed.
> (I'll will ask for SSIS to be installed on the target system.)
> Two questions:
> Is there another solution, because I should think that normal import
> procedures like bulk insert should be able to handle Comma delimited files
?
> (To my knowledge my situation is fairly common).
> Because I allready tried two differend methods, which I could not get to
> work I am a bit apprihencive to try a third method, is SSIS going to work
> for this kind of situations, or is this something I just have to try?
> This is actually my first project really working with SQLserver 2005. With
> the previous version I normally succeeded with importing everything I want
ed
> to import, so at the moment I am feeling a bit stupid. Spending time on th
e
> methods I could not get to work is rather frustrating,
> Thanks for your attention and advise.
> Ben Brugman
>
> "Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> schreef in bericht
> news:%2369Uz5rJIHA.280@.TK2MSFTNGP03.phx.gbl...
>

>|||All thanks for your participation.
Dan and Linchi, thanks for your comments.
At the moment we have adopted the solution of dropping the column which was
causing all the problems.
When we do need the column, we will probably end it with an unused symbol
or set of symbols, or quote all strings for this column.
Thanks for your time and attention,
Ben
"Linchi Shea" <LinchiShea@.discussions.microsoft.com> schreef in bericht
news:270A3E1B-6300-43F0-B30C-C27B8291912C@.microsoft.com...
> BULK INSERT and BCP are designed for speed not versatility in handling
> different formats. If you don't want to try SSIS, another solution I often
> use is to pre-process the files into an even simpler format before feeding
> them to BCP/BULK INSERT.
> Linchi
> "ben brugman" wrote:
>> Thanks Dan,
>> At the moment the target system does not have SSIS installed.
>> (I'll will ask for SSIS to be installed on the target system.)
>> Two questions:
>> Is there another solution, because I should think that normal import
>> procedures like bulk insert should be able to handle Comma delimited
>> files?
>> (To my knowledge my situation is fairly common).
>> Because I allready tried two differend methods, which I could not get to
>> work I am a bit apprihencive to try a third method, is SSIS going to work
>> for this kind of situations, or is this something I just have to try?
>> This is actually my first project really working with SQLserver 2005.
>> With
>> the previous version I normally succeeded with importing everything I
>> wanted
>> to import, so at the moment I am feeling a bit stupid. Spending time on
>> the
>> methods I could not get to work is rather frustrating,
>> Thanks for your attention and advise.
>> Ben Brugman
>>
>> "Dan Guzman" <guzmanda@.nospam-online.sbcglobal.net> schreef in bericht
>> news:%2369Uz5rJIHA.280@.TK2MSFTNGP03.phx.gbl...
>> >> Can anybody give advise how to solve this, or where to look in the
>> >> help
>> >> system?
>> >
>> > SSIS will better handle enclosed fields. You can use the import data
>> > wizard from SQL Server Management Studio to generate the package.
>> > Right-click the database in object explorer and select Tasks-->Import
>> > Data.
>> >
>> > --
>> > Hope this helps.
>> >
>> > Dan Guzman
>> > SQL Server MVP
>> >
>> > "ben brugman" <ben@.niethier.nl> wrote in message
>> > news:32e3c$473aef81$53557893$24038@.cache100.multikabel.net...
>> >> Hello,
>> >>
>> >> I am trying to import files (hundreds) with Comma Delimited Data,
>> >> because
>> >> some columns within the data contain comma's themselves those fields
>> >> have
>> >> been double qouted.
>> >>
>> >> Small example.
>> >>
>> >> 1234, hello John, mega
>> >> 2345, " Hello Andrew, Marie", tall
>> >>
>> >> I have tried to use Bulk insert with a Formatfile in xml and suggested
>> >> by
>> >> Uri Dimant a formatted file.
>> >> But with the bulk insert, the field with the comma breaks into two
>> >> sepperate fields and this generetes
>> >> sometimes an error because the fields are not in sync.
>> >>
>> >> With the formatted file the lines without quotes do not work
>> >> correctly. I
>> >> tried to read the help files on this, but could not find a solution to
>> >> the problem.
>> >>
>> >> In total it's about millions of rows, in hundreds of files, for a
>> >> number
>> >> off tables.
>> >>
>> >> Can anybody give advise how to solve this, or where to look in the
>> >> help
>> >> system?
>> >>
>> >> Thanks for your time and attention,
>> >> Ben Brugman
>> >>
>> >
>>

importing chinese text

Hello,
I have some multibyte characters and I want to put them into table fields.
How can I do this? Just storing them will result in a ? when reading /
showing them.
thanks & regards
Mark
I'm doing a similar thing without any problems.

My Chinese characters are in a text file, I used DTS to import them
into my SQL DB, they are shown on web pages perfectly.

How did you read/show them?

SL at http://www.source4book.com
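One common cause of the ? symptom, for what it's worth: storing the text in varchar columns or inserting literals without the N prefix. A minimal sketch (table and data are invented):

CREATE TABLE dbo.Books (title nvarchar(100))   -- nvarchar, not varchar, for multibyte text

INSERT INTO dbo.Books (title) VALUES (N'你好')   -- the N prefix keeps the literal Unicode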

Importing binary files into sql server database

I work for a company that makes heat transfers for the imprinted apparel market. We're developing a database of merchandise images for all of our non-design inventory. Using Access we're going to be inserting thumbnails of psd (photoshop) files. We're wondering if there is any way to import multiple psd's into the sql server database into matching records like matching a column named "filename" and the actual filename of the file without having to upload each file individually. We want to be able to dump the files from the database of the matching records, also. This way, once our catalog designer has found which designs they need to put into the new catalog, it will dump the psd's for us. The same for our staffer who does color separations.

Any suggestions out there? If you need me to post further of what we're trying here, I will. This is for the bossman.
store the location and file names and not the files themselves. you may be able to use text\image\blobs but I would recommend against it. it will bloat your database, probably hamper performance and likely cause unneeded difficulty.
|||store the location and file names and not the files themselves. you may be able to use text\image\blobs but I would recommend against it. it will bloat your database, probably hamper performance and likely cause unneeded difficulty.

As mentioned here (and in other posts, do a search), the general consensus among the community is to store a pointer (link) in the database and store the binary (image file) on a web server or somewhere else.

However, I have run into situations where this was not permitted (courtesy of some rabid security folks) or impractical. I currently manage a pair of 200 GB image repositories and SQL does just fine storing the blobs. Do a google search for "pure asp upload" or similar variants and you will find some sample code for uploading binary streams to a database.

Regards,

hmscott
|||at my last gig we measured our image data in the terabytes and there is no way I am bloating my database and my database server with that stuff, and we got new stuff all of the time. One week management came in with 2 terabytes from the Smithsonian. If I stored that stuff in SQL Server, not only would I have needed a new machine, which I did, but I would have had to migrate all of my DBs onto a new machine as well.

Besides logisitics, technically blob data is just a pain to handle. I was going back over some Ken Henderson last night and as he points out if you store data in blobs, the data is not really in the row. What you get is a sixteen bit pointer to other data pages and if you do not insert the blob at record creation you do not get a pointer. So it is reccomended to insert a default value. My two cents regarding this is now you are wasting space and inserting junk data. Ken also points out that BLOB data creates a transaction log conundrum. Do you log it and bloat the log or not and compromise your db's integrity.|||Thanks for these suggestions. This is a pretty new trail for us to blaze. Glad to know there are forums like this where we can get questions answered.

Importing binary data tables?

My apologies in advance for hitting the forum with such a basic question...

I've got a large number of raw binary data tables generated by some external software that I'd like to import into a SQL database. I've set up the database tables to hold the information and assigned the table columns the appropriate data types, but I can't figure out how to import the data.

From what I can see there isn't a data import 'wizard' that supports raw binary, and my limited experience with bcp has given me the impression that it only supports ASCII tables. I find it hard to believe that there isn't a DTS procedure to handle this... but I'm definitely stuck...

Thanks,
-fpsiv

You can import data by using bcp or bulk insert into text or image fields and then parse the data if needed.|||Thanks for the quick reply snail...

I was working with BULK INSERT via the Query Analyzer... but ran into a brick wall in trying to set up the format file. I can't seem to get the 'column collation' parameter set correctly. I thought that 'collation' applied to character data... is there a correct collation for raw binary?

Thanks,
-fpsiv|||Check this: import image. I was trying the same for a binary file; it works too.

http://www.dbforums.com/t926676.html|||Thanks snail... got it working! I really appreciate your help!

Cheers
-fpsiv|||Hmm! I tried the same query but it didn't work:

CREATE TABLE Pictures
(
Pic_id smallint
IDENTITY(1,1)
PRIMARY KEY CLUSTERED,
Images nvarchar(5000)
)

BULK INSERT Northwind.dbo.[pictures] FROM 'c:\bridge.jpg'
WITH (
FORMATFILE = 'c:\bcp.fmt'
)

------------
Server: Msg 156, Level 15, State 1, Line 1
Incorrect syntax near the keyword 'from'.

Did I miss something?

Thanks
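Two notes on the thread above. On the collation question: in an 8.0 (non-XML) format file the trailing collation field only applies to character columns; for binary fields such as SQLIMAGE it is written as empty quotes (""). And on SQL Server 2005 and later the format file can be skipped entirely with OPENROWSET ... SINGLE_BLOB; a minimal sketch, assuming a varbinary(max) column rather than the nvarchar column above:

-- SQL Server 2005+: pull the whole file in as one binary value
CREATE TABLE dbo.Pictures2
(
    Pic_id int IDENTITY(1,1) PRIMARY KEY,
    Images varbinary(max)
)

INSERT INTO dbo.Pictures2 (Images)
SELECT BulkColumn
FROM OPENROWSET(BULK 'c:\bridge.jpg', SINGLE_BLOB) AS f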

Importing Big Text fields with <CR><LF> in them

Env: SQLSERVER2000 - DTS

I want to import a text file with 3 fields in it ...

F1: Ident char(3)
F2: Note (Text) char(32,000) yeah big eh!
F3: Date

Field F2 has <CR><LF> through it, which means the DTS import thinks it has
multiple records to deal with, as each row is also delimited by <CR><LF>;
of course this causes problems when loading to the receiving table. The
import creates new records.

Hence, is there a way to filter these formatting characters out easily
without disrupting the row delimiter, using a VBScript? If anyone has
done this, I'd be grateful for a copy of the script or utility.

The source of the file (IBM Iseries base app) cannot change the output
format.

Thanks in advance

Paul

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!

"Paul Scotchford" <paul.scotchford-eds@eds.com> wrote in message
news:4091da3f$0$206$75868355@.news.frii.net...
> Env : SLQSERVER2000 - DTS
> I want to import a text file with 3 fields in it ...
> F1: Ident char(3)
> F2: Note (Text) char(32,000) yeah big eh!
> F3: Date
> Field F2: has <CR><LF> thru it, this means the DTS import thinks its a
> multiple records to deal with as each row is also delimited by <CR><LF>,
> of course this causes problems when loading to the receiving table. The
> import creates new records.
> Hence is there a way to filter these formatting characters out easily
> without disrupting the row delimiter. Using a VbScript, if anyone has
> done this I'd be grateful of a copy of the script or utility.
> The source of the file (IBM Iseries base app) cannot change the output
> format.
> Thanks in advance
> Paul
> *** Sent via Developersdex http://www.developersdex.com ***
> Don't just participate in USENET...get rewarded for it!

If the input file has fixed-width columns, then DTS should still work,
although you may need to set up the text file connection definition from a
script rather than with the interface. Alternatively, you could run the
input file through a script written in Perl, VB, C# etc. to replace the
delimiters with something else before loading it with DTS.

Simon

Importing big data with exception.

Hi, there;

I use ASP.NET to create an SSIS package to import data from a .dbf file. When I import data from a big file (216,173 KB) my package throws an exception:

An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.
".
The "input "OLE DB Destination Input" (71)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (71)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (58) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "WorkThread0" has exited with error code 0xC0209029.
Thread "SourceThread0" has exited with error code 0xC0047038.

An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.
".
The "input "OLE DB Destination Input" (77)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (77)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (64) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Thread "WorkThread0" has exited with error code 0xC0209029.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "SourceThread0" has exited with error code 0xC0047038

Does anybody know what the problem is?

Thanks.

Sounds like the connection to your destination database timed out and was closed. Most likely the SQL Server instance severed the connection. Are you performing a large sort or an aggregate in the dataflow? You might first try going to a raw file instead of the database table and then create a second data flow that uses the raw file as the source straight into the database table.|||Thanks.

My structure is: my data source files (*.dbf files) are on another machine, and some files are very big (216,173 KB, 283,845 KB, some even bigger as time goes on). I want to import this data into my new SQL2005 server, which is empty now. I create a table before importing the data (this works now). In my source component I set oSrcInstance.SetComponentProperty("SqlCommand", "SELECT * FROM [" + sourceTable + "]"); there is no other aggregate operation between source and destination, just a simple pump of all data from source to destination. I can do this with my SQL2000 DTS (same as an ASP.NET project).
It is as simple as Source (*.dbf) --> Destination (SQL2005).

Thanks.
|||

Are you creating the new table through an Exec SQL task? Is it using the same connection manager as your data flow task? I am trying to verify that the package is connecting successfully. Also, how long does the package run before you receive this error? Does it occur if you try a smaller file?

You might try setting the OLE DB Destination data access mode property to one of the fast load options, and specify Rows Per Batch as 10,000 or 20,000, to see if you get different results.

|||

No, I don't use an Exec SQL task to create the table; I just use a SqlConnection to create it and then close and dispose the connection.

Actually, I am running the package now. It looks like it is running slower and slower. I can see the row count increasing in the SQL2005 database.

I am using OLE DB Destination,

oDestInstace.SetComponentProperty("CommandTimeout", 0);

oDestInstace.SetComponentProperty("OpenRowset", "[dbo].[" + destinationTable+"]");

oDestInstace.SetComponentProperty("OpenRowsetVariable", null);

// oDestInstace.SetComponentProperty("SqlCommand", null);

oDestInstace.SetComponentProperty("DefaultCodePage", 1252);

oDestInstace.SetComponentProperty("AlwaysUseDefaultCodePage", false);

oDestInstace.SetComponentProperty("AccessMode", 0);

oDestInstace.SetComponentProperty("FastLoadKeepIdentity", false);

oDestInstace.SetComponentProperty("FastLoadKeepNulls", false);

// oDestInstace.SetComponentProperty("FastLoadOptions", null);

oDestInstace.SetComponentProperty("FastLoadMaxInsertCommitSize", 0);

The exception happens after package runs for 30 minutes or more. It doesn't occur for small files.

Thanks

|||

Try setting FastLoadOptions to TABLOCK, CHECK_CONSTRAINTS, ROWS_PER_BATCH=1000

and FastLoadMaxInsertCommitSize to 1000.

|||

Yes, you are right. I did that and it works perfectly. Apart from FastLoadMaxInsertCommitSize = 0.

Thanks.

Importing ASCII file using DTS with no row delimiter

Hi,

I am trying to import a file using DTS that does not have any row delimiter. It is a file that was exported from an OLD OLD OLD Macintosh database program (now defunct) called TouchBase. The file uses quotes as text separators, and tabs for the column delimiters.

Unfortunately, from what I can tell, there are no row delimiters, so the data just keeps going and going, i.e. like this:

"Bob Smith" "1234 Market Street" "San Diego" "CA" "Josh Smith" "1212 Anywhere Street" "San Diego" "CA" "Jane Smith" "1234 Jane Street" "San Diego" "CA"

Any idea how I can get this to import, or some way to add the delimiter to the file?

Maybe you can use a third-party tool on the text file to add a row delimiter.

And refer to this SQLDTS (http://www.sqldts.com/default.aspx?6,101,257,0,1) link to accomplish the task.|||I can't seem to find any third-party tools using Google. I've opened the file using a freeware hex editor and used global search and replace to add one, but then I found out that some records meet the search pattern I chose in the middle of the record, not the end, so that screwed up the whole file.

Very frustrating because I have multiple problems with the file:

#1 - The User used CRLF in a field called "Notes" which is just a free flowing field where the user can type as much or as little as they want in a little diary on the record.

#2 - The User also used tabs in the notes fields, which is throwing off the delimiter for the column.

#3 - The User used double quotes " inside the notes field so that is also throwing it off.

UGH.. Not sure how to fix this mess.|||how big is the file?|||File is about 2.5 megs.

Finally got it.

Used a freeware hex/ASCII editor called xvi (search Google for hex editor; it's the first one that comes up).

What I did first was strip all CRLF out of the file, since they were only in places where they shouldn't be, not at the end of each row.

Then I went through and did a global replace on all "<TAB>" and changed it to {~} (I first made sure she wasn't using those characters anywhere in the file!).

Then, I went and stripped all the tabs and all the quotes from the file.

Then I went back and replaced the {~} with "<TAB>" so they would be where they were supposed to be.

Finally, I went and added the CRLF to the end of each record. The way I did this: the last field in each record was a "Date last modified". I asked the client if they needed this, and they said no.

So, I took and used the "wildcard" feature in XVI and did a global replace. Searched for:

"MM/DD/YY"<TAB> and replaced with "MM/DD/YY"<CR><LF>

However, it was kinda tricky: I had to actually do 4 global replaces, because it's an old Mac database, and it stored the date in M/D/YY format, MM/D/YY format, M/DD/YY format, and MM/DD/YY format, so there were 4 possible combinations of the way the date was stored for date last modified.

Thank god it worked!

I had 4 errors when importing, and of course SQL Server aborts the import if there are any errors. I had to go back into the file and scroll down to the approximate part where the error was and look for the problem. It turns out there were 4 records that were missing a field, and that was throwing the whole import off, so I added the field in where appropriate, or just deleted the record.

All in all, this job took me about 15 hours to scrub the file, do all my homework, and get everything together. Definitely a great learning experience for me, and VERY rewarding that I finally got her data to her!!!!|||If you need to do a lot of this type of work, I would get CodeWright. I think Borland now owns the product. It is pricey but well worth it.

Importing AS 2000 cube into SSAS 2005

Is there any way to import a cube created in AS 2000 into SSAS 2005? If not a direct import, is there any way to make that job easier? Any suggestions/pointers/comments would be helpful.

I have a lot of AS 2000 cubes and I am really sick of that development environment!

You can use the Migration Wizard (search for MigrationWizard.exe in the SQL installation folder, or right click on the server node in SQL Management Studio and use 'Migrate ...'). The wizard requires AS2000 to be running and can generate a migration script (that can be later run on AS2005 with SQL Management Studio) or can migrate directly to AS2005.

You cannot restore AS2000 backup files (.cab) on AS2005.

Adrian Dumitrascu

Importing and merging csv using sql

Hi Gurus.

My client drops many files like these for me on a shared drive (M:) daily:
1_Test.csv
2_Test.csv
3_Test.csv

I would like your advice on how to write SQL code (which can include DTS if possible) that will take these files, merge them into one (since they have the same columns) and send them to another client as one file. But it must only take the files of the same date, and must not resend files I have already sent. I need this to be an automated process, since the files are dumped onto the M: drive at midnight, and I need this code so that I can schedule it as a job and run it around 04:00.

OK..

I would do all (and have...now where is that code) in a stored procedure.

I would

1. use xp_cmdshell to interrogate the directory and store all of its contents in a temp table
2. I would then parse it out and look for the files I want
3. I would then load these files to a temp table and audit them one at a time using bcp
4. When everything looked good, I would insert from the temp table to the final destination
5. I would then use xp_cmdshell again to create a datetime-stamped sub-archive folder, and would move EVERYTHING to that archive (I'm a neat freak...well...)
6. Throughout the process I would echo out messages about what's going on to a file, and on the way out I would bcp that file into a log table.

That's what I would do.

If you want I can dig up some code.|||I would also go the route that Brett suggested; there might be some "features" that you could skip, but on the other hand you get quite flexible code if you follow his suggestions!

The only thing that I would do differently is that I would create one folder for the archived files and store them there (with dates as filenames), but this is just a matter of personal preference. :)

Why would you like to include DTS in this?
...I know that DTS is good for a lot of things, but personally I find scripting much more portable (this might have something to do with the fact that I haven't used DTS that much, so my opinion could change in the future).

Check out this page from Nigel Rivett:
http://www.nigelrivett.net/ImportTextFiles.html
...you should at least get an idea of how to approach the problem.

Pati|||I would do like Brett Kaiser and query the directory where these files are kept, and store all the contents in a #temp table. What I would do differently is use a directory structure: for example, keep the files in C:\CSV and, once a file has been processed, move it to another directory, say C:\CSV\sent, using a stored procedure.

In the database, say TEST, I would create a table test_csv with the same fields as the .csv file.

I will then use a procedure like this one; I have written it specially for you. The input parameter will come from the #temp table, which means this procedure will be called from another procedure that loops over the filenames field, calling csv_test from inside the loop (see the driver sketch after this post).

-- SQL CODING BEGINS HERE
-- csv_test 'C:\CSV\Book2.csv'
create proc csv_test
    @filename varchar(70)
as
begin

    declare @cmd varchar(200)
    declare @cmd_insert varchar(300)

    -- escape any embedded single quotes in the file name
    SET @filename = replace(@filename, '''', '''''')

    set @cmd_insert = 'BULK INSERT TEST.dbo.test_csv FROM ''' + @filename + ''' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' )'
    exec (@cmd_insert)

    -- move the processed file out of the drop directory
    set @cmd = 'master..xp_cmdshell ''move ' + @filename + ' C:\CSV\SENT'''
    exec (@cmd)

end

-- SQL CODING ENDS HERE

And then at last, transfer the rows to the destination table and truncate test_csv. Remember, you can do everything using a procedure and schedule a DTS package to run at a specified time every day.
:)

Roshmi Choudhury
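As a rough sketch of the outer driver procedure Roshmi describes (the drop directory, the dir filter, and the temp-table handling are assumptions to adapt):

-- Hypothetical driver: list the drop directory, then call csv_test per file
create proc csv_test_driver
as
begin
    create table #files (fname varchar(260))

    -- dir /b returns bare file names, one per row
    insert #files exec master..xp_cmdshell 'dir /b C:\CSV\*.csv'
    delete #files where fname is null or fname not like '%.csv'

    declare @fname varchar(260), @fullname varchar(330)
    declare file_cur cursor for select fname from #files
    open file_cur
    fetch next from file_cur into @fname
    while @@fetch_status = 0
    begin
        -- EXEC arguments cannot be expressions, so build the path first
        set @fullname = 'C:\CSV\' + @fname
        exec csv_test @fullname
        fetch next from file_cur into @fname
    end
    close file_cur
    deallocate file_cur
end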

importing and exporting tables

Hello Everybody,
Not sure if this is the correct forum; anyway, let me know your views. I have SQL Server running on two remote machines (in different geographical locations). I need to import a database from one SQL Server installation to another. What is the best possible solution?
1) Transferring the .mdf and .ldf files from one installation to the other
2) Generating the records of all the tables in the database in a common file format such as CSV and then loading them into the destination database. If this is OK, then how do I import and export data using the CSV format?
3) Any other options?
Srikanth

If you need all data and structures, then the best method is a backup. MDF and LDF will do much the same, since they cover everything too, but those files also include the free space within the DB, so they are normally much larger than required just for data transfer. Including that free space is correct for a running system, just not so nice for copying files around.
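A minimal sketch of the backup/restore round trip (the paths, database name, and logical file names are placeholders; check the logical names with RESTORE FILELISTONLY first):

-- On the source server
BACKUP DATABASE MyDb TO DISK = 'd:\transfer\MyDb.bak'

-- On the destination server, after copying the .bak across
RESTORE DATABASE MyDb
FROM DISK = 'd:\transfer\MyDb.bak'
WITH MOVE 'MyDb_Data' TO 'd:\data\MyDb.mdf',
     MOVE 'MyDb_Log' TO 'd:\data\MyDb_log.ldf'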

|||

Also, when you back up or move your database files (I would suggest a backup, it is cleaner), do a database and log shrink to reduce the size of the backup.

Zipping the .bak afterwards will reduce the size more for travel.

|||

I think shrinking databases and logs is a very bad idea. It can have a serious impact on a production system, such as blocking transactions if you go too small, and it consumes lots of IO. It can also cause fragmentation in the files themselves, and on disk. Good management of files is essential for performance and availability, and shrinking does not normally fit with that. That said, for a small system it may not matter, but it should not be seen as an easy or cheap option.

Zipping will help a lot; you may even want to consider some of the third-party tools that compress backups and generally run faster as well compared to native backup. Quest has LiteSpeed, Idera has something, and Red Gate has SQL Backup.

|||Thanks a lot for your suggestions. How about CSV formatting of the tables in the database? Is it possible? If so, how?
|||

You can export data from a table to a CSV, but I think we said backups would be better for all tables etc.

The simplest way to export a table to CSV would be to use the SSIS Import/Export Wizard. There are several ways to invoke this, but try right-clicking the database node in SSMS and select Tasks - Export Data
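If a scriptable route is preferred, bcp can also write a table or query out as CSV; a sketch via xp_cmdshell (the server, path, and table names are placeholders):

EXEC master..xp_cmdshell
    'bcp "SELECT * FROM MyDb.dbo.MyTable" queryout d:\transfer\MyTable.csv -c -t, -T -S MYSERVER'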

Importing and Exporting of XML Files (as Instance Data) in SQL Server 2000

Hello,
I have a SQL Server 2000 database that features individual records
which are stored in a table with data (essentially a large XML file
full of data) in the InstanceData field as XML.
I would appreciate it if someone could post some code that
demonstrates how to do the importing and exporting. Two simple examples:
- Importing to an Excel spreadsheet
- Exporting from an Excel spreadsheet
Your sample code would be very much appreciated.
Also, is it possible to simply configure an ODBC Data Source for
linkage to other databases or applications?
Thank you for your kind support and assistance.
Cheers,
Colin.
You may get a better answer if you ask about this in the Excel-specific
newsgroup.
The data can easily be retrieved from the database or stored there. The main
issue will be to hook it up with Excel for which you will need an Excel
expert...
Best regards
Michael
"Colin Sheppard" <luddite1812@.yahoo.com> wrote in message
news:c652818c.0412010324.6ff80edd@.posting.google.c om...
> Hello,
> I have a SQL Server 2000 Database that feature individual records
> which are stored in a table with data (essentially a large XML file
> full of data) in the InstanceData field as XML.
> I would appreciate it if someone could place some code that
> demonstrates how to do importing and exporting? For 2 simple examples:
> - Importing to a Excel Spreadsheet
> - Export from a Excel Spreadsheet
> Your sample code would be very much appreciated.
> Also, is it possible to simply configure a ODBC Data Source for
> linkage to other databases or applications?
> Thank you for your kind support and assistance.
> Cheers,
> Colin.

importing an excel file

How do I import an Excel file into a table I have created in my database in SQL Server 2005?

http://support.microsoft.com/kb/321686

They have listed various methods; choose the one which suits you most.

|||

You can do it like this:

' Note: the original post wrapped this in a Try with no matching Catch;
' a complete Try/Catch is left to the reader.
Dim excelConnectionString As String = String.Format("Provider=Microsoft.Jet.OLEDB.4.0;Data Source={0};Extended Properties=Excel 8.0", location)

' Create a connection to the Excel workbook
Using connection As Data.OleDb.OleDbConnection = New Data.OleDb.OleDbConnection(excelConnectionString)

    Dim command As Data.OleDb.OleDbCommand = New Data.OleDb.OleDbCommand("Select CompanyName,CompanyContact,Title,AddressLine1,AddressLine2,City,State,Zip,Telephone,Web FROM [Sheet1$]", connection)

    connection.Open()

    ' Create a DbDataReader over the data worksheet
    Using dr As Data.Common.DbDataReader = command.ExecuteReader()

        ' SQL Server connection string
        Dim sqlConnectionString As String = System.Configuration.ConfigurationManager.AppSettings(APPSETTING_DBCONNECTION)

        ' Empty the destination table before the bulk copy
        Dim myconnection As New SqlConnection(sqlConnectionString)
        Dim query As String = "Truncate Table FiservSpecific"
        myconnection.Open()
        Dim cmd As New SqlCommand(query, myconnection)
        cmd.CommandType = CommandType.Text
        cmd.ExecuteNonQuery()
        myconnection.Close()

        ' Bulk copy to SQL Server
        Using bulkCopy As SqlBulkCopy = New SqlBulkCopy(sqlConnectionString)
            bulkCopy.DestinationTableName = "FiservSpecific"
            bulkCopy.WriteToServer(dr)
        End Using

    End Using

    connection.Close()
End Using

|||

Don't they have a basic "just import the file into the table", then get rid of the Excel file and edit the table via VWD, or anything like that?

|||

You can use a DTS package, but I am importing the contents of the Excel file directly into the SQL Server database.

|||

DTS package? Well... I have this list of information that is in the Excel file, and I have a table created on the server. The Excel file is on the server's desktop. Now all I want to do is take what is in the Excel file and put it on the server. What is the easiest way?

|||

If you want to do it directly through SQL Server, then you can go to the database, right-click, and select Import Data; you will get a wizard. Select the data source (that is the Excel file) and just follow the steps.

If you want to do it from an ASP.NET website, just add an upload control to your site and, on the click of a button, save the file if you want, or just directly open the Excel file and dump the data to SQL Server.

Hope this helps.

Regards

Karen

|||

Which one of these works? It asks me to connect through ODBC, or a system DSN? It's a local file, so...?

|||

These are the steps:

1) Right-click on the database.

2) Select Tasks - Import Data.

3) Select Microsoft Excel from the drop-down list, then browse and select the file you want to import. Click Next.

4) Then select the database, and give the username and password (if any) needed to access the database or SQL Server. Click Next.

5) Select the first option, click Next, click on the name of the sheet you want the data from, and then in the destination select the table where you want to put the data. Click on Edit Mappings and check that everything is going where it has to go.

6) Click Next and Finish. You can save the package for future use and just replace the file and run the package again.

Hope this helps.

Regards,

Karen
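For completeness, the same import can also be done in plain T-SQL through the Jet provider; a sketch, assuming the workbook path and sheet name shown here (ad hoc distributed queries must be enabled on the server, and the destination table name is made up):

-- Reads Sheet1 of a hypothetical workbook into a new table
SELECT *
INTO dbo.ImportedSheet
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\book1.xls',
                'SELECT * FROM [Sheet1$]')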

|||

For all those questions and answers about Excel import into a database using bulk copy:

1. I had used a web-server model where users would upload Excel files, which were then processed using bulk copy to make the data available in the database (the stack was C# + SQL Server, as an ASP.NET application). But the Excel files contained worksheets which had to be parsed sheet after sheet. I passed the sheet names dynamically and used bulk copy inside a loop until all sheets were processed. But it so happened that the data columns contained uneven datatypes (double instead of datetime), and the bulk copy failed after processing the first sheet. (Again, users are users.)

2. I tried using DTS; same thing. All the data was ported this time into different tables, but some dates were invalid (because they were not in datetime format), so I still get useless data.

Now my question is: does the data across the worksheet have to be in the same format for bulk copy to work? Because I cannot expect users to have formatted data in each and every cell. If one cell has different data, the first method fails and the second method turns that data into dates like 01-01-1900 00:00:00. How do I go about this problem?

|||

When the user sends in the spreadsheet, make sure it has been formatted as Date, and let them select the date format matching the regional settings, e.g. 3/12/2007.

Hope this helps.

Regards

Karen

importing and archiving

I was wondering how I could use DTS to import Access files and then archive
them to another folder. I've read some examples on sqldts.com but I still
can't figure it out.

Basically this is what I want:

- import an Access file which has a name like this: <companyname>_<today's
date>.mdb. The import folder in my case is d:\import

- for updating some fields, the DTS should use the company's name from the
Access file and check it against a lookup table to translate the name into an
int value.

- after some processing, the DTS should move the file to another folder. In
my case it is d:\archive

If someone can help me with this, I would be very grateful.

--

----------------
This mailbox protected from junk email by MailFrontier Desktop
from MailFrontier, Inc. http://info.mailfrontier.com

I tend to avoid DTS. I prefer to use xp_cmdshell (to perform command-line
commands) and linked servers. In this case, use xp_cmdshell to
FTP the files down. Create a linked server to the Access MDB.
Process. Then use xp_cmdshell to move the file to the archive folder.
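A sketch of the linked-server step (the linked server name and MDB path below are placeholders):

-- Hypothetical linked server over the Jet provider, pointing at one MDB
EXEC sp_addlinkedserver
    @server = 'AccessImport',
    @srvproduct = 'Access',
    @provider = 'Microsoft.Jet.OLEDB.4.0',
    @datasrc = 'd:\import\company_20050201.mdb'

-- Tables in the MDB are then reachable with four-part names, e.g.:
-- SELECT * FROM AccessImport...SomeTable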

-- FTP SP
CREATE PROCEDURE FTP
(
@ftpserver varchar(50) = 'myserver',
@ftpuser varchar(50) = 'myusername',
@ftppwd varchar(50) = 'mypassword',
@batchfile varchar(100) = '\\myserver\path\ftpcmd.txt',
@destfile varchar(100) = '\\myserver\path\mydestfile.txt',
@sourcefile varchar(50) = 'mysourcefile',
@mode varchar(3) = 'get' -- get | put
)
as
DECLARE @cmd varchar(1000)

-- write the ftp command script one line at a time
SELECT @cmd = 'echo ' + 'open ' + @ftpserver + ' > ' + @batchfile
EXEC master..xp_cmdshell @cmd
SELECT @cmd = 'echo ' + @ftpuser + '>> ' + @batchfile
EXEC master..xp_cmdshell @cmd
SELECT @cmd = 'echo ' + @ftppwd + '>> ' + @batchfile
EXEC master..xp_cmdshell @cmd
SELECT @cmd = 'echo ' + @mode + ' ' + @sourcefile + ' ' + @destfile + ' >> ' + @batchfile
EXEC master..xp_cmdshell @cmd
SELECT @cmd = 'echo ' + 'quit' + ' >> ' + @batchfile
EXEC master..xp_cmdshell @cmd
-- run the script, then clean up
SELECT @cmd = 'ftp -s:' + @batchfile
EXEC master..xp_cmdshell @cmd
SELECT @cmd = 'del ' + @batchfile
EXEC master..xp_cmdshell @cmd|||"Ezekil" <ezekil@lycos.com> wrote in message
news:41ffd97b$0$71093$cd19a363@.news.wanadoo.nl...
> I was wondering how i could use dts to import accessfiles and then archive
> them to another folder. I've read some examples on sqldts.com but i still
> can't figure it out.
> Basically this is what i want:
> - import an accessfile which has a name like this <companyname>_<today's
> date>.mdb. The importfolder is called in my case d:\import

Generate the current Access filename and assign it to a package global
variable in an ActiveX step, then use a Dynamic Properties task to set the
Access Connection properties.

> - for updating some fields, the dts should use the company's name from the
> accessfile and check that with a lookuptable to translate the name into a
> int value.

It depends if you're going to load the Access data into MSSQL or not - if
so, then an Execute SQL task could join the tables or call a stored proc.
Otherwise, ActiveX will work again.

> - after some processing, the dts should place the file to another folder.
> In my case it is d:\archive

Another ActiveX step, or just an Execute Process task to move the file.

Simon

Importing an XML file with hirarchial data

Hello,

I would like to import an XML file containing hierarchial data into a table in SQLserver. I guess that I am supposed to use the XML source editor and connect it to the xml file containing the following:

<?xml version="1.0"?>
<VariantFamilies>
<VariantFamily FamilyID="XXX">
<FamilyDescription Language="en-GB">Variant Family Description in English</FamilyDescription>
<FamilyDescription Language="sv-SE">Variantfamiljsbeskrivning på svenska</FamilyDescription>
<Variants>
<Variant VariantID="1XX">
<VariantDescription Language="en-GB">Variant Description 1 in English</VariantDescription>
<VariantDescription Language="sv-SE">Variantbeskrivning 1 på svenska</VariantDescription>
</Variant>
<Variant VariantID="2XX">
<VariantDescription Language="en-GB">Variant Description 2 in English</VariantDescription>
<VariantDescription Language="sv-SE">Variantbeskrivning 2 på svenska</VariantDescription>
</Variant>
</Variants>
</VariantFamily>
<VariantFamily FamilyID="XYY">
<FamilyDescription Language="en-GB">Variant Family 2 Description in English</FamilyDescription>
<FamilyDescription Language="sv-SE">Variantfamiljsbeskrivning 2 på svenska</FamilyDescription>
<Variants>
<Variant VariantID="1YY">
<VariantDescription Language="en-GB">Variant Description 1 in English</VariantDescription>
<VariantDescription Language="sv-SE">Variantbeskrivning 1 på svenska</VariantDescription>
</Variant>
<Variant VariantID="2YY">
<VariantDescription Language="en-GB">Variant Description 2 in English</VariantDescription>
<VariantDescription Language="sv-SE">Variantbeskrivning 2 på svenska</VariantDescription>
</Variant>
</Variants>
</VariantFamily>
</VariantFamilies>

Following columns exist in the table: WK, VariantFamily, VariantFamilyDescription, VariantID, VariantDescription.
wk is automatic,
Variant family should contain XYY,
VariantFamilyDescription should contain Variant Family 2 Description in English,
VariantID should contain 2YY,
VariantDescription should contain Variant Description 2 in English

In the output name I can only choose one of the attributes at a time (the system has created two new items for each ID to identify the "parent"). Now how do I go about putting the actual information into the columns, NOT the system-created IDs? I would like to have one row per smallest grain (VariantID), hence having redundant data in the VariantFamily column.

Does it have anything to do with how the XSD is defined? Or do I need to use the XML task, and if so, how do I do that?

Thankful for descriptions of how to go about this.

Sincerely,

Hanna,
The XMLSrc adapter builds up its metadata based on the schema: a complexType in the XSD is read as a table (which becomes an output of XMLSrc) and the elements in that complexType become the columns of that table. Hierarchical data becomes multiple tables, with _id columns in each related table to track their relationship.
From the XML above, I don't see how you can import it into a single table using XMLSrc. Depending on your XSD, you may need further steps downstream of XMLSrc (Union All? Merge?) to accomplish your scenario.

Thanks
Wenyang|||

I seem to be in a similar conundrum. However, what I want to do is move data elements from my XML file into multiple SQL Server database tables. I've created the XML data flow source and multiple data flow destinations, each pointing to their own table.

As Wenyang Hu pointed out, the XMLSrc adapter builds up its metadata based on the schema, a complexType in xsd will be read as a table. I've now got over 50 tables, but can only select one (from the XML Source Editor columns form). Subsequently, when I go to add a data flow path to my data flow destination, I can only select a single output, which doesn't provide me with all the data I need to load a single database table.

I hope this makes sense and look forward to a solution.

Thanks,

Johnny

|||

That's true: one XMLSrc output can be hooked to only one downstream component. Hence, if you want several XMLSrc outputs loaded into one single destination, then as said above you'll need other transformation(s), either between XMLSrc and the destination or in another data flow.

For example, you can use a dataflow like

XMLSrc (output1) -> Sort -> MergeJoin -> Sort -> MergeJoin -> Destination
       (output2) -> Sort ------^                     ^
       (output3) -> Sort ----------------------------+

to sort on various key columns, then join on those key columns at each MergeJoin, eventually merging these 3 XMLSrc outputs into one destination. Since you have 50 outputs from XMLSrc, you can also save them into 50 destination tables first, then use another data flow with those tables as sources and other SSIS transforms to process based on your specific data hierarchy.

Thanks

Wenyang

Either way, that will be quite a mapping exercise. I'm wondering if the 'Union All' transformation task might work. The help on that task indicates it can have multiple inputs. If I map multiple inputs from my XMLSrc outputs, then maybe this will work. Then I can have about 10 Union All tasks to map to my 10 SQL tables. Thoughts? I'm hoping I understand Union All correctly, as the help doesn't indicate that it does anything unusual.
thanks.
|||

Union All unions all its input rows, but a) the column metadata will be built up based only on the primary input (the first one hooked up to Union All), and b) it does not "join" on key columns, which you may need in order to merge that hierarchical data from the XML file into one destination without losing your data logic. So unless your multiple XMLSrc outputs have an identical column structure, Union All won't fit your need, I think.

Thanks

Wenyang

|||

I have a similar problem, except the XML source outputs would be fine if I could just reference the id of the parent node.

Example:

<Property>
<PropertyId>TheIdINeed</PropertyId>
....
....
<Buildings>
<Building id="123">
.....
......
<Rooms>
<Room id ="ABC">
.....
.....
<Beds>
<Bed id="cfg">
.....
.....
</Bed>
</Beds>
</Room>
</Rooms>
</Building>
</Buildings>
</Property>

My problem is that I can't get the PropertyId from the XML Source output for Building, Room, or Bed.
I have tried to make a reference in the XSD and a new external column, but that was just Null.

Best regards
Claus
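If the data can pass through SQL Server 2005's xml type, the shredding discussed above can also be expressed in T-SQL with the nodes() method, and the XQuery parent axis ('..') is one answer to Claus's question about reaching a parent's id. A sketch against a cut-down version of his document (an illustration under those assumptions, not the SSIS solution):

-- Each Room row reaches back up the tree for its ancestors' ids
DECLARE @x xml
SET @x = N'<Property>
  <PropertyId>TheIdINeed</PropertyId>
  <Buildings>
    <Building id="123">
      <Rooms><Room id="ABC" /></Rooms>
    </Building>
  </Buildings>
</Property>'

SELECT
    r.value('(../../../../PropertyId)[1]', 'varchar(20)') AS PropertyId,
    r.value('(../../@id)[1]', 'varchar(10)')              AS BuildingId,
    r.value('@id', 'varchar(10)')                         AS RoomId
FROM @x.nodes('/Property/Buildings/Building/Rooms/Room') AS t(r)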