
Friday, March 30, 2012

Importing XML File using SSIS (DTS)

I can use MS Access to import an XML file and it builds the columns and loads the data. I am trying to use SQL Server 2005 to do the same thing, and there doesn't seem to be a way to easily accomplish the task that works so well in Access.

Has anyone tried this? If so, and you were successful, what were the steps to set up the package?

Thanks.

Have you tried using the XML Source adapter?

- Grab a Data Flow task from the toolbox

- Double-click on the added task

- Double-click on the XML Source in the data flow toolbox

- Double-click the added component on the diagram

- Set up your XML source

HTH.

|||

Hi HTH,

After reading your post, I tried to do the same thing using the XML below:

<country
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="country.xsd">
<name>France</name>
<population>59.7</population>
</country>

and I even generated the XSD using SSIS and got this:

<?xml version="1.0"?>
<xs:schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:element name="country">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="name" type="xs:string" />
<xs:element minOccurs="0" name="population" type="xs:decimal" />
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:schema>

Now, after pointing things correctly in my connection manager, when I go to my columns all I get is one column, which says Name.

Why is that? Ideally I should get Name and Population, right? Can you advise?

|||

Try wrapping the xml in a root node:

<root>
<country
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="country.xsd">
<name>France</name>
<population>59.7</population>
</country>
</root>

Thanks
Mark

|||

Mark,

Thanks It worked.

So the impression I get is that all XML files should be wrapped in a root node. Is that true?
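
For what it's worth, any well-formed XML document has exactly one root element; what the wrapping adds here is an outer element, so that <country> becomes a repeating child the XML Source can expose as a row set. As an aside, the same wrapped shape can also be shredded with plain T-SQL via the xml type's nodes() method. A minimal sketch (the literal XML stands in for loading the file; column lengths and types are assumptions):

DECLARE @x xml;
SET @x = N'<root>
  <country><name>France</name><population>59.7</population></country>
</root>';

-- Each /root/country element becomes one row.
SELECT c.value('(name/text())[1]', 'nvarchar(100)') AS name,
       c.value('(population/text())[1]', 'decimal(9,2)') AS population
FROM @x.nodes('/root/country') AS t(c);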


Wednesday, March 28, 2012

Importing text file

Hi all,
I'm importing a text file into a SQL table. The catch is, I want to run a UDF on one of the columns before it gets added to the table. I would like to do the import and the conversion in one step, and I'd prefer to use BULK INSERT, or bcp if I have to, as this will run frequently on the data. ANY help will be greatly appreciated.

Rival

|||
Your best bet would be to use SQL Server Data Transformation Services (DTS),
as you would be able to perform the transformations required and import all
in one step. For more information refer to Data Transformation Services in
SQL Server Books Online (BOL). A good DTS reference site is www.sqldts.com.
--
PETER WARD
WARDY Inc.
www.wardyinc.com
|||

Rival,
Bear in mind that some UDFs will not be set-based, i.e. they will operate row by row and will be poor performers. Look into DTS, or bcp into a staging table and then run a single INSERT...SELECT statement to your destination.
Mark Allison, SQL Server MVP
http://www.markallison.co.uk
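
For illustration, a minimal T-SQL sketch of that staging-table approach; the table names and the UDF dbo.CleanValue are hypothetical:

-- Bulk load the raw text file into a keyless staging table first.
BULK INSERT dbo.Staging
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Then apply the UDF once, in a single set-based append.
INSERT INTO dbo.Destination (Col1, Col2)
SELECT Col1, dbo.CleanValue(Col2)
FROM dbo.Staging;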

Monday, March 26, 2012

Importing Selected csv data into SQL tables

Hi,

I hope you can help me.

I've got a csv that I need to import. It's about 74 columns wide and 2500 rows long. What I need to do is select the first 3 columns from the csv and put them into one table, and then put every 7 columns (plus the 1st column as the primary key) into a different table.

I've not explained that very well, basically it's like this...

(Imagine a spreadsheet grid: columns A through P, rows 1 through 5.)

Table 1 Data

cs# A1
fe B1
cl C1

Table 2

cs# A1
desc D1
date E1
pay F1
amount G1
vat H1
total I1
cReq J1

I'm not really sure where to start. I want to import the data nightly into the tables. I've tried the Import Wizard, but I can't seem to select the data I want and exclude the data I don't.

Steve

Which version of SQL Server are you using? If it's 2005, you should be able to do this neatly using Integration Services (SSIS).

For 2000 you might be best off dumping the whole thing into a holding table and then manipulating the data from there. This could be handled by several steps of a SQL Job.

HTH!
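
For reference, a rough T-SQL sketch of the holding-table approach; all table, column, and path names are hypothetical, and dbo.CsvHolding is assumed to have been created with one column per CSV field:

-- 1. Bulk load all 74 columns into the wide holding table.
BULK INSERT dbo.CsvHolding
FROM 'C:\data\nightly.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- 2. The first three columns go to the first table.
INSERT INTO dbo.Table1 (cs, fe, cl)
SELECT ColA, ColB, ColC FROM dbo.CsvHolding;

-- 3. A block of seven columns, keyed by column A, goes to the second table.
INSERT INTO dbo.Table2 (cs, descr, dt, pay, amount, vat, total, cReq)
SELECT ColA, ColD, ColE, ColF, ColG, ColH, ColI, ColJ FROM dbo.CsvHolding;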

|||

Thanks for the reply,

Yeah, I'm using 2005 SP2. Where can I find Integration Services? I can't seem to find it.

Steve

|||

You need to install SSIS (Integration Services) at the time you install SQL Server 2005 (you choose which services to install). Go to Start > All Programs > Microsoft SQL Server 2005 > Configuration Tools > SQL Server Configuration Manager; there you can see the list of SQL Server 2005 services. Check whether SSIS is present, and install it if not. After installing, connect with SSMS, choosing Integration Services instead of Database Engine to connect to SSIS.

Thanks
|||

Follow Deepak's instructions to get it installed if you haven't already. Then you can open up Business Intelligence Development Studio (a Visual Studio skin) and select an Integration Services project. This will give you a graphical interface in which you can drag and drop different data sources (e.g. Excel and SQL Server) and data flow methods (e.g. Copy Column and Export Column).

Take a look at the Books Online walkthroughs; they will give you a flavour of the different tasks you can achieve and hopefully a good steer on what to do.

Good luck!

|||

Thanks very much.

I feel more at home in Visual Studio!

I'll probably be back when I get stuck again.

Steve

Friday, March 23, 2012

Importing NULL into table SQL Server 2005

I've tried to import a text file into a table using the Import/export wizard.

My problem starts with some columns that should have NULL values (they come through as zero-length strings), but the wizard doesn't recognize them as NULL.

How do I solve the problem?

Thanks,

Mich

First, a zero-length string is not NULL; NULL is a special marker for a missing value.

If you want NULLs to be inserted, open up the package that the Import/Export Wizard created for you, edit it, and add a Derived Column transformation before the destination.

Use an expression similar to this for each of your columns:

[myColumn] == "" ? NULL(DT_WSTR,20) : [myColumn]

I'm going by memory, but I believe that would work. The above assumes myColumn has a length of 20 characters.
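
As an aside, if the data is loaded into a staging table first, the T-SQL analogue is a simple NULLIF; the staging table name here is hypothetical:

UPDATE dbo.Staging
SET myColumn = NULLIF(myColumn, '');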

|||

Thank you for your answer.

My knowledge of the import/export wizard is limited.

I would like to know whether the expression should be added via the "Suggest Types" button or somewhere else.

Is the expression related to the destination column names or to the columns the wizard created?

I'd appreciate a step-by-step guide on how to do it, since I found the SQL help file very unhelpful.

Thank you again.

Wednesday, March 21, 2012

importing from access error

Let me describe my situation

I am running SQL Server 2005. I created all the tables/columns/keys I needed and exported the data to Access.

In Access, I add all the data for the tables. Then I use SSIS to import and append the data back into SQL Server 2005.

I get this error:
-
- Pre-execute (Error)
Messages
Error 0xc0202009: Data Flow Task: An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
(SQL Server Import and Export Wizard)
Error 0xc0202025: Data Flow Task: Cannot create an OLE DB accessor. Verify that the column metadata is valid.
(SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component "Destination - tbl_name" (19) failed the pre-execute phase and returned error code 0xC0202025.
(SQL Server Import and Export Wizard)
--

I can use the import tool (SSIS) to create new tables, but not to append the data. Creating new tables loses information such as field lengths, so this is not an option.

any help would be greatly appreciated.

Thanks!

Could you verify that the metadata of your existing tables matches the tables you want to import? The error message hints at a problem in that area.

Thanks.

|||

Yeah... turns out the metadata was in error. But the Access DB's metadata is the result of an export of the SQL Server DB.

Why doesn't Microsoft let you import what you export? It just seems to make sense to me.
|||

It is because the data type spectrum is much richer for SQL Server than for Access. By transferring data from SQL Server to Access you are basically downgrading the richness of your data, and that makes the reverse process difficult without user intervention. For example, several distinct SQL Server integer types may all surface as one generic Access numeric type, so the round trip cannot know which original type to restore.

Even for equally rich data type spectrums of heterogeneous data sources, you will end up having to map more than one source type to a single destination type. When going in the opposite direction you have to pick one of the possible types as a best match, and the best match will not always be the proper one.

HTH.

Monday, March 19, 2012

Importing Excel Data

Here is the scenario: I have an Excel spreadsheet that contains 182 columns, and I need to move this data into a semi-normalized database for reporting. The SQL Server database schema has 11 tables. Some of the tables are going to use identity columns for their PK; other tables use a value that comes from this spreadsheet for their PK values.

Anyway, I have never done a DTS package of any significance before, and know I most likely need to write some VBScript to handle sticking data into the proper data tables, etc.

I am just hoping someone can point me at a good resource or give me an alternative means of doing this (the process will need to happen whenever a new Excel spreadsheet is dropped into a folder, or on a schedule, either one). I would love to write some C# code to handle these things, but a DTS package would probably be best; I just don't know where to start.

Thanks,

|||

Here's a generic Microsoft KB article:

http://support.microsoft.com/default.aspx?scid=kb;en-us;Q321686

|||

Darrell, thank you for taking the time to respond, I really appreciate it; I will have a look. Thanks!
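
As one possible starting point before building the full DTS package, a worksheet can be pulled into a holding table with an ad hoc OPENROWSET query. A sketch only: the file path, sheet name, and holding table are assumptions, and 'Ad Hoc Distributed Queries' must be enabled on the server:

SELECT x.*
INTO dbo.ExcelHolding
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\data\upload.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]') AS x;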

importing excel column with multiple values separated by '/'

I would like to import two columns from an Excel file into a SQL Server
table as an area code / time zone lookup.
The area code column sometimes has multiple area codes in the area code
cell. eg. 207/208/209.
What is a good way to import those two columns so that 3 table rows are
created for each of those Excel rows that contain these multiple values
separated by the '/' character?
Thank you,
Greg

|||
Hi Greg,
Some useful techniques are discussed at
http://www.sommarskog.se/arrays-in-sql.html
Hugo Kornelis, SQL Server MVP

|||

Thank you Hugo, I'll take a look!
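
For reference, a minimal sketch of one such technique: a numbers-table split in T-SQL. The table and column names are hypothetical, and the area code string is assumed to be under 100 characters:

WITH Numbers AS (
    SELECT TOP (100) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects
)
SELECT s.TimeZone,
       SUBSTRING(s.AreaCodes, n.n,
                 CHARINDEX('/', s.AreaCodes + '/', n.n) - n.n) AS AreaCode
FROM dbo.AreaCodeStaging AS s
JOIN Numbers AS n
  ON n.n <= LEN(s.AreaCodes)
 AND (n.n = 1 OR SUBSTRING(s.AreaCodes, n.n - 1, 1) = '/');
-- '207/208/209' yields three rows: 207, 208, and 209.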

Importing Error

Hi all,

I am facing a problem importing a CSV data file into a table.


For example:


The CSV file, which has 2 columns and 4 records, is as follows:

EmpNo EmpName

1 a
2 b
3 c
1 a

I created a new table called T4; its structure is:

EmpNo varchar(50), primary key
EmpName varchar(50), Allow Nulls disabled

Now, when I import the CSV data into T4, the import cannot finish; it throws an error. When I remove both constraints (the primary key and the Allow Nulls restriction) from T4, the import succeeds.

My question: with the structure above (primary key in place), can the import load the first 3 rows and only then report the row that violates the primary key, instead of failing outright? Is this possible? If so, please suggest how to solve the problem. Can I use BULK INSERT or SQL scripts to solve it?

Thanks in advance
Karna

I believe that I understand your current situation, but I do not think I understand what it is you are trying to do.

If you're trying to insert the three "good rows" and "fail" the one "bad" row without failing the package as a whole, you may be able to accomplish this by adding an error output to the destination to which you're writing these records. Then the "bad" records will be redirected to that portion of the data flow, where you can count them, log them, or ignore them if you want.

Is this what you're trying to accomplish? If so, please let us know if this technique gives you the solution you need. If not, please see if you can rephrase your goal, and I can try again.

|||

Hi Matthew,

Thanks for your reply. I am executing the wizard in SQL Server 2005. The actual problem: I am preparing a job to import a CSV data file into a table. The table already has a primary key and Allow Nulls disabled, while the CSV file contains duplicate records and empty rows. The import job does not complete successfully; it throws an error. My question is, if the duplicate data lies in the 1000th record, let the job succeed for records up through the 999th, and then report and log the error for the 1000th record. Please suggest how to make the job succeed.

Thanks in advance
Karna

|||

I'm honestly not certain how I would go about doing that. For most situations where I'm using SSIS, the behavior you're trying to achieve would be inappropriate for my needs (I would either need the entire load to fail with no rows written to the destination, or else have error rows redirected to a log file or table), so I've never tried to set things up this way. I'm also pretty unfamiliar with the Import/Export Wizard, so that's two strikes against me.

With that said, the first thing I would try would be to update the destination component in the package (I do not know if this is exposed through the wizard; I assume it is not) to turn off fast load. This will slow things down quite a bit, as each row will be inserted into the destination database one by one, but it might give you what you need.

Perhaps someone else has more experience with this problem than I...

|||

Hi Matthew,

Actually I have to run a job that imports a CSV data file into a table. The CSV has duplicate records. My job should not import the duplicates; it should append only the genuine records. If I set the primary key and disable Allow Nulls, then when I run the job it does not even append the genuine (non-duplicate) records; it throws an error and imports no data at all from the CSV file. Please suggest how to solve this issue.

Thanks in advance
Karna

|||

Karna,

I think your best option is to use the error output on the destination component to redirect the errors. If you don't change the default error configuration, the package will fail as soon as the first duplicate hits the destination table. This is nothing different from what Matthew has already suggested.

|||

Hi,

Thanks for the reply. My problem is that I need to import a CSV data file into one of the tables in the database. The import should skip null rows and duplicate rows: the job should check the data already in the table before importing, and import a row only if it does not already exist. Please suggest how to solve this problem.

Thanks in advance
Karna

|||

See this link for a number of different ways to see if a row exists:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1211340&SiteID=1
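
Since the question mentions BULK INSERT and SQL scripts, here is one hedged alternative to the SSIS error output: bulk load into a constraint-free staging table, then append only non-null, de-duplicated rows. All names and the file path are assumptions:

-- Load everything, duplicates and all, into a staging table with no constraints.
BULK INSERT dbo.T4_Staging
FROM 'C:\data\emp.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- Append only rows that are non-null, distinct, and not already in T4.
INSERT INTO dbo.T4 (EmpNo, EmpName)
SELECT DISTINCT s.EmpNo, s.EmpName
FROM dbo.T4_Staging AS s
WHERE s.EmpNo IS NOT NULL
  AND NOT EXISTS (SELECT 1 FROM dbo.T4 AS t WHERE t.EmpNo = s.EmpNo);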

Importing EBCDIC file with COMP3 fields

Hi All,

I have a file with several columns in Comp-3.

I have downloaded the UnPack Decimal component and, as it needs a byte stream (DT_BYTES) as input, I set up the Flat File Source columns accordingly.

But I get this error:

[Flat File Source [2201]] Error: Data conversion failed. The data conversion for column "DTCDC" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

[Flat File Source [2201]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "output column "DTCDC" (2250)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "DTCDC" (2250)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

DTCDC is the first of the packed columns. These are mostly date columns packed into 5 bytes, which should be unpacked to a normal SQL date.

I've tried different locales, as suggested in other threads, but it didn't help.

Can anybody help me with this issue? How could I do it in a VB.NET script, importing the columns as String?

Thanks in advance

Michal

Did you remember to go into the advanced editor and mark the fields as binary with the correct column width?

|||

Hi EWisdahl,

Indeed it helped!

UnPackDecimal does a pretty good job too.

Thanks,

Michal

|||

Just as a reminder: if your question is answered, mark the reply as the answer. It helps out those who are looking for answers to the same type of question later, as well as those who try to look through all of the unanswered posts.

|||

If your file gets too complicated, there is a custom source component at www.aminosoftware.com that is supposed to do conversion to ASCII on the fly, including COMP-3 with REDEFINES, OCCURS, and other nasty EBCDIC nuances.

Wednesday, March 7, 2012

Importing Data from Excel Sheets into SQL Server Database

Hi,

I would like some help on how to convert an Excel file with columns of info into my SQL Server database. The Excel file will be uploaded by a user from my web application. I have no idea where to start, so any form of help is much appreciated. Thanks.

bump

|||

bump

|||

Any help at all? Just to get me headed in the right direction.

Friday, February 24, 2012

Importing Comma Delimited Data.

Hello,
I am trying to import files (hundreds) of comma-delimited data. Because some columns within the data contain commas themselves, those fields have been double-quoted.

Small example.

1234, hello John, mega
2345, " Hello Andrew, Marie", tall

I have tried to use BULK INSERT with a format file in XML and, as suggested by Uri Dimant, a formatted file. But with the bulk insert, the field with the comma breaks into two separate fields, and this sometimes generates an error because the fields are not in sync.

With the formatted file, the lines without quotes do not work correctly. I tried to read the help files on this but could not find a solution to the problem.

In total it's millions of rows, in hundreds of files, for a number of tables.

Can anybody give advice on how to solve this, or where to look in the help system?

Thanks for your time and attention,
Ben Brugman

|||

> Can anybody give advice on how to solve this, or where to look in the help system?
SSIS will better handle enclosed fields. You can use the import data wizard
from SQL Server Management Studio to generate the package. Right-click the
database in object explorer and select Tasks-->Import Data.
Hope this helps.
Dan Guzman
SQL Server MVP

|||

Thanks Dan,
At the moment the target system does not have SSIS installed. (I will ask for SSIS to be installed on the target system.)

Two questions:

Is there another solution? I should think that normal import procedures like BULK INSERT would be able to handle comma-delimited files. (To my knowledge my situation is fairly common.)

Because I already tried two different methods, which I could not get to work, I am a bit apprehensive about trying a third method. Is SSIS going to work for this kind of situation, or is this something I just have to try?

This is actually my first project really working with SQL Server 2005. With the previous version I normally succeeded in importing everything I wanted to import, so at the moment I am feeling a bit stupid. Spending time on methods I could not get to work is rather frustrating.

Thanks for your attention and advice.
Ben Brugman

|||

> Is there another solution? I should think that normal import procedures like BULK INSERT would be able to handle comma-delimited files.

BCP and BULK INSERT do a good job of parsing files as long as field terminators do not occur in the data. For more advanced parsing, SSIS is a better choice.

> Because I already tried two different methods, which I could not get to work, I am a bit apprehensive about trying a third method. Is SSIS going to work for this kind of situation, or is this something I just have to try?

I understand your frustration. However, I can tell you that I've successfully used SSIS to import many CSV files in a loop, using a ForEach task that executes a Data Flow for each file. There is some learning curve if you've never used SSIS before, so you might find an example helpful; there is one showing this technique at http://www.sqlis.com/55.aspx. Just be sure to specify quotes (") as the text qualifier for the flat file connection of the data flow task.

Hope this helps.
Dan Guzman
SQL Server MVP

|||

BULK INSERT and BCP are designed for speed, not versatility in handling
different formats. If you don't want to try SSIS, another solution I often
use is to pre-process the files into an even simpler format before feeding
them to BCP/BULK INSERT.
Linchi

|||

Thanks, all, for your participation.

Dan and Linchi, thanks for your comments.

At the moment we have adopted the solution of dropping the column that was causing all the problems. When we do need the column, we will probably terminate it with an otherwise unused symbol or set of symbols, or quote all strings in this column.

Thanks for your time and attention,
Ben
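
If the pre-processed files end up using an otherwise unused symbol, such as a pipe, as the field terminator, the plain BULK INSERT becomes straightforward again. A sketch with hypothetical names and paths:

BULK INSERT dbo.ImportTarget
FROM 'C:\data\file0001.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n');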