Wednesday, March 28, 2012

Problems importing TXT files

Hello,
I'm fairly new to SQL and hopefully this is a simple question to answer.
I've been trying to import a TXT file into a SQL database and I'm having trouble making it work correctly. It is an ASCII text file where the fields vary in width, from 2 characters up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.).
I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as characters, with column headers of Col001, Col002, etc. I don't want everything as characters or named Col001, etc. Some columns should be INT, NUMERIC(x,x), etc., but every time I change these values in DTS it won't import the data correctly.
Also, FWIW, I have a SQL script I wrote that creates the table with the field lengths, data types, etc. This seems to be going nowhere fast.
What am I doing wrong? Should I be using something other than DTS?
Any suggestions are greatly appreciated.
Thanks,
JB
We have had really good luck with BULK INSERT using tab-delimited data. Maybe you can get it to be tab-delimited using Excel: import the text, then save it as tab-delimited (.TDF).
We are also a VB shop, so on top of that we have developed several different versions of a TEXTIMPORT program: some take the lengths of each SQL column and cut the input data based on that; others look for specific items while cutting up the data. This way we can also deal with odd data issues in various columns (TEXTIMPORT uses a little .INI file that controls specifics like that).
I believe that BULK INSERT and BCP have format files to allow greater flexibility; maybe check BOL (Books Online)...
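As a rough sketch of the BULK INSERT route described above (the table name, column names, types, and file path here are all made-up examples, not anything from the original post):

```sql
-- Hypothetical staging table; names, types, and widths are assumptions.
CREATE TABLE dbo.ImportStage (
    StateCode  CHAR(2),
    City       VARCHAR(32),
    Population INT
);

-- Load a tab-delimited file, e.g. one saved out of Excel.
BULK INSERT dbo.ImportStage
FROM 'C:\data\import.txt'
WITH (
    FIELDTERMINATOR = '\t',  -- tab between fields
    ROWTERMINATOR   = '\n'   -- newline between rows
);
```

Because the target columns already have real types (INT, etc.), the conversion from the character data in the file happens during the load rather than in a separate step.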
|||Hi Steve,
Thanks for your reply. I wouldn't mind importing it into Excel and saving it in a better format, but unfortunately I have over 100,000 records, which rules Excel out. I think BULK INSERT and BCP can't import files based on character widths, but they could use tabs as you suggested. Using VB to write an import script could be an option, though.
Thanks again,
JB
|||Are the columns fixed length, or delimited by commas?
|||The columns are fixed length. For example, Column1 is 5 characters long, Column2 is 2 characters long, Column3 is 30 characters long, Column4 is 10 characters long, etc. Also, some of the columns are numeric or integer, but are represented as characters until imported.
Thanks,
JB
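For what it's worth, BULK INSERT and bcp can read a fixed-width layout directly if you describe it in a (non-XML) format file: a prefix length of 0, an empty terminator, and an explicit data length make each field a fixed-width cut. The sketch below uses the example widths above (5, 2, 30, 10); the version number, table, column names, and collation are assumptions:

```
8.0
4
1  SQLCHAR  0  5   ""      1  Column1  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  2   ""      2  Column2  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  30  ""      3  Column3  SQL_Latin1_General_CP1_CI_AS
4  SQLCHAR  0  10  "\r\n"  4  Column4  SQL_Latin1_General_CP1_CI_AS
```

The last field's terminator doubles as the row terminator. The format file is then referenced from the load (hypothetical paths and table name):

```sql
BULK INSERT dbo.ImportStage
FROM 'C:\data\fixedwidth.txt'
WITH (FORMATFILE = 'C:\data\fixedwidth.fmt');
```

This avoids the separate convert-to-tab-delimited step entirely, and the target table's column types handle the character-to-numeric conversion during the load.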
