Showing posts with label txt. Show all posts

Wednesday, March 28, 2012

Importing txt files with sql server 2000

Hi!

I am importing .txt files. How can I check for errors? I have created a
log file, but the problem is that I lose some characters.
I import, for example:

Code
ABC
FZH
JHN

from a text file, but sometimes Code can be 4 characters long.
Right now I import it as 3 characters long. When I add a text file with
the same structure but some rows of length 4, it skips the last
character, and I get nothing in the log file.

please help
xgirl|||So you already have a table after importing items with 3 characters,
and then later you import items with 4 characters?

Perhaps it set the field width to 3 based on what it found in the first
import.

Sounds like you may need to manually widen the field. Open the table
in design view and make sure the field width is large enough for 4
characters (see the ALTER TABLE sketch at the end of this
thread).|||If you are using DTS for the import, you may want to check the
Transform Data Task. If you are using fixed-width delimiting, that
may be the problem.|||I changed from varchar to nvarchar, if that is what you mean, but I still get no
errors about lost characters.

The problem is I have a lot of .txt files, and in the future I will get
the same files with different data. If I automatically import every
file, how can I be sure the data are not longer and I didn't lose some
characters?

thank you
xgirl
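
A hedged footnote on the "widen the field" advice above: the same change can
be made in T-SQL with ALTER TABLE. The table and column names below are
hypothetical, and the new width is only an example.

-- Widen the column so 4-character (or longer) codes are no longer truncated.
ALTER TABLE dbo.ImportedCodes
    ALTER COLUMN Code varchar(10) NOT NULL

-- Quick check after the next import: if the longest value equals the column
-- width, some rows may still have been cut off at the limit.
SELECT MAX(LEN(Code)) AS LongestCode FROM dbo.ImportedCodes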

Importing TXT files problems

Hello,
I'm fairly new to SQL and hopefully this is a simple question to answer.
I've been trying to import a TXT file into an SQL database and I'm having trouble making it work correctly. It is an ASCII text file where the fields vary by the number of characters. This can be 2 characters up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.)
I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as Characters with column headers of Col001, Col002, etc. I don't want everything as characters or Col001, etc. Some columns would be INT, NUMERIC(x,x), etc., but every time I change these values in DTS it won't import the data correctly.
Also, I have an SQL script that I wrote for a table where I can create the field lengths, data type etc., FWIW. This seems to be going nowhere fast.
What am I doing wrong? Should I be using something else than DTS?
Any suggestions are greatly appreciated.
Thanks,
JB
We have had really good luck with BULK INSERT using TAB DELIMITED data.
Maybe you can get it to be TAB DELIMITED using EXCEL - IMPORT the text and
save as .TDF.
We also are a VB shop, so on top of that we have developed several different
versions of a TEXTIMPORT program - some that take the lengths of each SQL
column and cut the input data based on that, others that look for specific
items while cutting up the data. This way we can also deal with odd data
issues for various columns (the TEXTIMPORT uses a little .INI file that
controls specifics like that).
I believe that BULK INSERT and BCP have control files to allow greater
flexibility - maybe check BOL (books online)...
"JonBlack" <anonymous@.discussions.microsoft.com> wrote in message
news:AE462262-F98D-40A1-BD57-6E2066BE2A6D@.microsoft.com...
> Hello,
> I'm fairly new to SQL and hopefully this is a simple question to answer.
> I've been trying to import a TXT file into an SQL database and I'm having
trouble making it work correctly. It is a ASCII text file where the fields
vary by the number of characters. This can be 2 characters up to 40 (STATE
would be 2 characters, CITY is 32 characters, etc.)
> I can import the file with DTS. I go in and select exactly where I want
the field breaks to be. Then it imports everything as Characters with column
headers of Col001, Col002 etc. I don't want everything as characters or
Col001 etc. Some columns would be INT, NUMERIC(x,x), etc. but everytime I
change these values in DTS it won't import the data correctly.
> Also, I have an SQL script that I wrote for a table where I can create the
field lengths, data type etc., FWIW. This seems to be going nowhere fast.
> What am I doing wrong? Should I be using something else than DTS?
> Any suggestions are greatly appreciated.
> Thanks,
> JB
|||Hi Steve,
Thanks for your reply. I wouldn't mind importing it into EXCEL and saving it in a better format. Unfortunately I have over 100,000 records, which rules EXCEL out. I think BULK INSERT and BCP can't import files based on character widths but could use tabs as you suggested. Using VB to write an import script could be an option though.
Thanks again,
JB
-- Steve Z wrote: --
We have had really good luck with BULK INSERT using TAB DELIMITED data.
Maybe you can get it to be TAB DELIMITED using EXCEL - IMPORT the text and
save as .TDF.
We also are a VB shop, so on top of that we have developed several different
versions of a TEXTIMPORT program - some that take the lengths of each SQL
column and cut the input data based on that, others that look for specific
items while cutting up the data. This way we can also deal with odd data
issues for various columns (the TEXTIMPORT uses a little .INI file that
controls specifics like that).
I believe that BULK INSERT and BCP have control files to allow greater
flexibility - maybe check BOL (books online)...
|||Is the column a fixed length or delimited by commas?
JonBlack wrote:
> Hello,
> I'm fairly new to SQL and hopefully this is a simple question to answer.
> I've been trying to import a TXT file into an SQL database and I'm having trouble making it work correctly. It is an ASCII text file where the fields vary by the number of characters. This can be 2 characters up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.)
> I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as Characters with column headers of Col001, Col002 etc. I don't want everything as characters or Col001 etc. Some columns would be INT, NUMERIC(x,x), etc. but every time I change these values in DTS it won't import the data correctly.
> Also, I have an SQL script that I wrote for a table where I can create the field lengths, data type etc., FWIW. This seems to be going nowhere fast.
> What am I doing wrong? Should I be using something else than DTS?
> Any suggestions are greatly appreciated.
> Thanks,
> JB
|||The columns are fixed length. For example Column1 is 5 characters long, Column2 is 2 characters long, ColumnC is 30 characters long, ColumnD is 10 characters long, etc. Also, some of the columns are numeric and integer as well but are represented by characters until imported.
Thanks,
JB
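
Since the columns are fixed length, one hedged option is a bcp-style format
file, which lets BULK INSERT cut each record at fixed offsets and load the
pieces into typed columns (character data is converted to the target INT or
NUMERIC types during the load). The three-column layout, names, and paths
below are hypothetical placeholders for the real file layout.

-- Contents of C:\data\addresses.fmt (a bcp format file; the fixed-width
-- fields have no terminator except the last, which ends at the line break):
-- 8.0
-- 3
-- 1  SQLCHAR  0  2   ""      1  State  SQL_Latin1_General_CP1_CI_AS
-- 2  SQLCHAR  0  32  ""      2  City   SQL_Latin1_General_CP1_CI_AS
-- 3  SQLCHAR  0  10  "\r\n"  3  Zip    SQL_Latin1_General_CP1_CI_AS

BULK INSERT dbo.Addresses
FROM 'C:\data\addresses.txt'
WITH (FORMATFILE = 'C:\data\addresses.fmt')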


importing txt file to multiple table in sql 2000

Hi There,

I am looking for information on how to import a txt or csv file into
multiple tables in SQL 2000. If you have any kind of information, please
let me know whether we can do this and how.

below is the detail information.

I receive a txt file every day which contains information for 3
different related tables in my SQL 2000 database. Right now we are
keying in the information from the web site (which is linked to the txt
file) to our database, but I am wondering if we can import those
records into the tables.

The header of the file goes to table1, and when we insert the record in
table1 it should generate the auto-identity value (PK), and that PK is
linked to table2 and table3, where the rest of the information from the txt
file goes. For table2 and table3 there are multiple records per txt
file.

In our txt file each row is marked with a row header, like HTC100
with row no. 1, 2, 3, ..., which indicates which table the information
goes to, and 1, 2, ... are the different row numbers.

Please let me know whether we can achieve this task or not.

Thanks for all you help in advance.

Indra.

I have pasted my txt file below:
========

"FHS000",20041029,0900,,"10",1,"TRAILB10_20041029_1B",3,"2.20","Pason-DataHub",,"#
Well 1098831406 Tour 2004/10/29 Trailblazer 10 148",1,"EDR_3-0-10_HF2
ETS 2.2"
"CON000",1,0000,0759
"CON000",2,0800,1559
"CON000",3,1600,2359
"HWI010","0312857","COMPTON BRANT 15-7-18-24","COMPTON PETROLEUM
CORP.","TRAILBLAZER DRILLING
CORP.","15-07-018-24W4","100/15-07-018-24W4/00","HANK
PARANYCH","CURTIS FIESEL",20041029,,,"10",20041027,0600,,,"148","DD04485","VERT.","NO",,
"HCO030",1,"Daily Walk Around Inspection","HP","CF"
"HCO030",2,"Detailed Inspection - Weekly (using checklist)","HP","CF"
"HCO030",3,"H2S Signs Posted (if required)",,
"HCO030",4,"Well License & Stick Diagram Posted","HP","CF"
"HCO030",5,"Flare Lines Staked","HP","CF"
"HCO030",6,"BOP Drills Performed","HP","CF"
"HCO030",7,"Visually Inspect BOP's - Flarelines and Degasser
Lines","HP","CF"
"HDC040",1,"Rig Site Health and Safety Meeting (one/crew/month)","CF"
"HDC040",2,"C.A.O.D.C. Rig Safety Inspection Checklist
(one/rig/month)","CF"
"HDC040",3,"Mast Inspection Before Raising or Lowering","CF"
"HDC040",4,"Crown Saver Checked","CF"
"HDC040",5,"Motor Kills Checked","CF"
"HFU050",2300,2100,,
"HWE060",-5,"Deg C","COOL","WEST","SLIPPERY",,
"HCS070",1,177.8,,"mm",25.3,"STELCO","J-55",8,108.44,3.84,108.44,
"HCS070",2,114.3,,"mm",14.14,"STELCO","J-55",72,979.50,3.84,979.0,
"HDP080",1,127,79.4,"kg/m","E",57,127,"mm","3 1/2 IF",10,"DC",
"HDP080",2,89,19.7,"kg/m","E",68,120,"mm","3 1/2 IF",15,"DP",
"HPU090",1,"F-800","EMSCO",254,"mm",,,,
"HPU090",2,"F-800","EMSCO",254,"mm",,,,
"HTC100",1,"Rig up and tear down"
"HTC100",2,"Drill Actual"
"HTC100",3,"Reaming"
"HTC100",4,"Coring"
"HTC100",5,"Condition Mud & Circulate"
"HTC100",6,"Trips"
"HTC100",7,"Rig Service"
"HTC100",8,"Repair Rig"
"HTC100",9,"Cut off drilling line"
"HTC100",10,"Deviation Survey"
"HTC100",11,"Wire Line Logs"
"HTC100",12,"Run Case & Cement"
"HTC100",13,"Wait on Cement"
"HTC100",14,"Nipple up B.O.P."
"HTC100",15,"Test B.O.P."
"HTC100",16,"Drill Stem Test"
"HTC100",17,"Plug Back"
"HTC100",18,"Squeeze Cement"
"HTC100",19,"Fishing"
"HTC100",20,"Directional Work"
"HTC100",21,"Safety Meeting"
"HTC100",24,"WOD"
"HSS110",1,1,"SWACO","N","110",,"84",,
"HPA130","COMPTON BRANT 15-7-18-24",20041029,"COMPTON PETROLEUM
CORP.","TRAILBLAZER DRILLING CORP.","CURTIS
FIESEL","10","ALBERTA","N",253
"TCP130",1,,,,"kPa",140,,,,"mm",,
"TCP130",2,,,,"kPa",140,,,,"mm",,
"TCP130",3,,,,"kPa",140,,,,"mm",,
"TTL160",1,1,0.00,0.25,0.25,21,"SAFETY MEETING WITH TONG HAND"
"TTL160",1,2,0.25,1.75,1.50,12,"RIG TO AND RUN CASING"
"TTL160",1,3,1.75,2.00,0.25,7,"RIG SERVICE"
"TTL160",1,4,2.00,2.50,0.50,5,"CONDITION MUD & CIRC."
"TTL160",1,5,2.50,2.75,0.25,21,"SAFETY MEETING WITH CEMENTERS"
"TTL160",1,6,2.75,3.50,0.75,12,"RIG TO AND CEMENT CASING"
"TTL160",1,7,3.50,6.00,2.50,1,"SET SLIPS, TEAR OUT RIG, CLEAN TANKS"
"TTL160",1,8,6.00,8.00,2.00,24,"WAIT ON DAYLIGHT/TRUCKS"
"TTL160",1,9,,,,,"CEMENT WITH BJ USING 13 TONNES OF BVF-1500 NP + .7%
FL-5,GIVING 15.5 m3 OF GOOD"
"TTL160",1,10,,,,,"SLURRY @. 1718 kg/m3,PLUG BUMPED & HELD @. 03:30 HRS
OCT 29/04."
"TTL160",1,11,,,,,"RIG RELEASED @. 08:00 HRS OCT 29/04"
"TTL160",1,12,,,,,"MOVE TO 12-3-18-25W4"
"TDI170",1,"JEFF CASE",8,10,475,"Deg C",,,"RUNNING CASING",,,,,
"TLN175",1,"VISUALLY INSPECT PINS, RAMS AND STOOLS PRIOR TO LAYING
OVER DERRICK"
"TPA180",1,1,"DRILLER",647172865,"JEFF CASE",8,,,"JC"
"TPA180",1,2,"DERRICK HAND",648519056,"BRYAN VANHAM",8,,,"BV"
"TPA180",1,3,"MOTOR HAND",651056533,"NEIL WILLIAMS",8,,,"NW"
"TPA180",1,4,"FLOOR HAND",640352662,"TARAS WOITAS",8,,,"TW"
"TPI190",1,"REG",25,,,,,,
"TPI190",2,"REG",25,,,,,,
"TPI190",3,"REG",25,,,,,,
=====

[posted and mailed, please reply in news]

Indra (itimilsina@.savannaenergy.com) writes:
> I am looking for information on how to import the txt or csv file to
> the multiple table in sql 2000. If you have any kind of inf. please
> let me know wheather we can do this and how.
> below is the detail information.
> I received txt file every day which contain the information from 3
> different related table in my sql 2000 database. Right now we are
> keyin the information from the web site (which is link to the txt
> file) to our database, but i am wondering if we can import those
> record in the tables.
> the header of the file goes to table1 and when we insert the record in
> table1, it should generate the autoidentityrecord (PK), and that PK is
> link to other table2 and table3 where rest of the information from txt
> file goes. For table2 and table3 there are multiple record per txt
> files.
> in our txt file each row is separated with row header, like HTC100
> WITH ROW NO. 1,2,3.., which indecate this information goes to table
> and 1,2...are the different row number.
> Please let me know whether we can achive this task or not.

Of course, it is possible. However, as far as I can see only by means of
writing a program that reads and parses the file. The standard tools for
loading files into SQL Server are BCP and DTS. BCP cannot cope with your
file, because there is a mix of record formats. BCP can only import files
with uniform records.

DTS is more versatile than BCP, but as far as I can see, you will still
have to write code to have DTS import the file. I need to add the
disclaimer that I have zero experience of DTS.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||I agree with Erland. Could it be done with BCP and TSQL? Yes. Would it be
efficient, maintainable, error-resilient code? No.

It would be far better to write a piece of code or, better, get the sender to
supply X number of files, one for each table. The original creator of the
file obviously has control over the file format.

Danny

"Erland Sommarskog" <esquel@.sommarskog.se> wrote in message
news:Xns95998D7498D77Yazorman@.127.0.0.1...
> [posted and mailed, please reply in news]
> Indra (itimilsina@.savannaenergy.com) writes:
>> I am looking for information on how to import the txt or csv file to
>> the multiple table in sql 2000. If you have any kind of inf. please
>> let me know wheather we can do this and how.
>>
>> below is the detail information.
>>
>> I received txt file every day which contain the information from 3
>> different related table in my sql 2000 database. Right now we are
>> keyin the information from the web site (which is link to the txt
>> file) to our database, but i am wondering if we can import those
>> record in the tables.
>>
>> the header of the file goes to table1 and when we insert the record in
>> table1, it should generate the autoidentityrecord (PK), and that PK is
>> link to other table2 and table3 where rest of the information from txt
>> file goes. For table2 and table3 there are multiple record per txt
>> files.
>>
>> in our txt file each row is separated with row header, like HTC100
>> WITH ROW NO. 1,2,3.., which indecate this information goes to table
>> and 1,2...are the different row number.
>>
>> Please let me know whether we can achive this task or not.
> Of course, it is possible. However, as far as I can see only by means of
> writing a program that reads and parses the file. The standard tools for
> loading files into SQL Server are BCP and DTS. BCP cannot cope with your
> file, because there is a mix of record formats. BCP can only import files
> with uniform records.
> DTS is more versatile than BCP, but I as far as can see, you will still
> have to write code to have DTS to import the file. I need to add the
> disclaimer that I have zero experience of DTS.
> --
> Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
> Books Online for SQL Server SP3 at
> http://www.microsoft.com/sql/techin.../2000/books.asp|||Danny (istdrs@.flash.net) writes:
> I agree with Erland. Could it be done with BCP and TSQL? Yes.

No. :-) Since there is a variable number of fields on each row, BCP would
be lost.

> It would be far better to write a piece of code or better get the sender
> to supply X number of files, one for each table. The original creator
> of the file obviously has control over the file format.

Yes, fixing the file format may very well be a good idea.

Interestingly enough, it could still be one file, if the shorter rows were
padded with extra fields. Note that just adding ,,,,, would not be enough,
since the " characters are also delimiters as far as BCP is concerned, so the usage of
" must be consistent from record to record. (Which does not seem to be the
case in the current file.)
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Hi Erland,

I am still trying to solve this problem of importing the txt file and
manipulating the data to insert into different tables in SQL 2000. I have
a problem inserting an nvarchar value coming from the txt file into a
datetime field in a SQL table.

If you look through my txt file, there are rows starting with TTL160, and
col4 and col5 have data like 1.00 and 12.23, etc. (which is in
nvarchar). I need to insert these values as 1:00 or 12:23 into a datetime
field in another table. I could use REPLACE(column, '.', ':') to
go from 12.23 to 12:23, but when I insert this into the SQL datetime field
it gives me the error "Arithmetic overflow error converting expression
to data type datetime". I tried the CAST function, still the same error.
Could Microsoft have a look please?

Thanks for help.

Indra.
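
A hedged sketch of the conversion being attempted. The exact cause of the
overflow is hard to diagnose from here, but converting the string form
works; if the staging column is numeric rather than nvarchar, convert it to
text before the REPLACE so the datetime conversion sees '12:23' and not a
number. Column names are hypothetical.

DECLARE @raw nvarchar(10)
SET @raw = N'12.23'

-- '12.23' -> '12:23' -> 1900-01-01 12:23:00.000
SELECT CAST(REPLACE(@raw, '.', ':') AS datetime)

-- For a numeric staging column, force it through text first:
-- CAST(REPLACE(CONVERT(varchar(10), col5), '.', ':') AS datetime)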

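For the original routing problem, a minimal T-SQL-only sketch of one common
workaround: bulk load every line into a single-column staging table, then
route rows by their record-type prefix. This is not what Erland recommended,
and all table names and paths below are hypothetical.

-- Each raw line lands in one wide column; the field terminator is a
-- character that never occurs in the file.
CREATE TABLE dbo.RawTour (line varchar(8000))

BULK INSERT dbo.RawTour
FROM 'C:\data\tour.txt'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')

-- Insert the header row first and capture the identity PK ...
DECLARE @tourId int
INSERT INTO dbo.TourHeader (rawLine)
SELECT line FROM dbo.RawTour WHERE line LIKE '"HWI010"%'
SET @tourId = SCOPE_IDENTITY()

-- ... then route detail rows to the child tables, parsing fields out of
-- each line with SUBSTRING/CHARINDEX as needed.
INSERT INTO dbo.TourTimeLog (tourId, rawLine)
SELECT @tourId, line FROM dbo.RawTour WHERE line LIKE '"TTL160"%'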

Wednesday, March 21, 2012

importing from .txt

orange May 14, 5:16 pm
Newsgroups: comp.databases.ms-access
From: "orange" <orange...@.mail.ru> - Find messages by this author
Date: 14 May 2005 14:16:49 -0700
Local: Sat,May 14 2005 5:16 pm
Subject: import complex data

I've got a database in a .txt file similar to this:

BookName;Author;Year;ReviewedBy;Rating;Pages
Nemesis;Isaac Asimov;1989;13,31,24;good;110
Sense & Sensibility;Jane Austen;1970;45,32;great;120
The Bicentennial Man;Isaac Asimov;1965;14;excellent;124
...

the data in field 'ReviewedBy' should be connected to names of people
that read the book. I'd like to have a separate table in Access like
this:

Id;Reviewer
13;Bob
...
24;Jim
...
31;Tom

How do I import that data into the database?

Hi

This is a SQL Server newsgroup, therefore you may want to post to a
newsgroup more specific to your subject.

You can import a text file from the Insert/Table menu. For your 'ReviewedBy'
field you can define relationships using the Tools/Relationships menu. This
will make sure that only existing users are referenced.
To copy them into subsequent tables you can write Append queries that
produce SQL like INSERT INTO NewTable SELECT Id, ReviewedBy FROM
LoadedDataTable if you view the SQL.

HTH

John

<orangeKDS@.mail.ru> wrote in message
news:1116107071.499762.101180@.g43g2000cwa.googlegr oups.com...
> orange May 14, 5:16 pm
> Newsgroups: comp.databases.ms-access
> From: "orange" <orange...@.mail.ru> - Find messages by this author
> Date: 14 May 2005 14:16:49 -0700
> Local: Sat,May 14 2005 5:16 pm
> Subject: import complex data
> I've got a database in .txt file similar to this:
> BookName;Author;Year;ReviewedBy;Rating;Pages
> Nemesis;Isaac Asimov;1989;13,31,24;good;110
> Sense & Sensibility;Jane Austen;1970;45,32;great;120
> The Bicentennial Man;Isaac Asimov;1965;14;excellent;124
> ..
> the data in field 'ReviewedBy' should be connected to names of people
> that read the book. I'd like to have a separate table in Access like
> this:
> Id;Reviewer
> 13;Bob
> ..
> 24;Jim
> ..
> 31;Tom
> How do I import that data into database?
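
A hedged sketch of the append-query idea John describes (plain SQL; the table
and column names are hypothetical, and the comma-separated ReviewedBy values
would still need to be split into one row per reviewer first):

INSERT INTO Reviewers (Id, Reviewer)
SELECT DISTINCT l.ReviewerId, l.ReviewerName
FROM LoadedDataTable AS l
WHERE l.ReviewerId NOT IN (SELECT Id FROM Reviewers)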

Importing Fixed-width txt file - problem

Hi,

I was trying to import a fixed-width file into a SQL 2005 table.
The total record length is 1500. I was trying to import it into a single column.

The strange thing that's happening is: SSIS is inserting only the first 32 chars of the record and the rest are gone. I tried using nvarchar(max) and varchar(max) but to no avail.
I think something somewhere is going wrong but I was unable to figure it out. Earlier I was able to load a similar file into a single-column table.

My Header row delimiter is {CR}{LF}
The preview pane shows the complete record but when it transfers to the table, I'm getting 32 chars only.


Can anybody suggest any ideas to figure this out?


Thanks,
Siva.

Are there NULL characters in your data?|||Yes... NULLs are there.|||You need to get rid of the NULLs. NUL characters act as string terminators, which is likely why your data "stops" at a specific position on each record.

Your other alternative is to read in the data as binary using a script task, searching for NULLs and replacing them with spaces (or some other valid string character of your choosing). If you write this code, let me know. I need to do just this and haven't had the time to write it.

Monday, March 12, 2012

Importing data using import service/DTS

Hi,

I'm using SQL Server 2005 and I need to import about 15000 records from a txt file which is delimited using the " | " character. Part of the problem is that the file has about 109 fields. The other part of the problem is that there aren't any headers defined. The field names and their respective data types are defined in a separate Excel sheet. When I try to import it using the SQL Server 2005 import wizard, I have to modify the names and lengths manually and change the data types, which is a time-consuming job. My boss has claimed that he did it in SQL Server 2000 without using the BCP utility and that he got the table columns from the Excel file (he didn't have to modify it manually; somehow he imported it) but he can't really remember!

Now I'm really confused, and a bit ashamed as well, as he is around 62 years old.

Any idea of how it could be done using Sql Server 2005 or Sql Server 2000?

Thanks in advance.

Hi,

Can somebody please reply to this thread .......

Thanks

|||

Using the Bulk Insert task, you can specify a format file. This could be easily generated from the Excel sheet, and would be quicker than configuring the text file connection manager. Note this is not using the Wizard, but the full designer.

The DTS Wizard had no way of using an Excel file or other format file either.

|||

DarrenSQLIS wrote:

Using the Bulk Insert task, you can specify a format file. This could be easily generated from the Excel sheet, and would be quicker than configuring the text file connection manager. Note this is not using the Wizard, but the full designer.

The DTS Wizard had no way of using an Excel file or other format file either.

Thanks Darren. Can you please explain how I can do it, and which full designer? I'm sorry, but I'm not used to SQL Server 2005.

|||

The designer, aka Business Intelligence Development Studio (BIDS) aka Visual Studio (VS).

Create a new Integration Services project type. This is the SSIS package designer.

Drag tasks from the toolbox, and I suggest you start with the Bulk Insert Task.

There is a tutorials section in the SQL help, Books Online, that may be worth a look as well.
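
A hedged sketch of the statement the Bulk Insert task ends up running,
assuming a hypothetical path and a target table whose 109 columns already
match the file's field order:

BULK INSERT dbo.ImportTarget
FROM 'C:\data\export.txt'
WITH (
    FIELDTERMINATOR = '|',   -- or ' | ' if the spaces are part of the delimiter
    ROWTERMINATOR = '\n'
)

If the column order or data types differ, a format file generated from the
Excel sheet can be supplied via FORMATFILE = '...'.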

Friday, March 9, 2012

Importing data from txt file

I have 30 text files with the same format, and I have to make an SSIS package so that I can populate my permanent tables. Because they are just record-keeping data, we are not very concerned about validating the data. We just want to copy it and dump it in the permanent table, which should mean the least amount of work. I used one file as a sample and did advanced formatting to give column names. Then I made one simple data flow task that takes the flat file and transfers the data to an OLE DB SQL Server database table. Unfortunately, it keeps giving me an error. I believe this is because of NULLs. I want to take care of this without making staging tables and validating data. Any ideas?

Are you running this manually from within your SSIS package? If so, you should get reasonably good information as to the problem in the Execution Results tab.

Are you sure your incoming ASCII fields are set wide enough to capture the data? If not, truncation errors will kill the process.

Are your table fields set up to accept NULLs?

Seth J Hersh

|||

Well, we couldn't find a set solution, so as a workaround we used a VB.Net script to pad the row to a set width. This seems to have solved the problem for now.

Thank You

Anjali

|||

anj755 wrote:

I have 30 text files with the same format, and I have to make an SSIS package so that I can populate my permanent tables. Because they are just record-keeping data, we are not very concerned about validating the data. We just want to copy it and dump it in the permanent table, which should mean the least amount of work. I used one file as a sample and did advanced formatting to give column names. Then I made one simple data flow task that takes the flat file and transfers the data to an OLE DB SQL Server database table. Unfortunately, it keeps giving me an error. I believe this is because of NULLs. I want to take care of this without making staging tables and validating data. Any ideas?

Ummm, it might help if you posted the error. This is a real simple task you are trying to do. (I think.)

Friday, February 24, 2012

Importing data

Hello everybody,
I need some help with importing data from txt to sql.
I have done that with BULK INSERT, but I had some limitations:
- I can't suppress any field in the source or destination.
Can anyone help me?
Thanks
Leandro Loureiro dos Santos|||Leandro,
> I need some help with importing data from txt to sql.
> I have done that with BULK INSERT, but i had some limitations:
> -i cant supress any field in source or destination.
You can do this with a bcp format file, using
bulk insert ...
with (formatfile = 'format_file_path')
Linda
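
A hedged sketch of suppressing a source field this way: in a bcp format file,
a field whose server column number is 0 is read but not loaded. The layout
below assumes a hypothetical semicolon-delimited file where only fields 1
and 3 are wanted.

-- Contents of C:\data\skipcol.fmt:
-- 8.0
-- 3
-- 1  SQLCHAR  0  50  ";"     1  FirstCol   ""
-- 2  SQLCHAR  0  50  ";"     0  Skipped    ""
-- 3  SQLCHAR  0  50  "\r\n"  2  ThirdCol   ""

BULK INSERT dbo.TargetTable
FROM 'C:\data\source.txt'
WITH (FORMATFILE = 'C:\data\skipcol.fmt')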

Sunday, February 19, 2012

Importing a TXT file to MSSQL

Hi, I have a TXT file with several fields with semicolons between the fields.

But after the import is done, the decimal field doesn't appear with a
comma.

In other words, if I type 123,45678 into the database using Enterprise Manager I get exactly 123,45678, but from the
txt file I get 1234568.

What am I missing?
Tks
Carlos Lages
P.S. Even when I put "123,45678" in the txt file, I get "12345678" in
the database.

In your table definition, is the field a character field or numeric? What data type is it? This is just a thought, but maybe during the import job DTS is treating the field as some form of numeric field and ignoring the "," as it is not relevant.

Hope this helps, or am I on the wrong track?

Originally posted by Carlos lages
hi, I have a TXT file with several fields with semicolom between fields.

But after Importing is done, Decimal field Dont appears with
comma.

in other words , if I type in a DataBase using Enterprise Manager 123,45678 I get exactily 123,45678, but from
Txt file I get 1234568.

what do i missing?
Tks
Carlos Lages
ps. Txt files is "12345678" even I put Txt file "123,45678"
I get the same result in database
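
A hedged workaround, assuming the decimal comma is what gets stripped: land
the value in a varchar staging column so the import cannot reinterpret it,
then convert with the comma swapped for a period (names are hypothetical).

-- RawAmount holds the text exactly as it appears in the file, e.g. '123,45678'
SELECT CAST(REPLACE(RawAmount, ',', '.') AS decimal(18,5)) AS Amount
FROM dbo.StagingImport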

Importing a tab delimited file into SQL Server

I am trying to import a tab-delimited txt file into SQL Server using OPENROWSET. However, the file gets imported into SQL, but ONLY into 1 column... The source file really has 3 columns, but when it gets inserted into SQL, everything is put into 1 column.

select * into #tmp from OpenRowset('MSDASQL', 'Driver={Microsoft Text Driver (*.txt; *.csv)};
DefaultDir=\\c:\SQLRSReports;','select * from vwConstrLendingmapLoanDefiCdDesc.txt')

How do I insert the data into 3 separate columns, instead of 1?

Try

BULK INSERT
http://msdn2.microsoft.com/en-us/library/ms188365.aspx

or
BCP
http://msdn2.microsoft.com/en-us/library/aa174646(SQL.80).aspx|||I would like to use OPENROWSET only. Is that possible with OPENROWSET?|||

Code Snippet

CREATE TABLE #newIn
(
Ident varchar(max),
subIdent int identity(1,1),   -- comma was missing in the original
subItem varchar(max)
)

DECLARE @posA int,
@posB int,
@srch char(1),
@in varchar(max)

SET @srch = ','   -- or char(9) for tab

-- #tmp is the single-column table produced by the OPENROWSET import;
-- 'item' must match that column's actual name.
DECLARE csr CURSOR FOR
SELECT item FROM #tmp

OPEN csr

FETCH NEXT FROM csr
INTO @in

WHILE @@FETCH_STATUS = 0
BEGIN

SET @in = LTRIM(RTRIM(@in)) + @srch
IF LEN(@in) > 1
BEGIN
SET @posB = CHARINDEX(@srch, @in)
SET @posA = 1
WHILE @posB > 0
BEGIN
-- each delimited piece becomes its own row, tagged with the source line
INSERT INTO #newIn (Ident, subItem) VALUES (@in, SUBSTRING(@in, @posA, @posB - @posA))
SET @posA = @posB + 1
SET @posB = CHARINDEX(@srch, @in, @posA)
END
END

-- the original snippet was missing this FETCH, causing an endless loop
FETCH NEXT FROM csr
INTO @in
END

CLOSE csr
DEALLOCATE csr

SELECT *
FROM #newIn

DROP TABLE #newIn


This converts the single column into multiple rows that you can then extract back as you want.

If not exactly what you need, then it's a starting point for other variations.|||

I created a schema.ini file with all the format specifications for my source file. Here is the code for that. However, when I import, most of the data is imported fine, except for when the value has "" in the data. E.g., if the data has a value such as 'pending "store"', it imports the data as only "pending" and omits the store.

Here is my specificaiton of the schema file

[vwConstrLendingmapLoanDefiCdDesc.txt]
ColNameHeader=False
Format=TabDelimited
MaxScanRows=0
CharacterSet=ANSI
Col1=Doccd Char Width 50
Col2=Docdesc LongChar
Col3=Docshcd Char Width 100

- And this is the openrowset:

select * from OpenRowset('MSDASQL', 'Driver={Microsoft Text Driver (*.txt; *.csv)};
DefaultDir=\\c:\SQLRSReports;','select * from vwConstrLendingmapLoanDefiCdDesc.txt')

Please advise.
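
One hedged alternative, if this is SQL Server 2005 or later: OPENROWSET in
BULK mode with a bcp format file keeps the three columns separate and, unlike
the Text driver, does not treat embedded quotes as text qualifiers. The
format-file name and contents are hypothetical.

-- loancodes.fmt would describe the three tab-delimited columns the same
-- way a bcp format file does for BULK INSERT.
SELECT t.*
FROM OPENROWSET(BULK 'C:\SQLRSReports\vwConstrLendingmapLoanDefiCdDesc.txt',
    FORMATFILE = 'C:\SQLRSReports\loancodes.fmt') AS t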