
Wednesday, March 28, 2012

Importing a txt file into multiple tables in SQL 2000

Hi There,

I am looking for information on how to import a txt or csv file into
multiple tables in SQL 2000. If you have any kind of information, please
let me know whether we can do this and how.

Below is the detailed information.

I receive a txt file every day which contains information for 3
different related tables in my SQL 2000 database. Right now we are
keying in the information from the web site (which is linked to the txt
file) into our database, but I am wondering if we can import those
records into the tables.

The header of the file goes to table1, and when we insert the record in
table1 it should generate an auto-identity value (PK); that PK links to
table2 and table3, where the rest of the information from the txt
file goes. For table2 and table3 there are multiple records per txt
file.

In our txt file each row starts with a row header, like HTC100
with row no. 1,2,3..., which indicates which table the information goes
to, and 1,2,... are the different row numbers.

Please let me know whether we can achieve this task or not.

Thanks for all your help in advance.

Indra.

I have pasted my txt file below:
========

"FHS000",20041029,0900,,"10",1,"TRAILB10_20041029_1B",3,"2.20","Pason-DataHub",,"#
Well 1098831406 Tour 2004/10/29 Trailblazer 10 148",1,"EDR_3-0-10_HF2
ETS 2.2"
"CON000",1,0000,0759
"CON000",2,0800,1559
"CON000",3,1600,2359
"HWI010","0312857","COMPTON BRANT 15-7-18-24","COMPTON PETROLEUM
CORP.","TRAILBLAZER DRILLING
CORP.","15-07-018-24W4","100/15-07-018-24W4/00","HANK
PARANYCH","CURTIS FIESEL",20041029,,,"10",20041027,0600,,,"148","DD04485","VERT.","NO",,
"HCO030",1,"Daily Walk Around Inspection","HP","CF"
"HCO030",2,"Detailed Inspection - Weekly (using checklist)","HP","CF"
"HCO030",3,"H2S Signs Posted (if required)",,
"HCO030",4,"Well License & Stick Diagram Posted","HP","CF"
"HCO030",5,"Flare Lines Staked","HP","CF"
"HCO030",6,"BOP Drills Performed","HP","CF"
"HCO030",7,"Visually Inspect BOP's - Flarelines and Degasser
Lines","HP","CF"
"HDC040",1,"Rig Site Health and Safety Meeting (one/crew/month)","CF"
"HDC040",2,"C.A.O.D.C. Rig Safety Inspection Checklist
(one/rig/month)","CF"
"HDC040",3,"Mast Inspection Before Raising or Lowering","CF"
"HDC040",4,"Crown Saver Checked","CF"
"HDC040",5,"Motor Kills Checked","CF"
"HFU050",2300,2100,,
"HWE060",-5,"Deg C","COOL","WEST","SLIPPERY",,
"HCS070",1,177.8,,"mm",25.3,"STELCO","J-55",8,108.44,3.84,108.44,
"HCS070",2,114.3,,"mm",14.14,"STELCO","J-55",72,979.50,3.84,979.0,
"HDP080",1,127,79.4,"kg/m","E",57,127,"mm","3 1/2 IF",10,"DC",
"HDP080",2,89,19.7,"kg/m","E",68,120,"mm","3 1/2 IF",15,"DP",
"HPU090",1,"F-800","EMSCO",254,"mm",,,,
"HPU090",2,"F-800","EMSCO",254,"mm",,,,
"HTC100",1,"Rig up and tear down"
"HTC100",2,"Drill Actual"
"HTC100",3,"Reaming"
"HTC100",4,"Coring"
"HTC100",5,"Condition Mud & Circulate"
"HTC100",6,"Trips"
"HTC100",7,"Rig Service"
"HTC100",8,"Repair Rig"
"HTC100",9,"Cut off drilling line"
"HTC100",10,"Deviation Survey"
"HTC100",11,"Wire Line Logs"
"HTC100",12,"Run Case & Cement"
"HTC100",13,"Wait on Cement"
"HTC100",14,"Nipple up B.O.P."
"HTC100",15,"Test B.O.P."
"HTC100",16,"Drill Stem Test"
"HTC100",17,"Plug Back"
"HTC100",18,"Squeeze Cement"
"HTC100",19,"Fishing"
"HTC100",20,"Directional Work"
"HTC100",21,"Safety Meeting"
"HTC100",24,"WOD"
"HSS110",1,1,"SWACO","N","110",,"84",,
"HPA130","COMPTON BRANT 15-7-18-24",20041029,"COMPTON PETROLEUM
CORP.","TRAILBLAZER DRILLING CORP.","CURTIS
FIESEL","10","ALBERTA","N",253
"TCP130",1,,,,"kPa",140,,,,"mm",,
"TCP130",2,,,,"kPa",140,,,,"mm",,
"TCP130",3,,,,"kPa",140,,,,"mm",,
"TTL160",1,1,0.00,0.25,0.25,21,"SAFETY MEETING WITH TONG HAND"
"TTL160",1,2,0.25,1.75,1.50,12,"RIG TO AND RUN CASING"
"TTL160",1,3,1.75,2.00,0.25,7,"RIG SERVICE"
"TTL160",1,4,2.00,2.50,0.50,5,"CONDITION MUD & CIRC."
"TTL160",1,5,2.50,2.75,0.25,21,"SAFETY MEETING WITH CEMENTERS"
"TTL160",1,6,2.75,3.50,0.75,12,"RIG TO AND CEMENT CASING"
"TTL160",1,7,3.50,6.00,2.50,1,"SET SLIPS, TEAR OUT RIG, CLEAN TANKS"
"TTL160",1,8,6.00,8.00,2.00,24,"WAIT ON DAYLIGHT/TRUCKS"
"TTL160",1,9,,,,,"CEMENT WITH BJ USING 13 TONNES OF BVF-1500 NP + .7%
FL-5,GIVING 15.5 m3 OF GOOD"
"TTL160",1,10,,,,,"SLURRY @. 1718 kg/m3,PLUG BUMPED & HELD @. 03:30 HRS
OCT 29/04."
"TTL160",1,11,,,,,"RIG RELEASED @. 08:00 HRS OCT 29/04"
"TTL160",1,12,,,,,"MOVE TO 12-3-18-25W4"
"TDI170",1,"JEFF CASE",8,10,475,"Deg C",,,"RUNNING CASING",,,,,
"TLN175",1,"VISUALLY INSPECT PINS, RAMS AND STOOLS PRIOR TO LAYING
OVER DERRICK"
"TPA180",1,1,"DRILLER",647172865,"JEFF CASE",8,,,"JC"
"TPA180",1,2,"DERRICK HAND",648519056,"BRYAN VANHAM",8,,,"BV"
"TPA180",1,3,"MOTOR HAND",651056533,"NEIL WILLIAMS",8,,,"NW"
"TPA180",1,4,"FLOOR HAND",640352662,"TARAS WOITAS",8,,,"TW"
"TPI190",1,"REG",25,,,,,,
"TPI190",2,"REG",25,,,,,,
"TPI190",3,"REG",25,,,,,,
========

[posted and mailed, please reply in news]

Indra (itimilsina@.savannaenergy.com) writes:
> I am looking for information on how to import the txt or csv file to
> the multiple table in sql 2000. If you have any kind of inf. please
> let me know wheather we can do this and how.
> below is the detail information.
> I received txt file every day which contain the information from 3
> different related table in my sql 2000 database. Right now we are
> keyin the information from the web site (which is link to the txt
> file) to our database, but i am wondering if we can import those
> record in the tables.
> the header of the file goes to table1 and when we insert the record in
> table1, it should generate the autoidentityrecord (PK), and that PK is
> link to other table2 and table3 where rest of the information from txt
> file goes. For table2 and table3 there are multiple record per txt
> files.
> in our txt file each row is separated with row header, like HTC100
> WITH ROW NO. 1,2,3.., which indecate this information goes to table
> and 1,2...are the different row number.
> Please let me know whether we can achive this task or not.

Of course, it is possible. However, as far as I can see only by means of
writing a program that reads and parses the file. The standard tools for
loading files into SQL Server are BCP and DTS. BCP cannot cope with your
file, because there is a mix of record formats. BCP can only import files
with uniform records.

DTS is more versatile than BCP, but as far as I can see, you will still
have to write code to have DTS import the file. I need to add the
disclaimer that I have zero experience of DTS.
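As a rough illustration of the parsing program described above (this is a hedged sketch, not the poster's actual schema: the routing of record codes to tables and the `split_records` helper name are assumptions based on the sample file, and the database inserts themselves are left out):

```python
import csv
import io

# Assumed routing: "HWI010" carries the header record (table1); the
# other codes carry child rows for table2/table3. Extend as needed.
HEADER_CODE = "HWI010"
CHILD_CODES = {"HCO030", "HTC100", "TTL160"}

def split_records(text):
    """Group rows of the mixed-format file by their record-type code."""
    header = None
    children = {}
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        code, rest = row[0], row[1:]
        if code == HEADER_CODE:
            header = rest
        elif code in CHILD_CODES:
            children.setdefault(code, []).append(rest)
    return header, children

sample = ('"HWI010","0312857","COMPTON BRANT 15-7-18-24"\n'
          '"HTC100",1,"Rig up and tear down"\n'
          '"HTC100",2,"Drill Actual"\n')
header, children = split_records(sample)
```

In the real job you would insert `header` into table1, read the generated identity (e.g. via SCOPE_IDENTITY()), and pass it along when inserting each child row.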

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||I agree with Erland. Could it be done with BCP and TSQL? Yes. Would it be
an efficient, maintainable, error-resilient solution? No.

It would be far better to write a piece of code, or better, get the sender to
supply X number of files, one for each table. The original creator of the
file obviously has control over the file format.

Danny

"Erland Sommarskog" <esquel@.sommarskog.se> wrote in message
news:Xns95998D7498D77Yazorman@.127.0.0.1...
> [posted and mailed, please reply in news]
> Indra (itimilsina@.savannaenergy.com) writes:
>> I am looking for information on how to import the txt or csv file to
>> the multiple table in sql 2000. If you have any kind of inf. please
>> let me know wheather we can do this and how.
>>
>> below is the detail information.
>>
>> I received txt file every day which contain the information from 3
>> different related table in my sql 2000 database. Right now we are
>> keyin the information from the web site (which is link to the txt
>> file) to our database, but i am wondering if we can import those
>> record in the tables.
>>
>> the header of the file goes to table1 and when we insert the record in
>> table1, it should generate the autoidentityrecord (PK), and that PK is
>> link to other table2 and table3 where rest of the information from txt
>> file goes. For table2 and table3 there are multiple record per txt
>> files.
>>
>> in our txt file each row is separated with row header, like HTC100
>> WITH ROW NO. 1,2,3.., which indecate this information goes to table
>> and 1,2...are the different row number.
>>
>> Please let me know whether we can achive this task or not.
> Of course, it is possible. However, as far as I can see only by means of
> writing a program that reads and parses the file. The standard tools for
> loading files into SQL Server are BCP and DTS. BCP cannot cope with your
> file, because there is a mix of record formats. BCP can only import files
> with uniform records.
> DTS is more versatile than BCP, but I as far as can see, you will still
> have to write code to have DTS to import the file. I need to add the
> disclaimer that I have zero experience of DTS.
> --
> Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
> Books Online for SQL Server SP3 at
> http://www.microsoft.com/sql/techin.../2000/books.asp|||Danny (istdrs@.flash.net) writes:
> I agree with Erland. Could it be done with BCP and TSQL? Yes.

No. :-) Since there is a variable number of fields on each row, BCP would
be lost.

> It would be far better to write a piece of code or better get the sender
> to supply X number of files, one for each table. The original creator
> of the file obviously has control over the file format.

Yes, fixing the file format may very well be a good idea.

Interestingly enough, it could still be one file, if the shorter rows were
padded with extra fields. Note that just adding ,,,,, would not be enough,
since the " too are delimiters as far as BCP is concerned, so the usage of
" must be consistent from record to record. (Which does not seem to be the
case in the current file.)
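That padding idea can be sketched with Python's csv module, re-emitting every record with a uniform field count and quoting every field so the format is consistent from record to record (the width of 5 fits only these sample rows; the helper name is made up):

```python
import csv
import io

def pad_rows(text, width):
    """Re-emit mixed-width CSV so every record has `width` fields,
    quoting all fields for a uniform record format."""
    out = io.StringIO()
    writer = csv.writer(out, quoting=csv.QUOTE_ALL, lineterminator="\n")
    for row in csv.reader(io.StringIO(text)):
        if row:
            writer.writerow(row + [""] * (width - len(row)))
    return out.getvalue()

padded = pad_rows('"CON000",1,0000,0759\n"HTC100",1,"Rig up"\n', 5)
```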
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Hi Erland,

I am still trying to solve this problem of importing the txt file and
manipulating the data to insert into different tables in SQL 2000. I have
a problem inserting an nvarchar value coming from the txt file into a
datetime field in a SQL table.

If you look through my txt file there are rows starting with TTL160, and
col4 and col5 hold data like 1.00 and 12.23 etc. (which is nvarchar). I
need to insert these values as 1:00 or 12:23 into a datetime field in
another table. I could use REPLACE(column, '.', ':') to turn 12.23 into
12:23, but when I insert this into the SQL datetime field it gives me the
error "Arithmetic overflow error converting expression to data type
datetime". I tried the Cast function, still the same error. Could
Microsoft have a look please?

Thanks for help.

Indra.
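One way to chase this down is to do the conversion client-side under the stated mapping (hours '.' minutes). This is a hypothetical Python sketch, not the poster's code:

```python
from datetime import time

def tour_time(value):
    """Interpret a field like '1.00' or '12.23' as hours '.' minutes.
    Blank fields become None; values that are not valid clock times
    raise ValueError instead of failing later inside the database."""
    if not value:
        return None
    hours, _, minutes = value.partition(".")
    return time(int(hours), int(minutes or 0))

assert tour_time("12.23") == time(12, 23)
```

Note that the TTL160 rows in the pasted file also contain values like 2.75 and 3.50, which are not valid clock times under this reading (they look like decimal hours); converting client-side surfaces those rows explicitly rather than as an opaque SQL Server conversion error.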

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!

Importing to SQL Server from a flat file: PK problem

I am trying to import into SQL Server from a .csv flat file I made in Excel. The problem is the primary key. It is set to auto-increment.
When I leave an empty column I get:
"Directcopyxform conversion error: destination does not allow null"

When I just omit the column entirely, I get a wrong-datatype error because it basically tries to copy all the columns shifted one to the left.

How am I supposed to represent an auto-increment PK in a .csv file? Thanks

Use a staging table, where you can load the information, perform any processing, then insert into your table, finally cleaning out the staging table at the end.
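The staging-table pattern can be sketched like this; SQLite stands in for SQL Server here, and the table and column names are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Destination table: the PK auto-increments, so the file must not supply it.
cur.execute("CREATE TABLE target (id INTEGER PRIMARY KEY AUTOINCREMENT,"
            " name TEXT, qty INTEGER)")
# Staging mirrors the CSV exactly -- no identity column, no constraints.
cur.execute("CREATE TABLE staging (name TEXT, qty INTEGER)")

rows = [("widget", 3), ("gadget", 5)]   # stands in for the bulk load
cur.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# Copy across, naming only the non-identity columns so the PK is generated.
cur.execute("INSERT INTO target (name, qty) SELECT name, qty FROM staging")
cur.execute("DELETE FROM staging")      # finally, clean out staging
conn.commit()
```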

Monday, March 26, 2012

Importing Selected csv data into SQL tables

Hi,

I hope you can help me.

I've got a csv that I need to import. It's about 74 columns wide and 2500 rows long. What I need to do is take the first 3 columns from the csv and put them into one table, and then put every 7 columns (plus the 1st column as the primary key) into a different table.

I've not explained that very well, basically it's like this...

A B C D E F G H I J K L M N O P
1
2
3
4
5

Table 1 Data

cs# A1
fe B1
cl C1

Table 2

cs# A1
desc D1
date E1
pay F1
amount G1
vat H1
total I1
cReq J1

I'm not really sure where to start. I want to import the data nightly into the tables. I've tried to use the import wizard but I can't seem to select the data I want and exclude the data I don't.

Steve

Which version of SQL are you using? If it's 2005 you should be able to do this neatly using Integration Services (SSIS).

For 2000 you might be best off dumping the whole thing into a holding table and then manipulating the data from there. This could be handled by several steps of a SQL Job.

HTH!
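The holding-table manipulation step amounts to slicing each row. A hedged sketch in Python (column counts are illustrative and the helper name is hypothetical; the first 3 columns feed table 1, then each 7-column group, prefixed with column A as the key, feeds table 2):

```python
def split_row(row, key_width=3, group_width=7):
    """First `key_width` columns go to table 1; each subsequent
    `group_width`-column slice, prefixed with column A as the key,
    becomes a row for table 2. Fully empty groups are skipped."""
    table1 = row[:key_width]
    key = row[0]
    table2 = []
    for start in range(key_width, len(row), group_width):
        chunk = row[start:start + group_width]
        if any(chunk):
            table2.append([key] + chunk)
    return table1, table2

row = ["cs1", "fe1", "cl1",
       "desc", "date", "pay", "amt", "vat", "tot", "req"]
t1, t2 = split_row(row)
```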

|||Thanks for the reply,

Yeah I'm using 2005 sp2. Where can I find the integration services? I can't seem to find it.

Steve|||You need to install SSIS (Integration Services) at the time of installing SQL Server 2005 (you need to choose the services you want to install). Go to Start > All Programs > Microsoft SQL Server 2005 > Configuration Tools > SQL Server Configuration Manager; under that you can see the list of SQL 2005 services. Check whether SSIS is present, else install it. After installing, connect in SSMS and choose Integration Services instead of Database Engine to connect to SSIS.

Thanxx
|||

Follow Deepak's instructions to get it installed if you haven't already. Then you can open up the Business Intelligence Studio (a Visual Studio skin) and select an Integration Services project. This will give you a graphical interface in which you can drag and drop different data sources (e.g. Excel & SQL Server) and data flow methods (e.g. copy column and export column).

Take a look at the Books Online walkthroughs that will give you a flavour of the different tasks you can achieve and hopefully that will give you a good steer on what to do.

Good luck!

|||Thanks very much.

I feel more at home in Visual Studio!

I'll probably be back when I get stuck again.

Steve

Monday, March 19, 2012

Importing Error

Hi all,

I am facing a problem importing a csv data file into a table.


For Ex:


The csv file, which has 2 columns & 4 records, is as follows:

EmpNo EmpName

1 a
2 b
3 c
1 a

I created a new table called T4; the structure of T4 is

EmpNo Varchar(50) with Primary Key
EmpName Varchar(50) with Allow Nulls disabled

Now I started importing the csv data into the T4 table; during the import it couldn't finish the entire process and threw an error. When I remove the primary key and enable Allow Nulls for both columns of table T4, the import is successful.
My question is: if I have the above structure for table T4 (with the primary key), can the import load the first 3 rows and only then throw the error for the row the primary key rejects, instead of failing outright? Is this possible? If so, please suggest how to solve the problem. Can I use Bulk Insert or SQL scripts to solve this?

Thanks in advance
Karna

I believe that I understand your current situation, but I do not think I understand what it is you are trying to do.

If you're trying to insert the three "good rows" and "fail" the one "bad" row without failing the package as a whole, you may be able to accomplish this by adding an error output to the destination to which you're writing these records. Then the "bad" records will be redirected to that portion of the data flow, where you can count them, log them, or ignore them if you want.

Is this what you're trying to accomplish? If so, please let us know if this technique gives you the solution you need. If not, please see if you can rephrase your goal, and I can try again.

|||Hi Mathew,

Thanks for your reply. I am executing the wizard in SQL Server 2005. The actual problem is: "I am preparing a job which imports a CSV data file into the table. The table already has a primary key and Allow Nulls disabled. The CSV file has duplicate records and empty rows, so the import job does not finish successfully; it throws an error." My question is: if the duplicate data lies in the 1000th record, let the job succeed for records up to 999, then show the error for record 1000 and log it.
Please suggest how to make the job succeed.

Thanks in advance
Karna|||

I'm honestly not certain how I would go about doing that. For most situations where I'm using SSIS, the behavior you're trying to achieve would be inappropriate for my needs (I would either need the entire load to fail with no rows written to the destination, or else have error rows redirected to a log file or table) so I've never tried to set things up this way. I'm also pretty unfamiliar with the Input/Output Wizard, so that's two strikes against me.

With that said, the first thing I would try would be to update the destination component in the package (I do not know if this is exposed through the wizard; I assume it is not) to turn off fast load. This will slow things down quite a bit, as each row will be inserted into the destination database one by one, but it might give you what you need.

Perhaps someone else has more experience with this problem than I...

|||Hi Mathew,

Actually I have to run a job which imports a csv data file into a table. The CSV has duplicate records. My job should not import duplicate records; it should append only new records. I set a primary key and disabled Allow Nulls. After this, when I run the job, it does not even append the non-duplicated records; it throws an error and does not import any data from the csv file.
Please suggest how to solve this issue.

Thanks in advance
Karna|||

Karna,

I think your best option is to use the error output in the destination component to redirect the errors. If you don't change the default value of the error configuration, the package will fail as soon as the first duplicate hits the destination table. This is nothing different from what Matthew has already suggested.

|||Hi,

Thanks for the reply. My problem is I need to import a CSV data file into one of the tables in the database. The import shouldn't load null rows or duplicate rows into the table. I need a method that checks the data in the table before importing: if the data already exists it shouldn't be imported, else it should. Please suggest how to solve this problem.

Thanks in advance
Karna|||

See this link for a number of different ways to see if a row exists:

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1211340&SiteID=1
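The row-exists check behind that link can be sketched with a NOT EXISTS guard. SQLite stands in for SQL Server here, but the same T-SQL shape applies when copying from a staging table (table and column names are assumptions taken from the sample data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE emp (empno TEXT PRIMARY KEY, empname TEXT)")

incoming = [("1", "a"), ("2", "b"), ("3", "c"), ("1", "a")]  # duplicate "1"
for empno, empname in incoming:
    # Append only rows whose key is not already present; duplicates are
    # silently skipped instead of aborting the whole load.
    cur.execute(
        "INSERT INTO emp SELECT ?, ? WHERE NOT EXISTS "
        "(SELECT 1 FROM emp WHERE empno = ?)",
        (empno, empname, empno),
    )
conn.commit()
```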

Monday, March 12, 2012

Importing Data Throught ASP

Hello,
I've been trying to work with code to import data from a "flat file" (.csv)
into a SQL 2000 table.
I'm not that well versed in SQL and was wondering is it was possible using
OPENROWSET or something similar?
I've tried using OPENROWSET and encountered the error "Ad hoc access to OLE
DB provider 'MSDASQL' has been denied. You must access this provider through
a linked server."
Any help or resources to explain the process would be fantastic.
-Steve

If you're trying to upload through an ASP page you don't want to use
OPENROWSET. That's really for accessing OLE DB data sources. For an example
of uploading a file via ASP check out:
http://msdn.microsoft.com/library/d...tml/asp0900.asp
Mike O.
"Anubis" <anubis2003@.bluestreem.com.au> wrote in message
news:ucZJcoO1DHA.208@.TK2MSFTNGP12.phx.gbl...
quote:

> Hello,
> I've been trying to work with code to import data from a "flat file"

(.csv)
quote:

> into a SQL 2000 table.
> I'm not that well versed in SQL and was wondering is it was possible using
> OPENROWSET or something similar?
> I've tried using OPENROWSET and encountered the error "Ad hoc access to

OLE
quote:

> DB provider 'MSDASQL' has been denied. You must access this provider

through
quote:

> a linked server."
> Any help or resources to explain the process would be fantastic.
> -Steve
>
|||Hi Mike,
No, actually I wanted to upload data from a flat file (.csv) into a SQL
Database like to following
Flat File (x.csv) contains:
Col1, Col2, Col3
Value 0, Value1, Value 2,
Value 0, Value1, Value 2
Then do a command something like the following:
INSERT INTO dbo.table
SELECT Col1, Col2, Col3
FROM x.csv
Basically I want to open a rowset to insert into a SQL table but I cannot
open the rowset from the .csv source through an ASP command.
Any idea's?
Thanks
-Steve
"Michael Otey" <mikeo@.teca.com> wrote in message
news:eEOx90a1DHA.2336@.TK2MSFTNGP09.phx.gbl...
> If you're trying to upload through an ASP page you don't want to use
> OPENROWSET. That's really for accessing OLE DB data sources. For an example
> of uploading a file via ASP check out:
> http://msdn.microsoft.com/library/d...tml/asp0900.asp
> Mike O.
|||If you're running this on the SQL Server system and just want to import the
text file, then probably the simplest thing is to use the BULK INSERT
statement, something like the following:
BULK INSERT mydb..mytable FROM 'c:\mycsvfile.csv'
WITH (DATAFILETYPE = 'char',FIELDTERMINATOR = ',',ROWTERMINATOR = '\n')
Mike O.
"Anubis" <anubis2003@.bluestreem.com.au> wrote in message
news:%23eXL8Kb1DHA.2396@.TK2MSFTNGP09.phx.gbl...
> Hi Mike,
> No, actually I wanted to upload data from a flat file (.csv) into a SQL
> Database like the following
> Flat File (x.csv) contains:
> Col1, Col2, Col3
> Value 0, Value1, Value 2,
> Value 0, Value1, Value 2
> Then do a command something like the following:
> INSERT INTO dbo.table
> SELECT Col1, Col2, Col3
> FROM x.csv
> Basically I want to open a rowset to insert into a SQL table but I cannot
> open the rowset from the .csv source through an ASP command.
> Any idea's?
> Thanks
> -Steve
> "Michael Otey" <mikeo@.teca.com> wrote in message
> news:eEOx90a1DHA.2336@.TK2MSFTNGP09.phx.gbl...

Wednesday, March 7, 2012

Importing data & inserting value

I've got a bunch of CSV files (21) which all contain the same fields but are only distinguishable by their file names, i.e. file.001, file.002, etc. I need to consolidate them all into one table on our data warehouse. I'm currently doing the following:

1. Drop & recreate a temp table with an additional column as a default value based on the corresponding file extension (manually entered into the DTS routine).
2. Import a single file into a temp table.
3. Transfer the temp table data to the permanent table with Keep Null Values ticked.

I've only got as far as importing 2 of the files (which works!) but it is getting messy already, with 10 steps so far!

There must be an easier way to add a value to each row as a file is being imported, rather than going through all these processes!

Can anyone help?

Thanks in advance!

You can use the bcp import utility with a format file. See BOL for more info on format files.

Friday, February 24, 2012

Importing CVS to SQLServer 2005 in a web application

Importing CVS to SQLServer 2005 in a web application:

I need to import CSV file to a table. This is a .Net 2.0 web application running in the WebServer which has .Net 2.0 installed. SQL Server 2005 is a database in another machine. What are my options? Would SSIS work in WebServer if no database installed in that machine.

Thanks

>>.Importing CVS to SQLServer 2005 in a web application: << Do you mean CSV?

>>Would SSIS work in WebServer if no database installed in that machine.<< Yes, if the SQL client and SQL Management Studio are installed on the web server and you can log on to the web server.

>>What are my options? << You haven't specified if this is a one-off import or a regular job. One way of handling it would be to convert the csv to either a series of inserts into the table or calls to a stored procedure to do the job.
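The convert-to-inserts approach might look like the following Python sketch (the table and column names are hypothetical, and real code should send parameterized commands rather than concatenating SQL; this only illustrates the transformation):

```python
import csv
import io

def csv_to_inserts(text, table, columns):
    """Turn each CSV row into an INSERT statement string, doubling
    single quotes so embedded quotes do not break the statement."""
    stmts = []
    col_list = ", ".join(columns)
    for row in csv.reader(io.StringIO(text)):
        if not row:
            continue
        values = ", ".join("'" + v.replace("'", "''") + "'" for v in row)
        stmts.append(f"INSERT INTO {table} ({col_list}) VALUES ({values})")
    return stmts

stmts = csv_to_inserts("1,a\n2,b\n", "tbl_CSV", ["id", "name"])
```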

Importing csv to sql database

Hi All,

Here is my code
Dim strCon As String = System.Configuration.ConfigurationSettings.AppSettings("OwnerTrader")
con = New SqlConnection(strCon)
con.Open()
Dim strselect As String = ""
Try
    strselect = "INSERT INTO tbl_CSV(CSV_TIME,CSV_SIZE,CSV_LOCATION,CSV_COUNTRY,CSV_LAT,CSV_LON,CSV_COMMENTS)"
    strselect = strselect & " SELECT * FROM OPENROWSET('MSDASQL','Driver={Microsoft Text Driver (*.txt; *.csv)};DEFAULTDIR=C:\UploadFiles;Extensions=CSV;','SELECT * FROM TestNew.csv')"
    cmd = New SqlCommand(strselect, con)
    cmd.ExecuteNonQuery()
Finally
    con.Close()
End Try

I have defined the connection string in web.config and my csv file is inside C:\UploadFiles with the name TestNew.csv. Can you please check the code? Is it correct or not? I am getting this error:

"SQL Server blocked access to STATEMENT 'OpenRowset/OpenDatasource' of component 'Ad Hoc Distributed Queries' because this component is turned off as part of the security configuration for this server. A system administrator can enable the use of 'Ad Hoc Distributed Queries' by using sp_configure. For more information about enabling 'Ad Hoc Distributed Queries', see "Surface Area Configuration" in SQL Server Books Online."

Regards

I've moved this to the SQL forum because this is not a FAQ. And the message you got isn't an error. It's simply a statement of what happened, why and how to fix it. Follow the instructions and you'll get past this message.

Jeff

Importing CSV files with SqlBulkCopy

I am trying to import a CSV file into an SQL Server table with the OleDbDataReader and SqlBulkCopy objects, like this:

using (OleDbConnection dconn = new OleDbConnection("Provider=Microsoft.Jet.OLEDB.4.0;Data Source=c:\\mystuff\\;Extended Properties=\"text;HDR=No;FMT=Delimited\""))
{
    using (OleDbCommand dcmd = new OleDbCommand("select * from mytable.csv", dconn))
    {
        try
        {
            dconn.Open();
            using (OleDbDataReader dreader = dcmd.ExecuteReader())
            {
                try
                {
                    using (SqlConnection dconn2 = new SqlConnection(@"data source=MyDBServer;initial catalog=MyDB;user id=mydbid;password=mydbpwd"))
                    {
                        using (SqlBulkCopy bc = new SqlBulkCopy(dconn2))
                        {
                            try
                            {
                                dconn2.Open();
                                bc.DestinationTableName = "dbo.mytable";
                                bc.WriteToServer(dreader);
                            }
                            finally
                            {
                                dconn2.Close();
                            }
                        }
                    }
                }
                finally
                {
                    dreader.Close();
                }
            }
        }
        finally
        {
            dconn.Close();
        }
    }
}

A couple of the columns in the destination table use a bit datatype. The CSV file uses the strings "1" and "0" to represent these.

When I run this code, it throws this exception:

Unhandled Exception: System.InvalidOperationException: The given value of type String from the data source cannot be converted to type bit of the specified target column. ---> System.FormatException: Failed to convert parameter value from a String to a Boolean. ---> System.FormatException: String was not recognized as a valid Boolean.
   at System.Boolean.Parse(String value)
   at System.String.System.IConvertible.ToBoolean(IFormatProvider provider)
   at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlParameter.CoerceValue(Object value, MetaType destinationType)
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   --- End of inner exception stack trace ---
   at System.Data.SqlClient.SqlBulkCopy.ConvertValue(Object value, _SqlMetaData metadata)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(IDataReader reader)
   at MyClass.Main()

It appears not to accept "1" and "0" as valid strings to convert to booleans. The System.Convert.ToBoolean method appears to work the same way.

Is there any way to change this behavior? I discovered if you change the "1" to "true" and "0" to "false" in the CSV file it will accept them.
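Building on that last observation, one workaround is to rewrite the bit columns before the bulk copy. A hypothetical Python sketch (the column indexes are assumptions, not from the original file):

```python
import csv
import io

BOOL_COLUMNS = {2, 3}   # hypothetical indexes of the bit columns

def fix_booleans(text):
    """Rewrite '1'/'0' in the bit columns as 'true'/'false', which the
    string-to-Boolean conversion does accept."""
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in csv.reader(io.StringIO(text)):
        fixed = []
        for i, v in enumerate(row):
            if i in BOOL_COLUMNS and v in ("0", "1"):
                fixed.append("true" if v == "1" else "false")
            else:
                fixed.append(v)
        writer.writerow(fixed)
    return out.getvalue()

fixed_csv = fix_booleans("x,y,1,0\n")
```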

Hello my friend,

I believe this is due to the way that Microsoft Access represents Booleans as numbers: -1 is Yes (True) and 0 is No (False). Try it with -1 and 0 instead of 1 and 0.

Kind regards

Scotty

|||I created a schema.ini file for the table and specified bit columns for it. Then it converted without any problems.
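The schema.ini fix works because the Jet text driver then reads those columns as Booleans rather than strings. A sketch of such a file, placed in the same folder as the csv (the file and column names here are hypothetical; only the Bit type declarations matter):

```ini
[mytable.csv]
Format=CSVDelimited
ColNameHeader=False
Col1=ProductId Long
Col2=ProductName Text
Col3=IsActive Bit
```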

Importing CSV files

Hello,

I need to import a bunch of .csv files. The problem I am having is the
"non-data" information in the files creating bogus rows and column
definitions. Here is an example of the csv file.

CBOT - End-of-Day Futures Bulk Download 2001.
2 Year U.S. Treasury Notes Futures

Date      Symbol  Month Code  Year Code  Open
20010103  ZT      H           2001       102.09375
20010104  ZT      H           2001       102.03125
20010105  ZT      H           2001       102.28125

In this case, bogus rows are created by the text at the
beginning of the file, and the column names also get placed into a row.
My question is: how do you import the file and strip out the
"non-data" data, so that only the actual data gets inserted into the db?

Thanks,

TGru

Write a dotnet or vb exe to loop thru your csv file and place valid
rows in a separate file, then run bcp using this new file.|||Use BCP or BULK INSERT with the FIRSTROW (-F) option.

--
David Portas
SQL Server MVP
--|||tgru (tgru@.devdex.com) writes:
> I need to import a bunch of .csv files. The problem I am having is the
> "non data" information in the files creating bogus rows and column
> definitions. Here is an example of the csv file.
> CBOT - End-of-Day Futures Bulk Download 2001.
> 2 Year U.S. Treasury Notes Futures
> Date Symbol Month Code Year Code Open
> 20010103 ZT H 2001 102.09375
> 20010104 ZT H 2001 102.03125
> 20010105 ZT H 2001 102.28125

That doesn't look like a CSV file to me...

> In this case, there are bogues rows created with the text at the
> beginning of the file, and also the column names get placed into a row
> as well. My question is; how do you import the file and strip out the
> "non-data" data? so that only the actual data gets inserted into the db?

If it were only the column headers, you could use -F 2 with BCP to specify
that BCP is to start with the second record. (I believe -F is the
option. Double-check with Books Online.) But the introductory text is
more difficult to handle. Assuming that there are no tabs in the text,
BCP may consider the row an error. Then again, you can control how
many errors BCP may accept, so if you can make sure that you get
in sync, it may be possible.

However, hharry's suggestion that you write a program that strips the
header info, is probably a safer way.
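A minimal sketch of such a stripping program, under the assumption (true for this sample) that data rows are the only lines beginning with four digits:

```python
def strip_preamble(lines):
    """Keep only data rows. In this sample the data rows are the only
    lines that start with four digits (the year of the date column);
    adapt the test to the real file's layout."""
    return [ln for ln in lines if ln[:4].isdigit()]

raw = [
    "CBOT - End-of-Day Futures Bulk Download 2001.",
    "2 Year U.S. Treasury Notes Futures",
    "",
    "Date\tSymbol\tMonth Code\tYear Code\tOpen",
    "20010103\tZT\tH\t2001\t102.09375",
    "20010104\tZT\tH\t2001\t102.03125",
]
data = strip_preamble(raw)
```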

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||"tgru" <tgru@.devdex.com> wrote in message news:421f97a6$1_2@.127.0.0.1...
> Hello,
> I need to import a bunch of .csv files. The problem I am having is the
> "non data" information in the files creating bogus rows and column
> definitions. Here is an example of the csv file.
> CBOT - End-of-Day Futures Bulk Download 2001.
> 2 Year U.S. Treasury Notes Futures
> Date Symbol Month Code Year Code Open
> 20010103 ZT H 2001 102.09375
> 20010104 ZT H 2001 102.03125
> 20010105 ZT H 2001 102.28125
>
> In this case, there are bogues rows created with the text at the
> beginning of the file, and also the column names get placed into a row
> as well. My question is; how do you import the file and strip out the
> "non-data" data? so that only the actual data gets inserted into the db?
> Thanks,
> TGru

Take a look at AWK (I use the GNU version GAWK).

This is an old text-stream (file) processing utility that uses regular
expressions to let you filter and/or reformat data.

You could have it create a new import file that ignores blank lines and
header rows (i.e. lines that start with "CBOT").

Importing CSV file with URL using DTS?

I need to be able to create a DTS package that imports a CSV file which is located at a URL, i.e. HTTP://www.url.com/csv/thefile.xls. I tried copying the URL and pasting it in the file location in the SQL wizard, but I got an error message.

Can anyone point me to somewhere I can find help in doing this? If there is a better way, please let me know.

Thanks

I would post this in the SQL Server Integration Services forum instead of this one. Also, to get a quicker response, please detail the exact steps you took to hit the problem, as well as the exact error message you got.

Importing csv file to SQL Server

Can someone please help me.
I need to import a csv file to SQL Server. I know that the column delimiter is \t
and the newline delimiter is \n, but I don't know what to use for ROWTERMINATOR or FIELDTERMINATOR. How can I import the file into an empty table in an existing database?
Any suggestions would be greatly appreciated.

Use DTS to import the file; it can handle it all, including creating a new empty table from the file.
You can do it directly in Enterprise Manager: select the database and use right click > All Tasks > Import Data.

Importing csv File into SqlExpress .MDF file.

Hi,
I'm trying to use BCP to import a .csv file into a .mdf file.
At the commandline, I'm using the following command:
bcp commercedb.mdf.crmc_products in test.csv -T
I'm getting errors (below) telling me that I'm not successfully
connecting to the database.
Any help would be appreciated. If you can help me, please understand
I'm new to this, so limited geek speak would be most appreciated.
ERRORS:
SQLState = 08001, NativeError = 53
Error = [Microsoft][SQL Native Client]Named Pipes Provider: Could not open a connection to SQL Server [53].
SQLState = HYT00, NativeError = 0
Error = [Microsoft][SQL Native Client]Login timeout expired
SQLState = 08001, NativeError = 53
Error = [Microsoft][SQL Native Client]An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.

Data access libraries do not directly access (MDF) SQL Server database files the same way they can an (MDB) MS Access file. All data access must go through the SQL Server service itself.
The .mdf database file must first be attached to SQL Server.
http://msdn2.microsoft.com/en-us/library/ms165673.aspx
Once done, you bulk copy specifying the server and database name:
http://msdn2.microsoft.com/en-us/library/ms162802.aspx
<tom.herz@.gmail.com> wrote in message
news:1138982708.766097.92140@.f14g2000cwb.googlegroups.com...
> Hi,
> I'm trying to use BCP to import a .csv file into a .mdf file.
> At the commandline, I'm using the following command:
> bcp commercedb.mdf.crmc_products in test.csv -T
> I'm getting errors (below) telling me that I'm not successfully
> connecting to the database.
> Any help would be appreciated. If you can help me, please understand
> I'm new to this so limited g speak would be most appreciated.
> ERRORS:
> SQLState = 08001, NativeError = 53
> Error = [Microsoft][SQL Native Client]Named Pipes Provider: Could not
> open a con
> nection to SQL Server [53].
> SQLState = HYT00, NativeError = 0
> Error = [Microsoft][SQL Native Client]Login timeout expired
> SQLState = 08001, NativeError = 53
> Error = [Microsoft][SQL Native Client]An error has occurred while
> establishing a
> connection to the server. When connecting to SQL Server 2005, this
> failure may
> be caused by the fact that under the default settings SQL Server does
> not allow
> remote connections.
>

|||Hi,
Thanks. This is what I needed. I hadn't attached the database.
I haven't been successful yet, but that is a lack of experience with
all the switches and BCP overall.
Once I got the database attached, things started to gel.
I really appreciate your time, JT.
Have a good day.

Importing CSV file

I'm a newbie to SSIS, so there is probably an easy solution:

I'm importing from a csv file into a db table, but would like to remove the quotation marks from around the data before inserting into the table.

Are there any suggestions on how to remove these quote marks? Would a Foreach Loop container work for this?

Set the text qualifier in the Flat File connection manager to ".
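Setting the text qualifier to " is the right fix inside SSIS. For cases where the file must be pre-processed outside SSIS, the same behaviour can be sketched with Python's csv module, which treats " as the text qualifier by default and strips it; the sample line below is illustrative:

```python
import csv
import io

# One line from a quote-qualified CSV file (values are made up)
raw = '"FHS000","20041029","Pason-DataHub"\n'

# csv.reader strips the " qualifier automatically
row = next(csv.reader(io.StringIO(raw)))
print(row)  # ['FHS000', '20041029', 'Pason-DataHub']
```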

Importing CSV data into SQL database using DTS

I would like to import CSV data into a SQL data base using DTS. This
SQL database already contains the information however the CSV file
contains updated information.
data.csv:
Computer Name | Serial Number | Processors | Processor Description | Total Memory
computer 168  | 123456789     | 4          | Intel 3.06 Ghz        | 3048 MB

SQL Table:
ComputerName | SerialNumber | ProcNum | ProcessorType | Memory
Computer 168 | 123456789    | 2       | Intel 2.8 ghz | 2048 MB
As you can see, the memory, processor and other data were updated in
the CSV, which I need my import to handle:
- Find the computer name in the SQL database table (even though the field
names are different), and then update the SerialNumber, ProcNum,
ProcessorType and Memory information.
- Also make sure that if the computer name can't be found, a new row is
created with the new information. For example, if computer 218 is not in
the SQL database, it will be added with its information.
Hi Wildatom
"Wildatom" wrote:

> I would like to import CSV data into a SQL data base using DTS. This
> SQL database already contains the information however the CSV file
> contains updated information.
> data.csv
> Computer Name | Serial Number | Processors | Processor Description |
> Total Memory
> computer 168 123456789 4 Intel 3.06 Ghz
> 3048 MB
> SQL Table
> ComputerName | SerialNumber | ProcNum | ProcessorType | Memory
> Computer 168 123456789 2 Intel 2.8 ghz
> 2048 MB
>
> As you can see the memory and processor and other data were updated in
> the CSV which I need my import to:
> - Find the computer name in the SQL Database table (even know the field
> names are different), and then update the SerialNumber, ProcNum,
> ProcessorType, Memory information.
You need something called a staging table to load this data into. You can
then update the existing records with the values from the staging table when
they match on a (primary) key value ( ComputerName )
UPDATE l
SET SerialNumber = s.SerialNumber,
    ProcNum = s.ProcNum,
    ProcessorType = s.ProcessorType,
    Memory = s.Memory
FROM livetable l
JOIN stagingtable s ON s.ComputerName = l.ComputerName
WHERE l.SerialNumber <> s.SerialNumber
   OR l.ProcNum <> s.ProcNum
   OR l.ProcessorType <> s.ProcessorType
   OR l.Memory <> s.Memory

> - I also want to be sure if the asset name can't be found then it
> should create a new row with the new information. Example if computer
> 218 is not in the SQL database it will populate the database with the
> information.
>
Once you have updated the values you can insert anything that is not already
in the live table
INSERT livetable ( ComputerName, SerialNumber, ProcNum, ProcessorType, Memory )
SELECT s.ComputerName, s.SerialNumber, s.ProcNum, s.ProcessorType, s.Memory
FROM stagingtable s
LEFT JOIN livetable l ON s.ComputerName = l.ComputerName
WHERE l.ComputerName IS NULL
Alternatively you could use NOT EXISTS!
John
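The update-then-insert pattern John describes can be demonstrated end to end in a small script. Here SQLite stands in for SQL Server (so this is a sketch, not the production T-SQL), and the T-SQL UPDATE ... FROM join is rewritten as correlated subqueries so it runs anywhere; table and column names mirror the example above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE livetable (ComputerName TEXT PRIMARY KEY, ProcNum INT)")
cur.execute("CREATE TABLE stagingtable (ComputerName TEXT, ProcNum INT)")

# One stale live row, plus freshly loaded staging rows
cur.execute("INSERT INTO livetable VALUES ('computer 168', 2)")
cur.executemany("INSERT INTO stagingtable VALUES (?, ?)",
                [("computer 168", 4), ("computer 218", 2)])

# Step 1: update live rows that have a matching staging row
cur.execute("""
    UPDATE livetable
    SET ProcNum = (SELECT s.ProcNum FROM stagingtable s
                   WHERE s.ComputerName = livetable.ComputerName)
    WHERE ComputerName IN (SELECT ComputerName FROM stagingtable)
""")

# Step 2: insert staging rows that are not yet in the live table
cur.execute("""
    INSERT INTO livetable
    SELECT s.ComputerName, s.ProcNum FROM stagingtable s
    WHERE s.ComputerName NOT IN (SELECT ComputerName FROM livetable)
""")

rows = sorted(cur.execute("SELECT ComputerName, ProcNum FROM livetable"))
print(rows)  # [('computer 168', 4), ('computer 218', 2)]
```

The existing machine picks up its new processor count, and the previously unknown machine is inserted.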

Importing CSV Data into a SQL Server 2005 Database

I need to import csv data into a SQL Server 2005 database using SQL Server Management Studio Express. I can't find any menu options for accomplishing this. Is this a limitation of the Express edition, or am I missing something when I try to find this feature?

Thanks for any help provided.

You are missing DTS, which comes with Service Pack 1; the thread below covers how to get it. Also, a CSV file is comma delimited, so SQL Server sees null values for empty fields; you have to import into a temp table before the destination if you have a primary key defined in the destination table. Hope this helps.

http://forums.asp.net/thread/1407898.aspx

|||

Caddre: Thanks for the info. I obtained Service Pack 1 and located DTS. As I run the wizard, it looks like it is set up primarily to transfer data between databases or tables, rather than importing from csv. The only way that I see for importing from csv is to write a sql query. Is there a more direct way using the Import/Export Wizard? Or will I have to write a SQL query?

|||

That cannot be; it is a smaller version of a billion-dollar ETL (extraction, transformation and loading) tool, and even the previous version can do it. The link below shows you the steps with the 2000 version. It is not complicated: just choose a temp table first, or a table without a primary key. Hope this helps.

http://www.sqldts.com/default.aspx?276,4

|||

The screen capture images from the link you provided look very different from what I am seeing. I did download my version as described in the other thread and the filename/path matched. I captured some screen images that show what I am seeing. Here is the URL:

http://www.vagarden.com/importwizard.html

I can see how the program shown that you linked to would do the job for me, if I can get it.

Thanks for your help.

|||

I have been searching the Internet for a downloadable version of DTS without any luck.

Since it was not provided with the Express version of SQL Server 2005, I wonder if it is part of the full (Pro) version?

If not, does anyone know where DTS can be obtained?

If not, is it possible to write a sql query that would insert data to a MS SQL Server table from a csv file? If so, what might the query look like?

Thanks for any help that can be provided.

|||

Hi sorry,

There is a single query way with the OPENROWSET function but I was hoping to get you the DTS info so I forgot about this function. Try the link below for details. Hope this helps.

http://www.databasejournal.com/features/mssql/article.php/3584751

|||

Caddre, thanks for the info. This should work for my project.

Importing and merging csv using sql

Hi Gurus.

My client drops many files like this for me on a shared drive (M:) daily:
1_Test.csv
2_Test.csv
3_Test.csv

I would like your advice on how to write SQL code (which can include DTS if possible) that will take these files, merge them into one (since they have the same columns) and send them to another client as one file. But it must only take the files of the same date and must not resend files I have already sent. I need this to be an automated process, since the files are dumped onto the M: drive at midnight and I need to schedule the code as a job to run around 4h00am.

OK..

I would do all (and have...now where is that code) in a stored procedure.

I would

1. use xp_cmdshell to interrogate the directory and store all of the contents in a temp table
2. I would then parse it out and look for the files I want
3. I would then load these files to a temp table and audit them 1 at a time using bcp
4. When everything looked good, I would insert from the temp table to the final destination
5. I would then use xp_cmdshell again to create a datetime-stamped archive subfolder, and would move EVERYTHING to that archive (I'm a neat freak... well...)
6. Throughout the process I would echo out messages as to what's going on to a file, and on the way out I would bcp that file to a log table.

That's what I would do.

If you want I can dig up some code.

|||I would also go the route that Brett suggested; there might be some "features" that you could skip, but on the other hand you get quite flexible code if you follow his suggestions!

The only thing that I would do differently is that I would create one folder for the archived files and store them there (with dates as filenames), but this is just a matter of personal preference. :)

Why would you like to include DTS in this?
...I know that DTS is good for a lot of things, but personally I find scripting much more portable (this might have something to do with the fact that I haven't used DTS that much, so my opinion could change in the future).

Check out this page from Nigel Rivett:
http://www.nigelrivett.net/ImportTextFiles.html
...you should at least get an idea of how to approach the problem.

Pati

|||I would do like Brett Kaiser and query the directory where these files are kept, storing all the contents in a #temp table. What I would do differently is that I would use a directory structure: I would, for example, keep the files in C:\CSV, and once a file has been processed I would move it to another directory, say C:\CSV\SENT, using a stored procedure.

In the database say TEST, I would create a table test_csv with the same fields as the .csv file.

I will then use a procedure like this one; I have written it specially for you. The input parameter will come from the #temp table, which means this procedure will be called from another procedure that loops over the filenames field, and csv_test will be called from inside the loop.

-- SQL CODING BEGINS HERE
-- csv_test 'C:\CSV\Book2.csv'
create proc csv_test
@filename varchar(70)
as
begin

declare @cmd as varchar(200)
declare @cmd_insert as varchar(300)

-- double up embedded single quotes so the dynamic SQL below stays valid
SET @filename = replace(@filename, '''', '''''')
set @cmd_insert = 'BULK INSERT TEST.dbo.test_csv FROM ''' + @filename + ''' WITH ( FIELDTERMINATOR = '','', ROWTERMINATOR = ''\n'' )'
exec (@cmd_insert)

-- move the processed file to the SENT folder
set @cmd = 'xp_cmdshell ''move ' + @filename + ' C:\CSV\SENT'''
exec (@cmd)

end

-- SQL CODING ENDS HERE

And then, at last, transfer the rows to the destination table and truncate test_csv. Remember you can do everything using a procedure and schedule a DTS package to run at a specified time every day.
:)

Roshmi Choudhury
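As an aside, the merge-and-don't-resend part of the original question can also be sketched as a small standalone job. The sketch below is hypothetical: it uses a temp directory as a stand-in for the M: share, tracks already-sent files in a plain sent.log file, and omits the same-date filter for brevity:

```python
import csv
import os
import tempfile

# Build three sample drop files in a temp dir (a stand-in for the M: share)
drop = tempfile.mkdtemp()
for name in ("1_Test.csv", "2_Test.csv", "3_Test.csv"):
    with open(os.path.join(drop, name), "w", newline="") as f:
        csv.writer(f).writerow([name, "sample data"])

# sent.log remembers which files have already been sent across runs
sent_log = os.path.join(drop, "sent.log")
already_sent = set()
if os.path.exists(sent_log):
    already_sent = set(open(sent_log).read().split())

# Merge every not-yet-sent *_Test.csv into one output file
merged = os.path.join(drop, "merged.csv")
with open(merged, "w", newline="") as out:
    writer = csv.writer(out)
    for name in sorted(os.listdir(drop)):
        if name.endswith("_Test.csv") and name not in already_sent:
            with open(os.path.join(drop, name), newline="") as f:
                writer.writerows(csv.reader(f))
            already_sent.add(name)

with open(sent_log, "w") as f:
    f.write("\n".join(sorted(already_sent)))

print(sum(1 for _ in open(merged)))  # 3 rows on the first run
```

On a second run the three filenames are already in sent.log, so the merged file comes out empty; scheduling this (or the stored-procedure equivalents above) for 4h00am completes the picture.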

Sunday, February 19, 2012

Importing a text file using Express

I have a text file of postcodes in CSV format that I would like to import...
I am using SQL Server Express and do not have access to DTS.
I would prefer to import the information as opposed to linking to it; what
is the simplest way to get the text info into my table?
Thanks in advance...

try this... hope this helps
SELECT * FROM OpenDataSource( 'MSDASQL',
'Driver={Microsoft Text Driver (*.txt; *.csv)};
DefaultDir=C:\Test;').[C:\test]..[filename.csv]

|||"Omnibuzz" <Omnibuzz@.discussions.microsoft.com> wrote in message
news:7268C38F-0C91-4D60-9DD8-4F004C2A84DD@.microsoft.com...
> try this... hope this helps
> SELECT * FROM OpenDataSource( 'MSDASQL',
> 'Driver={Microsoft Text Driver (*.txt; *.csv)};
> DefaultDir=C:\Test;').[C:\test]..[filename.csv]
Thanks for the help, however I have never done this before and...
I've enabled OpenDataSource as required.
The file name is C:\PostCodes.CSV and has field names in the first row.
I am interested in the first three columns
PCode, Locality & State
So I have tried without success:
SELECT PCode, Locality, State FROM OpenDataSource( 'MSDASQL',
'Driver={Microsoft Text Driver (*.txt; *.csv)};
DefaultDir=C:\;').[C:\]..[PostCodes.csv]
With the following error returned:
OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Text Driver] Syntax error (missing operator) in query expression '`Tbl1002`.`Pcode` `Col1004`'.".
Msg 7321, Level 16, State 2, Server DIMENSION9150, Line 1
An error occurred while preparing the query "SELECT `Tbl1002`.`Pcode` `Col1004`, `Tbl1002`.`Locality` `Col1005`, `Tbl1002`.`State` `Col1006` FROM `C:\`\`PostCodes.csv` `Tbl1002`" for execution against OLE DB provider "MSDASQL" for linked server "(null)".

|||"McHenry" <mchenry@.mchenry.com> wrote in message
news:44462abf$0$16667$5a62ac22@.per-qv1-newsreader-01.iinet.net.au...
> "Omnibuzz" <Omnibuzz@.discussions.microsoft.com> wrote in message
> news:7268C38F-0C91-4D60-9DD8-4F004C2A84DD@.microsoft.com...
> Thanks for the help however I have never doen this before and...
> I've enabled OpenDataSource as required.
> The file name is C:\PostCodes.CSV and has field names in the first row.
> I am interested in the first three columns
> PCode, Locality & State
> So I have tried without success:
> SELECT PCode, Locality, State FROM OpenDataSource( 'MSDASQL',
> 'Driver={Microsoft Text Driver (*.txt; *.csv)};
> DefaultDir=C:\;').[C:\]..[PostCodes.csv]
> With the following error returned:
> OLE DB provider "MSDASQL" for linked server "(null)" returned message "[Microsoft][ODBC Text Driver] Syntax error (missing operator) in query expression '`Tbl1002`.`Pcode` `Col1004`'.".
> Msg 7321, Level 16, State 2, Server DIMENSION9150, Line 1
> An error occurred while preparing the query "SELECT `Tbl1002`.`Pcode` `Col1004`, `Tbl1002`.`Locality` `Col1005`, `Tbl1002`.`State` `Col1006` FROM `C:\`\`PostCodes.csv` `Tbl1002`" for execution against OLE DB provider "MSDASQL" for linked server "(null)".
>
Fixed and thanks for the help...
select * from OpenRowset('MSDASQL', 'Driver={Microsoft Text Driver (*.txt;
*.csv)};
DefaultDir=C:\;','select * from PostCodes.csv')
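When the goal is simply to pull a few named columns out of a headed CSV file outside the server, the same read can be sketched with Python's csv.DictReader; the sample data below is made up, standing in for C:\PostCodes.CSV:

```python
import csv
import io

# Stand-in for the postcode file: a header row followed by data rows
raw = """PCode,Locality,State,Comments
0800,DARWIN,NT,
0810,ALAWA,NT,
"""

reader = csv.DictReader(io.StringIO(raw))
rows = [(r["PCode"], r["Locality"], r["State"]) for r in reader]
print(rows)  # [('0800', 'DARWIN', 'NT'), ('0810', 'ALAWA', 'NT')]
```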