Wednesday, March 28, 2012

Importing Text File to SQL

Hi guys,

I am doing the following to read the data in a text file and insert it into SQL.

1) Open db connection
2) Open Text File
3) loop through text file all along inserting each row into the db
4) close the text file
5) close the db connection

However, the text file has over 400 rows/lines of data that need to be inserted into the db. Each line in the text file is a row in the db. At any rate, the above script times out. Is there a better, faster way to do this? I can't use BULK INSERT because I don't have the privileges for it.
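
Here is roughly what the loop looks like (a minimal sketch assuming classic ASP/VBScript with ADO; the connection string, file path, table name, and comma-delimited columns are just placeholders):

' Minimal sketch of the loop described above.
' Connection string, file path, table, and column layout are assumptions.
Const adVarChar = 200
Const adParamInput = 1
Const adExecuteNoRecords = 128
Const ForReading = 1

Dim conn, cmd, fso, ts, line, fields

Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"

' Prepare one parameterized command and reuse it for every row.
Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = conn
cmd.CommandText = "INSERT INTO ImportTable (Col1, Col2) VALUES (?, ?)"
cmd.Parameters.Append cmd.CreateParameter("@Col1", adVarChar, adParamInput, 100)
cmd.Parameters.Append cmd.CreateParameter("@Col2", adVarChar, adParamInput, 100)

Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set ts = fso.OpenTextFile(Server.MapPath("data.txt"), ForReading)

Do Until ts.AtEndOfStream
    line = ts.ReadLine
    fields = Split(line, ",")
    cmd.Parameters("@Col1").Value = fields(0)
    cmd.Parameters("@Col2").Value = fields(1)
    cmd.Execute , , adExecuteNoRecords   ' skip building a recordset for each INSERT
Loop

ts.Close
conn.Close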

Thanks in Advance!

DTS would do it. If you can get a DTS package set up and a procedure to run it, you can do it by uploading the file to a place the database can see it, and then running the proc that runs the DTS job.
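
From the ASP side that ends up being a single call, something like this (a rough sketch; the stored procedure name dbo.usp_RunImportPackage and the connection string are made up, and the proc itself is assumed to wrap the DTS execution on the server):

' Rough sketch of "run the proc that runs the DTS job" from an ASP page.
' dbo.usp_RunImportPackage is a hypothetical proc that kicks off the DTS package.
Const adExecuteNoRecords = 128

Dim conn
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"
conn.Execute "EXEC dbo.usp_RunImportPackage", , adExecuteNoRecords
conn.Close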

400 rows isn't much. I do a similar thing with up to 100,000 rows. Not an ideal thing, but it was right for the situation. I had to set the timeouts longer, which is what you can do also.

You need to set a longer timeout in three places (see the sketch after this list):

1) server.scripttimeout
2) connection timeout
3) command timeout
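
Something along these lines (a rough sketch in classic ASP/ADO; the values in seconds are just examples):

' Where each of the three timeouts is set.
Server.ScriptTimeout = 300          ' 1) ASP page timeout, in seconds

Dim conn, cmd
Set conn = Server.CreateObject("ADODB.Connection")
conn.ConnectionTimeout = 60         ' 2) how long to wait when opening the connection
conn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"

Set cmd = Server.CreateObject("ADODB.Command")
Set cmd.ActiveConnection = conn
cmd.CommandTimeout = 300            ' 3) how long each command may run before timing out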

Google for examples. 400 is not a lot, so it's not a bad way to do it. If you don't expect that number to grow much, just set the timeouts longer and be done with it.

Hi, I'm new to programming. I am developing an application that uses a text file as input for its data, so I need to import the text file into SQL Server before I can use it. Would you mind sharing a sample of your code for importing the text file and converting it to SQL?
