
Wednesday, March 21, 2012

Importing flat files to many tables

I'm trying to import a few thousand flat files into a few thousand tables in a SQL Server database, using SQL Server Business Intelligence Development Studio.

I'm using a Foreach Loop to read all the files in a directory.

The problem is that I can only insert the data from all the files into one table.

Does anyone know a way to target multiple tables, perhaps using some sort of variable?


Use the Multicast transformation in the data flow.

Kirk Haselden
Author "SQL Server Integration Services"


Yes, you can, although the approach differs depending on what you are trying to accomplish. If each file should go to a different table, drive the destination's data access mode from a table-name variable. If you want every file loaded into every table, use nested Foreach loops: for each file, for each table, load the data.
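
In case it helps to see the first pattern outside of SSIS, here is a minimal sketch in Python of "one file, one table", assuming pyodbc is installed, the flat files are CSVs with a header row, and a table named after each file already exists. The directory, server, and database names are hypothetical placeholders.

# Minimal sketch of the "one table per file" pattern outside SSIS.
# Assumes: pyodbc installed, CSV files with a header row, and a table
# named after each file already created. All names below are hypothetical.
import csv
import os
import pyodbc

SOURCE_DIR = r"C:\flatfiles"  # hypothetical directory holding the flat files
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Staging;Trusted_Connection=yes;"
)

with pyodbc.connect(CONN_STR, autocommit=False) as conn:
    cursor = conn.cursor()
    for file_name in os.listdir(SOURCE_DIR):
        if not file_name.lower().endswith(".csv"):
            continue
        # The file name plays the role of the SSIS table-name variable.
        table_name = os.path.splitext(file_name)[0]
        with open(os.path.join(SOURCE_DIR, file_name), newline="") as f:
            reader = csv.reader(f)
            header = next(reader)              # skip the header row
            rows = list(reader)
        if not rows:
            continue
        placeholders = ", ".join("?" for _ in header)
        cursor.executemany(
            f"INSERT INTO [{table_name}] VALUES ({placeholders})", rows
        )
        conn.commit()                          # commit once per file

The point it mirrors is the table-name-variable idea: the same loop variable that carries the current file name also decides which table receives the rows.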

Friday, March 9, 2012

Importing data from Paradox into SQL

I have a Paradox file with 270 thousand records. If I open this .db file in a Paradox viewer, it shows the correct number of rows.

I have an SSIS package which is supposed to read this .db file and insert it into a SQL 2005 table. But when I execute this package it goes into a loop, reading the file over and over again. The package fails after inserting some 10 million rows with the error message 'Not enough space on temporary disk'.

On examining the data transferred into SQL, there are duplicate rows.

I also used the Import and Export Wizard provided by SQL Server (thinking there might be some error in the package code) to transfer the data from the .db file to SQL, but it gives the same result (it goes into a loop).

I would appreciate any help with this problem. Let me know if you have any other questions.

Thanks
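
As an aside for anyone debugging the same symptom: a grouped count against the target table is a quick way to confirm how badly the rows were duplicated. The sketch below assumes pyodbc; the table and key-column names are hypothetical placeholders for the real import target.

# Quick check for duplicate rows after the import, assuming pyodbc.
# dbo.ParadoxImport and KeyColumn are hypothetical names for the real target.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Staging;Trusted_Connection=yes;"
)

with pyodbc.connect(CONN_STR) as conn:
    cursor = conn.cursor()
    cursor.execute(
        "SELECT KeyColumn, COUNT(*) AS copies "
        "FROM dbo.ParadoxImport "
        "GROUP BY KeyColumn "
        "HAVING COUNT(*) > 1"
    )
    for key_value, copies in cursor.fetchall():
        print(key_value, copies)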

Duplicate! Ignore!