
Monday, March 26, 2012

Importing Selected csv data into SQL tables

Hi,

I hope you can help me.

I've got a csv that I need to import. It's about 74 columns wide and 2500 rows long. What I need to do is select the first 3 columns from the csv and put them into one table, and then put every block of 7 columns (plus the 1st column as the primary key) into a different table.

I've not explained that very well, basically it's like this...

(a spreadsheet grid with columns A through P and rows 1 to 5)

Table 1 Data

cs# A1
fe B1
cl C1

Table 2

cs# A1
desc D1
date E1
pay F1
amount G1
vat H1
total I1
cReq J1

I'm not really sure where to start. I want to import the data nightly into the tables. I've tried to use the import wizard, but I can't seem to select the data I want and exclude the data I don't.

Steve

Which version of SQL are you using? If it's 2005 you should be able to do this neatly using Integration Services (SSIS).

For 2000 you might be best off dumping the whole thing into a holding table and then manipulating the data from there. This could be handled by several steps of a SQL Job.
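For the holding-table route, here is a rough T-SQL sketch. Every table, column, and file name below is made up for illustration; dbo.Holding would be a 74-column table (col01...col74) mirroring the csv, created beforehand.

-- Land the whole csv in the holding table first
-- (FIRSTROW = 2 assumes the csv has a header row)
BULK INSERT dbo.Holding
FROM 'C:\import\nightly.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)

-- Table 1: the first three columns
INSERT INTO dbo.Table1 (cs, fe, cl)
SELECT col01, col02, col03 FROM dbo.Holding

-- Table 2: the key column plus the next block of seven
INSERT INTO dbo.Table2 (cs, [desc], [date], pay, amount, vat, total, cReq)
SELECT col01, col04, col05, col06, col07, col08, col09, col10 FROM dbo.Holding

A SQL Agent job can run these steps nightly, truncating dbo.Holding first.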

HTH!

|||Thanks for the reply,

Yeah I'm using 2005 sp2. Where can I find the integration services? I can't seem to find it.

Steve

|||You need to install SSIS (Integration Services) at the time of installing SQL Server 2005 (you need to choose which services to install). Go to Start > All Programs > Microsoft SQL Server 2005 > Configuration Tools > SQL Server Configuration Manager; under that you can see the list of SQL 2005 services, so check whether SSIS is present, else install it. After installing, connect in SSMS and choose Integration Services instead of Database Engine to connect to SSIS.

Thanxx
|||

Follow Deepak's instructions to get it installed if you haven't already. Then you can open up the Business Intelligence Development Studio (a Visual Studio shell) and select an Integration Services project. This will give you a graphical interface in which you can drag and drop different data sources (e.g. Excel and SQL Server) and data flow components (e.g. Copy Column and Export Column).

Take a look at the Books Online walkthroughs that will give you a flavour of the different tasks you can achieve and hopefully that will give you a good steer on what to do.

Good luck!

|||Thanks very much.

I feel more at home in Visual Studio!

I'll probably be back when I get stuck again.

Steve

Friday, March 23, 2012

Importing non-hidden rows from Excel?

I have a need to import only non-hidden rows from an excel spreadsheet. When I create the package in SSIS, it imports everything. Due to the use of the data on the spreadsheet, we cannot simply delete the data.

Is there a special setting in the Excel Source or Connection manager that can be set to "only import non-hidden rows"?

Also, how do I go about referencing the sheet by an index instead of the actual sheet name? The user changes the sheet name at random, but I know I only need the first two sheets in the file.

Thanks!

Matt Michuta

No, there is no setting like this. You're at the mercy of the Excel OLE DB provider here and it doesn't possess the functionality you are requesting. You will have to filter those rows out in the pipeline.

Not sure about the sheet index problem - I don't ever use the Excel provider. I suspect you can't do that either. Perhaps try the data access forum if no-one here knows: https://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=87&SiteID=1

-Jamie

|||You should be able to do this with a little bit of creative VBA coding in Excel. You can write a macro that will copy all the data to a sheet that has a static name, and link the action to an event or button. Since you can use sheet indexes and row properties in VBA you can get exactly what you need, and since the sheet you're copying to has a static name you can use the Excel source in SSIS to import it.

Importing More Than 10'000 Rows

Hi,
I have been tasked with moving a company database from MS Access to SQL
Server 2000. The problem I have is that there are some very large tables
which I need to import from the Access version to SQL Server. One of these,
for example, is just under 33'000 records in size. Is there a way that I can
import all of these records in one go? I can only import 10'000 at a time.
Many thanks in advance to anyone who might be able to help
Hi
Have you looked into DTS?
"Tomsky02" <Tomsky02@.discussions.microsoft.com> wrote in message
news:2EE92E2F-D6B2-461D-9D2F-743DD2030522@.microsoft.com...
> Hi,
> I have been tasked with moving a company database from MS Access to SQL
> Server 2000. The problem I have is that there are some very large tables
> which I need to import from the Access version to SQL Server. One of
> these,
> for example, is just under 33'000 records in size. Is there a way that I
> can
> import all of these records in one go. I can only import 10'000 at a time.
> Many thanks in advance to anyone who might be able to help
|||Hi Uri,
Thanks for the reply. I solved the problem. I was being a bit of an idiot
and had the number of records to be displayed in my forms as 10000. I upped
this and everything is ok now.
Thanks,
Tom.
"Uri Dimant" wrote:

> Hi
> Have you looked into DTS?
> "Tomsky02" <Tomsky02@.discussions.microsoft.com> wrote in message
> news:2EE92E2F-D6B2-461D-9D2F-743DD2030522@.microsoft.com...
>
>


Importing lots of rows :-)

Hi All,
I'm not sure how to approach this problem so if anyone could advise me
how to do it I would be very grateful.
I have an app that calls a web service. The webservice returns an array
of business objects.
So I have a collection of objects that might be, say, 10000 objects in
size.
I need to import all the business objects into an SQL Server 2005
database. The only way I know how to do this is to loop through each
object and call a stored procedure repeatedly that takes the appropriate
parameters. The whole import must be atomic. That is, if something goes
wrong, the whole import needs to be aborted.
I guess I have two questions:
1. Is repeatedly calling a stored procedure the best way to do this sort
of stuff? I know doing imports like this must be a pretty common
operation, but I don't know if using an SPROC is a dumb idea.
2. Is it ok to start a transaction and repeatedly call a SPROC, like,
10000 times? Is there a better way to go about this?
Maybe I'm worrying about nothing - but it just "feels" wrong to be
repeatedly calling an SPROC 10,000 times! :-) I was wondering if there
is a more elegant way?
Any advice anyone could offer would be very much appreciated.
Kindest Regards
Simon

|||Simon,
Try using SqlBulkCopy object (ado.net 2.0).
AMB
"Simon Harvey" wrote:
> Hi All,
> I'm not sure how to approach this problem so if anyone could advise me
> how to do it I would be very grateful.
> I have an app that calls a web service. The webservice returns an array
> of business objects.
> So I have a collections of objects that might be, say, 10000 objects in
> size.
> I need to import all the business objects into an SQL Server 2005
> database. The only way I know how to do this is to loop through each
> object and call a stored procedure repeatedly that takes the appropriate
> parameters. The whole import must be atomic. That is, if something goes
> wrong, the whole import needs to be aborted.
> I guess I have two questions:
> 1. Is repeatedly calling a stored procedure the best way to do this sort
> of stuff? I know doing imports like this must be a pretty common
> operation, but I don't know if using an SPROC is a dumb idea.
> 2. Is it ok to start a transaction and repeatedly call a SPROC, like,
> 10000 times? Is there a better what to go about this?
> Maybe I'm worrying about nothing - but it just "feels" wrong to be
> repeatedly calling an SPROC 10,000 times! :-) I was wondering if there
> is a more elegant way?
> Any advice anyone could offer would be very much appreciated.
> Kindest Regards
> Simon
>

|||You're right. You don't want to call the same stored procedure 10,000 times
if there is a way to accomplish the task as one unit of work. First off,
calling the stored procedure 10,000 times, while inside a transaction, will
consume and hold resources longer than necessary.
Look into SQLBulkCopy, as Alejandro offered.
--
Arnie Rowland, YACE*
"To be successful, your heart must accompany your knowledge."
*Yet Another Certification Exam
"Simon Harvey" <nothanks@.hotmail.com> wrote in message
news:ezy9dt4mGHA.732@.TK2MSFTNGP04.phx.gbl...
> Hi All,
> I'm not sure how to approach this problem so if anyone could advise me how
> to do it I would be very grateful.
> I have an app that calls a web service. The webservice returns an array of
> business objects.
> So I have a collections of objects that might be, say, 10000 objects in
> size.
> I need to import all the business objects into an SQL Server 2005
> database. The only way I know how to do this is to loop through each
> object and call a stored procedure repeatedly that takes the appropriate
> parameters. The whole import must be atomic. That is, if something goes
> wrong, the whole import needs to be aborted.
> I guess I have two questions:
> 1. Is repeatedly calling a stored procedure the best way to do this sort
> of stuff? I know doing imports like this must be a pretty common
> operation, but I don't know if using an SPROC is a dumb idea.
> 2. Is it ok to start a transaction and repeatedly call a SPROC, like,
> 10000 times? Is there a better what to go about this?
> Maybe I'm worrying about nothing - but it just "feels" wrong to be
> repeatedly calling an SPROC 10,000 times! :-) I was wondering if there is
> a more elegant way?
> Any advice anyone could offer would be very much appreciated.
> Kindest Regards
> Simon

|||If your data is coming in as one large XML document, XML Bulk Load might be
another alternative. The final result is the same as SQLBulkCopy but the
input is an XML document.
--
This posting is provided "AS IS" with no warranties, and confers no rights.
Use of included script samples are subject to the terms specified at
http://www.microsoft.com/info/cpyright.htm
"Arnie Rowland" <arnie@.1568.com> wrote in message
news:eyYZQd5mGHA.1204@.TK2MSFTNGP03.phx.gbl...
> You're right. You don't want to call the same stored procedure 10,000
> times if there is way to accomplish the task as one unit of work. First
> off, calling the stored procedure 10,000 times -while inside a
> transaction, will consume and hold more resources longer than necessary.
> Look into SQLBulkCopy as Alejandro offered
> --
> Arnie Rowland, YACE*
> "To be successful, your heart must accompany your knowledge."
> *Yet Another Certification Exam
>
> "Simon Harvey" <nothanks@.hotmail.com> wrote in message
> news:ezy9dt4mGHA.732@.TK2MSFTNGP04.phx.gbl...
>> Hi All,
>> I'm not sure how to approach this problem so if anyone could advise me
>> how to do it I would be very grateful.
>> I have an app that calls a web service. The webservice returns an array
>> of business objects.
>> So I have a collections of objects that might be, say, 10000 objects in
>> size.
>> I need to import all the business objects into an SQL Server 2005
>> database. The only way I know how to do this is to loop through each
>> object and call a stored procedure repeatedly that takes the appropriate
>> parameters. The whole import must be atomic. That is, if something goes
>> wrong, the whole import needs to be aborted.
>> I guess I have two questions:
>> 1. Is repeatedly calling a stored procedure the best way to do this sort
>> of stuff? I know doing imports like this must be a pretty common
>> operation, but I don't know if using an SPROC is a dumb idea.
>> 2. Is it ok to start a transaction and repeatedly call a SPROC, like,
>> 10000 times? Is there a better what to go about this?
>> Maybe I'm worrying about nothing - but it just "feels" wrong to be
>> repeatedly calling an SPROC 10,000 times! :-) I was wondering if there is
>> a more elegant way?
>> Any advice anyone could offer would be very much appreciated.
>> Kindest Regards
>> Simon
>

|||Thanks guys!
That XML one seems even more interesting actually, although I'm guessing
it might be quite hard to set up the mappings, especially seeing as the
data is quite hierarchical in nature.
Many thanks for your advice!
Simon
Roger Wolter[MSFT] wrote:
> If your data is coming in as one large XML document, XML Bulk Load might be
> another alternative. The final result is the same as SQLBulkCopy but the
> input is an XML document.
>
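For reference, a minimal T-SQL sketch of the XML route using OPENXML, which is one way to do this on 2005. The procedure, table, and element names are all hypothetical, and TRY/CATCH error handling is omitted for brevity:

CREATE PROCEDURE dbo.ImportObjects @xml nvarchar(max)
AS
BEGIN
    DECLARE @hdoc int
    EXEC sp_xml_preparedocument @hdoc OUTPUT, @xml

    BEGIN TRAN
    -- One set-based insert instead of 10,000 single-row procedure calls
    INSERT INTO dbo.BusinessObjects (Id, Name, Amount)
    SELECT Id, Name, Amount
    FROM OPENXML(@hdoc, '/Objects/Object', 2)
    WITH (Id int, Name varchar(100), Amount money)
    COMMIT TRAN

    EXEC sp_xml_removedocument @hdoc
END

Because it is a single INSERT inside one transaction, the whole load stays atomic without the 10,000 round trips.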


Monday, March 19, 2012

Importing Excel data that spans rows

Hi,

I need to import and transform data from an Excel spread sheet where the information spans two rows. The file layout is something like:

Row1: Product1 QTY Store1 Store2 Store3 ...
Row2: Product1 AMT Store1 Store2 Store3
Row3: Product2 QTY Store1 Store2 Store3
Row4: Product2 AMT Store1 Store2 Store3

The output would look like

Product1 Store1 QTY AMT
Product1 Store2 QTY AMT
...
Product2 Store1 QTY AMT
Product2 Store2 QTY AMT
...

We currently use a VB6 program using Office tools to handle this. Is there a way to handle this with out-of-the-box SSIS?

Thanks in advance,

John

The Script data flow component seems like a good candidate. You can probably reuse a lot of your VB code.

Thanks.

|||

Bob,

Do you need Visual Studio Tools for Office 2005 to access the worksheet from the script task?

John

|||

Not necessarily.

You may get the data as it is using the SSIS Excel Source, and then manipulate the loaded data in the Script Component.

Thanks.
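If the sheet is first landed as-is into a staging table, the reshaping can also be done in T-SQL on 2005; a sketch, with every name below hypothetical:

-- Staging table loaded straight from the Excel source:
-- product, measure type (QTY/AMT), then one column per store
CREATE TABLE dbo.StagingSheet
( Product varchar(50), Measure varchar(10),
  Store1 money, Store2 money, Store3 money )

;WITH Unpivoted AS
( SELECT Product, Measure, Store, Value
  FROM dbo.StagingSheet
  UNPIVOT (Value FOR Store IN (Store1, Store2, Store3)) AS u )
SELECT q.Product, q.Store, q.Value AS Qty, a.Value AS Amt
FROM Unpivoted q
JOIN Unpivoted a ON a.Product = q.Product AND a.Store = q.Store
WHERE q.Measure = 'QTY' AND a.Measure = 'AMT'

The QTY row and the AMT row for each product pair up on Product and Store, giving the Product / Store / QTY / AMT shape described above.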

|||Hi John,

As you've already discovered, SSIS does not provide an out-of-the-box solution to your problem. I think you may find DataDefractor a useful tool to do just that. It is a custom SSIS data source component designed to extract and normalize data captured in semi-structured data sources such as Excel and CSV data reports. With the help of its example-driven user interface you can map out a flexible schema to extract data straight out of your data source without writing one line of code.

You can download a free trial of DataDefractor at http://www.datadefractor.com

Cheers,
Vassil Kovatchev


Monday, March 12, 2012

Importing data whilst ignoring constraints

Hi all,
I'm trying to import rows in tables from one database into a few tables in
another database
The problem is that the import is failing because constraints are being breached
no matter what order I try and import the table.
I'm sure that there must be a way of switching off constraint checking whilst
importing the data, but at the moment I can't see it. I'm in a catch 22 situation
at the moment, with data in each table depending on data that hasn't been
imported yet.
Any help would be very much appreciated.
Many thanks all
S
> The problem is that the import is failing because constraints are being
> breached no matter what order I try and import the table.
You ought to be able to successfully import data with foreign key
constraints enabled as long as you are doing this in the correct order.
You can enable and disable foreign key constraints with ALTER TABLE like the
example below. See ALTER TABLE in the Books Online for more info.
--disable FK
ALTER TABLE MyTable
NOCHECK CONSTRAINT ALL
--enable FK and validate data
ALTER TABLE MyTable
WITH CHECK
CHECK CONSTRAINT ALL
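To flip every table in the database at once, the undocumented sp_MSforeachtable procedure is often used; a sketch (unsupported, so use with care):

--disable all FKs in the database
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'
--import the data, then re-enable and validate everything
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'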
Hope this helps.
Dan Guzman
SQL Server MVP
"SimonH" <none@.hotmail.com> wrote in message
news:39680632423364747500000@.news.microsoft.com...
> Hi all,
> I'm trying to import rows in tables from one database into a few tables in
> another database
> The problem is that the import is failing because constraints are being
> breached no matter what order I try and import the table.
> I'm sure that there must be a way of switching off constraint checking
> whilst importing the data, but at the moment I can't see it. I'm in a
> catch 22 situation at the moment, with data in each table depending on
> data that hasnt been imported yet.
> Any help would be very much appreciated.
> Many thanks all
> S
>


Friday, March 9, 2012

Importing data from Paradox into SQL

I have a Paradox file which has 270 thousand records. If I open this db file through a paradox viewer it shows the correct number of rows.

I have an SSIS package which is supposed to read this db file and insert it into the SQL 2005 table. But when I execute this package it goes into a loop, reading the file over and over again. The package fails after inserting some 10 million rows, giving the error msg 'Not enough space on temporary disk'.

On examining the data transferred into SQL there are duplicate rows.

I also used the import/export wizard provided by SQL (thinking there might be some error in the package code) to transfer the data from .db to SQL, but it has the same result (it goes into a loop).

I would appreciate any help in this problem. Let me know if you have any other questions.

Thanks

Duplicate! Ignore!

Friday, February 24, 2012

Importing CSV files

Hello,

I need to import a bunch of .csv files. The problem I am having is the
"non data" information in the files creating bogus rows and column
definitions. Here is an example of the csv file.

CBOT - End-of-Day Futures Bulk Download 2001.
2 Year U.S. Treasury Notes Futures

Date Symbol Month Code Year Code Open
20010103 ZT H 2001 102.09375
20010104 ZT H 2001 102.03125
20010105 ZT H 2001 102.28125

In this case, there are bogues rows created with the text at the
beginning of the file, and also the column names get placed into a row
as well. My question is; how do you import the file and strip out the
"non-data" data? so that only the actual data gets inserted into the db?

Thanks,

TGru

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!

Write a dotnet or vb exe to loop thru your csv file and place valid
rows in a separate file, then run bcp using this new file.

|||Use BCP or BULK INSERT with the FIRSTROW (-F) option.

--
David Portas
SQL Server MVP
--

|||tgru (tgru@.devdex.com) writes:
> I need to import a bunch of .csv files. The problem I am having is the
> "non data" information in the files creating bogus rows and column
> definitions. Here is an example of the csv file.
> CBOT - End-of-Day Futures Bulk Download 2001.
> 2 Year U.S. Treasury Notes Futures
> Date Symbol Month Code Year Code Open
> 20010103 ZT H 2001 102.09375
> 20010104 ZT H 2001 102.03125
> 20010105 ZT H 2001 102.28125

That doesn't look like a CSV file to me...

> In this case, there are bogues rows created with the text at the
> beginning of the file, and also the column names get placed into a row
> as well. My question is; how do you import the file and strip out the
> "non-data" data? so that only the actual data gets inserted into the db?

If it were only the column headers, you could use -F 2 with BCP to specify
that BCP is to start with the second record. (I believe -F is the
option. Double-check with Books Online.) But the introducing text is
more difficult to handle. Assuming that there are no tabs in the text,
BCP may consider the row as an error. Then again, you can control how
many errors BCP may accept, so if you only can make sure that you get
in sync, it may be possible.

However, hharry's suggestion that you write a program that strips the
header info is probably a safer way.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp

|||"tgru" <tgru@.devdex.com> wrote in message news:421f97a6$1_2@.127.0.0.1...
> Hello,
> I need to import a bunch of .csv files. The problem I am having is the
> "non data" information in the files creating bogus rows and column
> definitions. Here is an example of the csv file.
> CBOT - End-of-Day Futures Bulk Download 2001.
> 2 Year U.S. Treasury Notes Futures
> Date Symbol Month Code Year Code Open
> 20010103 ZT H 2001 102.09375
> 20010104 ZT H 2001 102.03125
> 20010105 ZT H 2001 102.28125
>
> In this case, there are bogues rows created with the text at the
> beginning of the file, and also the column names get placed into a row
> as well. My question is; how do you import the file and strip out the
> "non-data" data? so that only the actual data gets inserted into the db?
> Thanks,
> TGru
> *** Sent via Developersdex http://www.developersdex.com ***
> Don't just participate in USENET...get rewarded for it!

Take a look at AWK (I use the GNU version, GAWK).

This is an old, old text stream (file) processing utility that uses regular
expressions to let you filter and/or reformat data.

You could have it create a new import file that ignores blank lines and
header lines (i.e. lines starting with "CBOT").
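To round off the BCP/BULK INSERT suggestions above, a minimal T-SQL sketch (the table name, path, and delimiters are assumptions):

-- FIRSTROW = 4 assumes two title lines plus one column-header line
-- before the data. Note Erland's caveat: FIRSTROW counts parsed rows,
-- so this only works if the preamble lines still parse as rows;
-- otherwise pre-strip them (hharry's approach) or raise MAXERRORS.
BULK INSERT dbo.Futures
FROM 'C:\import\cbot.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n', FIRSTROW = 4)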