
Wednesday, March 28, 2012

Importing Unicode data from Oracle Database

Hi everybody,

I am trying to import data from an Oracle database configured for Unicode.
When I run the package I get this error message:

Error at Data Flow Task [Source - ACCOUNT]: Column "ACCOUNT_CODE" cannot convert between unicode and non-unicode string data types.

How can I resolve this?

Regards.|||Open up your package and set the data type of the ACCOUNT_CODE field to DT_WSTR.

-Jamie|||Thanks for your help.

I have changed the data type in the Advanced Editor, but now I get this error:
The output column "ACCOUNT_CODE" on the error output has properties that do not match the properties of its corresponding data source column.
The data type appears as string and I cannot change the error output data type.

How can I resolve this?

Thank you.|||Any luck here? I'm having the same problem. Unicode coming in, bound for a text file, so I changed the output column data type to string. Trouble is, the error output column data type remains Unicode text stream, and I can't change it.|||Have you tried using the Data Conversion transformation component?|||

Indeed, using a Data Conversion component is the way to go. We do not perform implicit conversions.

We have had a number of requests for implicit conversions, and we're certainly looking at those. However, it is important to bear in mind that the more SSIS does implicitly, the fewer opportunities you have to capture and respond to errors at a fine grain of detail. Of course, there will be some business cases where such fine-grained error handling is less important than others; Unicode-to-ANSI conversion may well be one of them.
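For reference, the same explicit cast can also be written as a Derived Column expression; the column name, length, and code page below are illustrative, not taken from this thread:

(DT_STR, 50, 1252)ACCOUNT_CODE

The two numbers are the target length and code page; going the other way, (DT_WSTR, 50)SomeColumn produces a Unicode string.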

Donald

|||

Importing Unicode from Oracle 10g to SQL2005. The source column is defined as VARCHAR2, the destination as NVARCHAR.

I have tried a standard OLE DB input/output using the MS Oracle driver, and also a Data Conversion component specifying the input and output columns as DT_WSTR. The data flow runs with no errors, yet the data loaded into SQL is incorrect.

In Oracle, using the DUMP function to see the character code of the column value gives 151. Inspecting the same column in SQL Server after the import using the UNICODE function yields a value of 191.
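For anyone repeating the check, the queries looked something like this (table and column names are placeholders):

-- Oracle: character codes of the stored value
SELECT DUMP(account_code) FROM account;

-- SQL Server: code point of the first character after the import
SELECT UNICODE(SUBSTRING(ACCOUNT_CODE, 1, 1)) FROM dbo.ACCOUNT;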

Any other suggestions as to what I might try? Thanks

|||

Could you check which Locale ID and code page are set on your source connection to Oracle?

Thanks.

|||

LocaleID is English (United States)

Codepage is 1252.

Using SQL2005 SP2.

Thanks.

|||

OK. And do you know what code page the strings in the Oracle database are supposed to be associated with? If it is not 1252 (ANSI - Latin I) you can expect conversion errors.
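On the Oracle side, a query along these lines reports the database character set:

SELECT value FROM nls_database_parameters WHERE parameter = 'NLS_CHARACTERSET';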

Thanks.

|||

The Oracle CharacterSet is WE8ISO8859P1.

I strongly suspect the issue is the MS Oracle OLE DB driver. The source columns can contain characters equivalent to the decimal values 150, 151, 152, 153, which fall in the extended ASCII range.

My workaround is to translate these characters to standard ASCII characters within the source query, then translate back after the SQL insert is completed.
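For example, the source query could swap those characters for reversible stand-ins with Oracle's TRANSLATE function; the table and column names, and the stand-in characters, are only illustrative:

SELECT TRANSLATE(comment_text, CHR(150) || CHR(151) || CHR(152) || CHR(153), '[]{}') AS comment_text
FROM survey_comments;

After the SQL Server insert completes, a matching set of REPLACE calls maps the stand-ins back to the original characters.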

Thanks for your assistance.

|||Hi Marc,

How do you translate extended ASCII characters to standard ASCII characters within the source query?

This data import is giving me a lot of headaches!!! :-((

Your help will be much appreciated.

Thank you,

Nicole
|||Has anyone successfully gotten this to work? We started running into this issue last week when we began testing the move from Oracle 9.2.0.8 to 10.2.0.3 while migrating to SQL Server 2005.

Here are my character set settings for 9i:

NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET WE8ISO8859P1

and for 10g:

NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET AL32UTF8

Are these the settings everyone else is using? 10g defaults to the newer version.

Thanks for any help.


Importing text file

Hi all,
I'm importing a text file into a SQL table. The catch is, I want to run a UDF on one of the columns before it gets added to the table... I would like to do the importing and conversion in one step, and I'd prefer to do this using BULK INSERT, or bcp if I have to, as this will run frequently on the data. ANY help will be greatly appreciated.
Rival
Your best bet would be to use SQL Server Data Transformation Services (DTS),
as you would be able to perform the transformations required and import all
in one step. For more information refer to Data Transformation Services in
SQL Server Books Online (BOL). A good DTS reference site is www.sqldts.com.
--
PETER WARD
WARDY Inc.
www.wardyinc.com
"Rival" <anonymous@.discussions.microsoft.com> wrote in message
news:5D60CE30-BFA2-4F13-BE8D-64059DE91D82@.microsoft.com...
> Hi all,
> I'm importing a text-file into a SQL table, The catch is, I want to run a
UDF on one of the columns before it gets added to the table... I would like
to do the importing, and conversion in one step, and i'd prefer to do this
using bulk insert, or bcp if I have to, as this will run frequently on the
data. ANY help will be greatly appreciated.
|||Rival,
Bear in mind that some UDFs will not be set-based, i.e. they will operate row
by row and will be poor performers. Look into DTS, or bcp into a staging
table and then run a single INSERT...SELECT statement to your destination.
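A minimal sketch of the staging approach (table, file, and function names are placeholders):

BULK INSERT dbo.Staging
FROM 'C:\data\input.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- apply the UDF once, on the way to the destination
INSERT INTO dbo.Destination (col1, col2)
SELECT col1, dbo.fn_Convert(col2)
FROM dbo.Staging;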
Mark Allison, SQL Server MVP
http://www.markallison.co.uk
"Rival" <anonymous@.discussions.microsoft.com> wrote in message
news:5D60CE30-BFA2-4F13-BE8D-64059DE91D82@.microsoft.com...
> Hi all,
> I'm importing a text-file into a SQL table, The catch is, I want to run a
UDF on one of the columns before it gets added to the table... I would like
to do the importing, and conversion in one step, and i'd prefer to do this
using bulk insert, or bcp if I have to, as this will run frequently on the
data. ANY help will be greatly appreciated.

Wednesday, March 21, 2012

importing LongText fields

Hi all,
I've got a large db I downloaded in a .SQL format that came originally from
MySQL. I'm trying to alter the code to allow me to run it under MS-SQL.
I've got it mostly translated over, but I'm running into a problem with
MySQL's LONGTEXT data type. I created a user-defined data type (using SQL 2K5)
called LONGTEXT that I mapped to VARCHAR(8000), and I'm still running
into problems with fields being too long to enter.
I guess the crux of my question is this: is there any way to go over the
8000-character limitation?
Regards,
Scott McNair
"Scott McNair" <scott.mcnair@.sfmco.takethispartout.com> wrote in message
news:Xns95DAA939F3E2Dsfmco@.207.46.248.16...
> Hi all,
> I've got a large db I downloaded in a .SQL format that came originally
> from
> MySQL. I'm trying to alter the code to allow me to run it under MS-SQL.
> I've got it mostly translated over, but I'm running into a problem with
> MySQL's LONGTEXT data type. I created a unique data type (using SQL 2K5)
> called LONGTEXT that I assigned to a VARCHAR(8000), and I'm still running
> into problems with fields being too long to enter.
> I guess the crux of my question is this: is there any way to go over the
> 8000-character limitation?
> Regards,
> Scott McNair
Take a look at text and ntext. You have a 2GB/1GB limit respectively.
HTH
Rick Sawtell
MCT, MCSD, MCDBA
|||"Rick Sawtell" <quickening@.msn.com> wrote in news:ODEu8w#9EHA.2568
@.TK2MSFTNGP10.phx.gbl:

> Take a look at text and ntext. You have a 2GB/1GB limit respectively.
Perfect. I created a UDDT associating "longtext" with text, and it imports
more-or-less perfectly.
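For reference, the association can be created like this (a sketch of the approach, with 'longtext' as the alias name):

EXEC sp_addtype N'longtext', N'text';

With the alias in place, the CREATE TABLE statements from the MySQL dump that declare columns as longtext run unchanged.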
|||My apology for butting in on this reply, but I have the same issue: I need to store up to roughly 16000 characters in one field. I tried text and ntext, but the max size value shows as 16. Can you tell me how to set up that field to accept the 16000 characters I need, or more?
Thanks
Xaviervp
"Scott McNair" wrote:

> "Rick Sawtell" <quickening@.msn.com> wrote in news:ODEu8w#9EHA.2568
> @.TK2MSFTNGP10.phx.gbl:
>
> Perfect. I created a UDDT associating "longtext" with text, and it imports
> more-or-less perfectly.
>
|||As Rick stated, the max text/ntext/image size is 2GB. You don't need to do
anything special.
The reported 16-byte length is the default text-in-row length. The default
length of 16 will hold the pointer to the separately stored value.
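A quick way to confirm the column is not capped at 16 characters (a sketch with a hypothetical table; the varchar(max) cast requires SQL 2005):

CREATE TABLE dbo.Notes (id int PRIMARY KEY, body text);
INSERT INTO dbo.Notes (id, body)
VALUES (1, REPLICATE(CAST('x' AS varchar(max)), 16000));
SELECT DATALENGTH(body) FROM dbo.Notes; -- returns 16000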
Hope this helps.
Dan Guzman
SQL Server MVP
"xaviervp" <xaviervp@.discussions.microsoft.com> wrote in message
news:66C5708A-B2F2-44ED-BC46-727AFD090EEB@.microsoft.com...
> My apology for btting in this reply but I have the same issue where I need
> to
> store +- up to 16000 characters in one field and I tried text or ntext but
> the max size value is 16, can you tell me how to set up that field to
> accept
> the 16000 i need or more?
> Thanks
> Xaviervp
> "Scott McNair" wrote:


Monday, March 19, 2012

Importing Excel files into a database

Hi, I am very new to SSIS. I am trying to set up a package that I can schedule to run every hour or so; it will look for all Excel files in a certain folder and import them into a table on a SQL Server.

I managed to get it working, but my problem is that my data is survey answers and some columns contain a free-text comment.

I get these files on a weekly basis and some weeks the length of the longest comment makes ssis want to use a different data type for the comment column (sometimes it wants DT_NTEXT, other times it wants DT_WSTR).

As this column is filled out by a human, I guess apostrophes and other characters may affect this as well.

When I made the data flow task, I used the wizard on a file which contained about 8 weeks' worth of data. When I use one week's worth of data where the comment length is very low, the task gives a validation error saying the metadata is out of sync. If I go back and set the data type for that column to DT_WSTR and rerun the task, it works, but then when it tries to process a different week's worth of data it fails again.

Here is an example of an error I get when it thinks the data type is wrong.

[Source - 'raw data$' [1]] Error: The output column "Question ID50# (Verbatim)" (439) on the error output has properties that do not match the properties of its corresponding data source column.

I played around with the data types for a while and managed to get it to process the first file and then try to process the second file; in the second file it got past validation but then hit this error:

[Source - 'raw data$' [1]] Error: Failed to retrieve long data for column "Question ID3# (Verbatim)".

Is there a way to make it recalculate the data types by itself for each Excel file?

I am stuck trying to figure this one out.

Sorry if I haven't provided enough information; I am not sure which direction to head with this.

First, you have to implement an administrative policy: the same type of Excel file, with the same list of columns, must be used throughout your process. Then you can build the SSIS package with the DT_NTEXT data type (that way it can accept both small and large text strings) and run it whenever you need; you don't have to rebuild the package periodically or recreate it with the wizard as you describe.

In addition, using a Foreach Loop container you can automatically load a series of Excel files from a folder.

|||

Excel reports the metadata based on the actual data in the column. Unfortunately, the only workaround for the DT_WSTR / DT_NTEXT issue that I am aware of is to create two data flows, one with DT_WSTR and one with DT_NTEXT, and then use a precedence constraint to determine which one to execute.
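As background (this detail is my addition, not from the thread): the Jet provider behind the Excel connection manager guesses each column's type by sampling the first rows of the sheet (8 by default, governed by the TypeGuessRows registry value under HKLM\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel). If the longest sampled value fits in 255 characters the column comes through as DT_WSTR, otherwise as DT_NTEXT, which is why the guessed type changes from file to file. The connection string involved looks something like this (the path is a placeholder):

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\survey.xls;Extended Properties="Excel 8.0;HDR=YES"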

You might check out this thread for some additional detail.

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2043158&SiteID=1

Friday, March 9, 2012

Importing data from Oracle9i CLOB column to MS SQL Server text column

Hi everyone,

I encountered this error when trying to import data from Oracle to MS SQL Server with Enterprise Manager (version 8.0) using the DTS Import/Export Wizard:

Need to run the object to perform this operation
Code execution exception: EXCEPTION_ACCESS_VIOLATION

There are 508 rows in the Oracle table, and I did get the first 42 rows imported to SQL Server. Does anyone know what the above error message means, and what causes the rest of the rows to fail importing?

Thanks very much in advance!

Rene Z.

--
Posted via http://dbforums.com

"yzhao12" <member46605@.dbforums.com> wrote in message
news:3547161.1067635623@.dbforums.com...
> Hi everyone,
>
> I encountered an error "Need to run the object to perform this operation
> Code execution exception: EXCEPTION_ACCESS_VIOLATION
> " When I try to import data from Oracle to MS SQL Server with Enterprise
> Manager (version 8.0) using DTS Import/Export Wizard. There are 508 rows
> in Oracle table and I did get first 42 rows imported to SQL Server.
> Anyone knows what does the above error message mean and what causes the
> rest of the row failed importing?
>
> Thanks very much in advance!
>
> Rene Z.
>
> --
> Posted via http://dbforums.com

This error is described here:

http://support.microsoft.com/?kbid=271889

However, since the KB article refers to running a package from VB while
you're using the Import/Export wizard, I'm not sure if this will help. You
could try saving the package from the wizard, instead of executing it, then
using this workaround:

http://www.databasejournal.com/feat...e.php/1461391#5
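As an additional idea (purely a sketch, not something the linked articles cover): when an OLE DB driver has trouble fetching CLOB columns, it can help to avoid the LOB fetch entirely by chunking or truncating the value in the Oracle source query with DBMS_LOB.SUBSTR; the names below are placeholders:

SELECT id, DBMS_LOB.SUBSTR(clob_col, 4000, 1) AS clob_part
FROM source_table;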

If that doesn't help, then I suggest you post to
microsoft.public.sqlserver.dts - you may get a better answer there.

Simon