
Wednesday, March 28, 2012

Importing unicode data from Oracle Database

Hi everybody,

I'm trying to import data from an Oracle database configured for Unicode.
When I run the package I get this error message:

Error at Data Flow Task [Source - ACCOUNT]: Column "ACCOUNT_CODE" cannot convert between unicode and non-unicode string data types.

How can I resolve this?

Regards.|||Open up your package and set the data type of the ACCOUNT_CODE field to DT_WSTR.

-Jamie|||Thanks for your help.

I have changed the data type in the Advanced Editor, but now I get this error:
the output column "ACCOUNT_CODE" on the error output has properties that do not match the properties of its corresponding data source column.
The data appears as string, and I cannot change the error output data type.

How can I resolve this?

Thank you.|||Any luck here? I'm having the same problem. Unicode coming in, bound for a text file, so I change the output column data type to string. Trouble is, the error output column data type remains Unicode text stream, and I can't change it.|||Have you tried using the Data Conversion transformation component?|||

Indeed, using a Data Conversion component is the way to go. We do not perform implicit conversions.

We have had a number of requests for implicit conversions and we're certainly looking at those. However, it is important to bear in mind that the more SSIS does implicitly, the fewer opportunities you have to capture and respond to errors at a fine grain of detail. Of course, there will be some business cases where such fine-grained error handling is less important than others - Unicode-to-ANSI conversion may well be one of them.

Donald

|||

Importing unicode from Oracle 10g to SQL2005. Source column is defined as VARCHAR2, destination as NVARCHAR.

I have tried a standard OLEDB input/output using the MS Oracle driver, and also with a data conversion component specifying the input and output columns as DT_WSTR. The data-flow runs with no errors, yet the data loaded into SQL is incorrect.

In Oracle, using the DUMP function to see the stored value of the column shows 151. Inspecting the column in SQL Server after the import using the UNICODE function yields a value of 191.
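For reference, the two checks look something like this (the table and column names are placeholders, since the originals aren't shown in the thread):

-- Oracle: show the byte values the column actually stores
SELECT DUMP(the_column) FROM the_table;

-- SQL Server: show the code point of the first character after the import
SELECT UNICODE(the_column) FROM dbo.the_table;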

Any other suggestions as to what I might try? Thanks

|||

Could you check what Locale ID and code pages are set on your source connecting to Oracle?

Thanks.

|||

LocaleID is English (United States)

Codepage is 1252.

Using SQL2005 SP2.

thanks

|||

OK. And do you know what code page your string in the Oracle database is supposed to be associated with? If it is not 1252 (ANSI - Latin I) you can expect conversion errors.

Thanks.

|||

The Oracle CharacterSet is WE8ISO8859P1.

I strongly suspect the issue is the MS Oracle OLE DB driver. The source columns can contain characters equivalent to the decimal values 150, 151, 152, 153 - which are extended ASCII.

My workaround is to translate these characters to standard ASCII characters within the source query, then translate back after the SQL insert is completed.
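The thread doesn't include the actual queries; sketched with an assumed column name, ASCII placeholder characters that never occur in the real data, and Windows-1252 as the code page the original bytes came from, the round trip could look like this:

-- Oracle source query: swap the four extended characters for ASCII placeholders
SELECT TRANSLATE(comment_col,
                 CHR(150) || CHR(151) || CHR(152) || CHR(153),
                 '{}~*') AS comment_col
FROM source_table;

-- SQL Server, after the insert: swap each placeholder back. SQL 2005 has no
-- TRANSLATE, so nested REPLACEs are used; the mapping must be one-to-one.
UPDATE dbo.target_table
SET comment_col =
    REPLACE(REPLACE(REPLACE(REPLACE(comment_col,
        '{', NCHAR(8211)),  -- en dash (byte 150 in Windows-1252)
        '}', NCHAR(8212)),  -- em dash (byte 151)
        '~', NCHAR(732)),   -- small tilde (byte 152)
        '*', NCHAR(8482));  -- trade mark sign (byte 153)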

Thanks for your assistance.

|||Hi Marc,

How do you translate extended ASCII characters to standard ASCII characters within the source query?

This data import is giving me a lot of headaches!!! :-((

Your help will be much appreciated.

Thank you,

Nicole
|||Has anyone successfully gotten this to work? We started running into this issue last week when we began testing the move from 9.2.0.8 to 10.2.0.3 while migrating to SQL Server 2005.

Here is my character set for 9i:

NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET WE8ISO8859P1

10g

NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET AL32UTF8

Are these the settings everyone else is using? 10g defaults to the newer version.

Thanks for any help.


Importing Text File: How to dynamically change the row delimiter

Hi,

I have a DTS package that imports a number of text files into a SQL Server 2000 database table. The package has been set up to accept a text file with a row delimiter of carriage return and line feed ({CR}{LF}). Some of the text files I receive only have a line feed ({LF}) as the row delimiter, and the DTS package rejects those files. Is there an ActiveX script I can use that will scan the file and change the row delimiter as required?

I was going to use the FileSystemObject, which allows me to read a line at a time; however, the ReadLine method doesn't return the newline character. The text files are too big to read into one variable and then do a replace.

Any help would be appreciated

I am using SQL Server 2000 SP3, Windows Server 2000 and Windows XP Professional. All systems are fully patched

Regards, Justin|||The solution is to identify and change the EOL character that is used in the file before the file is passed through the DTS package. The scripting object doesn't appear to contain anything useful that will identify the EOL character, so I have come up with this routine in VBA rather than VBScript. It reads a user-defined number of characters from a text file and returns the EOL character (only a carriage return or line feed).

Public Function gIdentifyEOLCharacter(strFileName As String, _
        lngNumOfCharactersToCheck As Long) As String
    ' identifies the end-of-line character used in a text file
    Dim fsoSysObj As Scripting.FileSystemObject
    Dim tStream As Scripting.TextStream
    Dim strText As String, strEOLCharacter As String

    On Error GoTo ErrorHere
    Set fsoSysObj = New Scripting.FileSystemObject
    Set tStream = fsoSysObj.OpenTextFile(strFileName, ForReading)
    strText = tStream.Read(lngNumOfCharactersToCheck)
    If InStr(strText, Chr$(13)) Then _
        strEOLCharacter = strEOLCharacter & "{CR}"
    If InStr(strText, Chr$(10)) Then _
        strEOLCharacter = strEOLCharacter & "{LF}"
    gIdentifyEOLCharacter = strEOLCharacter
ExitHere:
    If Not tStream Is Nothing Then tStream.Close   ' close the stream before releasing
    Set fsoSysObj = Nothing
    Set tStream = Nothing
    Exit Function
ErrorHere:
    MsgBox "Error In: Module 'basSupportFunctions'" & vbCrLf _
        & "Procedure: gIdentifyEOLCharacter" & vbCrLf _
        & "Error Code: " & Err.Number & vbCrLf _
        & "Error: " & Err.Description, vbExclamation, "Error Alert"
    gIdentifyEOLCharacter = vbNullString
    Resume ExitHere
End Function
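For example, a caller could use it like this to decide how to configure the connection before running the package (the path and character count are just illustrations):

Dim strEOL As String
strEOL = gIdentifyEOLCharacter("C:\data\import.txt", 4096)
If strEOL = "{LF}" Then
    ' the file uses bare line feeds - adjust the row delimiter accordingly
End If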

I can't believe that there is nothing simpler in the scripting object that can return the EOL character - I have looked through the object model and can't see anything that is useful.|||I think you can try FINDSTR with the /G:file parameter, where you'd have nothing but CRLF in the file. If FINDSTR returns a file name, it means that the specified file contains the normal EOL combination; otherwise, it doesn't. You can also have 2 files, one with CRLF and the other with just LF, so that if the first one produced 0 results you can do FINDSTR against the second and be certain that LF is the actual delimiter.|||This is just a wild guess, but you might run a cmdshell and do:

type inputfile.txt>newfile.txt

before importing. Type may convert LF to CRLF.

I also wonder if you could create a CR placeholder column with a default value of null and always specify LF as the line terminator.

If these files are coming from an FTP transfer, set the type to ASCII instead of binary in FTP to get CRLF terminators.

Friday, March 23, 2012

Importing Problems with XML and XSD into SQL 2000

I'm new to everything. I'm trying to import an XML file using the bulk load method in a VB script within a DTS package. It initially gave me errors of "relationship expected on 'EventList'", but after some research I specified a relationship within my XSD file for events with an annotation. However, it still gives me "the parent/child table of the relationship on 'EventList' does not match." I think the reason is that the parent/child relationship is not necessarily linked by any subelement key. Rather, one of the two "main" elements in this group, "Event", appears as a "sub" element or referenced element for the other "main" element, "EventList". What code can I use to establish the sql:relationship in a way that correctly describes the elements' relationship?
Here's part of the XSD, a little long, I apologize:
<xs:annotation>
  <xs:appinfo>
    <sql:relationship name="Events" parent="Event" parent-key="Event" child="EventList" child-key="Event"/>
  </xs:appinfo>
</xs:annotation>
<xs:element name="Event" sql:relation="Event" sql:relationship="Events">
  <xs:complexType>
    <xs:sequence>
      <xs:element ref="Action"/>
      <xs:element ref="Date"/>
      <xs:element ref="Time"/>
      <xs:element ref="SeqNo"/>
      <xs:element ref="ConnID"/>
      <xs:element ref="User"/>
      <xs:element ref="Code"/>
      <xs:choice>
        <xs:element ref="UserInfo"/>
        <xs:element ref="EntryInfo"/>
      </xs:choice>
    </xs:sequence>
    <xs:attribute name="ver" type="xs:string"/>
  </xs:complexType>
</xs:element>
<xs:element name="EventList" sql:relation="EventList" sql:relationship="Events">
  <xs:complexType>
    <xs:sequence>
      <xs:element ref="Event" minOccurs="0" maxOccurs="unbounded"/>
    </xs:sequence>
  </xs:complexType>
</xs:element>
thanks kind souls
Can you post a sample of the XML you're trying to import as well as the
table defs you're importing into?
Graeme Malcolm
Principal Technologist
Content Master Ltd.
"Mardi Morillo" <mmorillo@.mbcsystems.org> wrote in message
news:4B3F602E-394B-46F7-B711-FDDC69558389@.microsoft.com...
> Im new to everything I'm tring to import an XML file using the bulkload
method in an vb script within a DTS package. It initally gave me errors of
"relationship expected on 'EventList'" but after some research I specified a
relationship within my xsd file for events with an annotation. However, it
still gives me "the parent/child table of the relationship on 'EventList'
does not match." I think the reason is the parent child relationship is not
necessarly linked by any subelement key. Rather the one of the two "main"
elements in this group "Event" appears in as a "sub"element or referenced
element for the other "main"element "EventList." What code can i use to
establish the SQL:relationship in a way that correctly describes the
elements' relationship? Here's part of the xsd, a little long I apologize:
> <xs:annotation><xs:appinfo><sql:relationship name="Events" parent="Event"
parent-key="Event" child="EventList"
child-key="Event"/></xs:appinfo></xs:annotation><xs:element name="Event"
sql:relation="Event"
sql:relationship="Events"><xs:complexType><xs:sequ ence><xs:element
ref="Action"/><xs:element ref="Date"/><xs:element ref="Time"/><xs:element
ref="SeqNo"/><xs:element ref="ConnID"/><xs:element ref="User"/><xs:element
ref="Code"/><xs:choice><xs:element ref="UserInfo"/><xs:element
ref="EntryInfo"/></xs:choice></xs:sequence><xs:attribute name="ver"
type="xs:string"/></xs:complexType></xs:element><xs:element name="EventList"
sql:relation="EventList"
sql:relationship="Events"><xs:complexType><xs:sequ ence><xs:element
ref="Event" minOccurs="0"
maxOccurs="unbounded"/></xs:sequence></xs:complexType></xs:element>
> thanks kind souls
>
|||Here's some of the XML file. By the way, I converted the DTD to the XSD that I posted last time:
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE LaserFicheAuditTrail SYSTEM "LFAudit.dtd">
<LaserFicheAuditTrail Version="2.0">
<Database>MBC</Database>
<EventList>
<Event><Action>LOGIN</Action><Date>28-Feb-2004</Date><Time>22:35:47</Time><SeqNo>1</SeqNo><ConnID>1</ConnID><User>ADMIN</User><Code>7331</Code><UserInfo><ConnType>RW</ConnType></UserInfo></Event>
<Event><Action>LOGIN</Action><Date>28-Feb-2004</Date><Time>22:35:54</Time><SeqNo>2</SeqNo><ConnID>2</ConnID><User>ADMIN</User><Code>0</Code><UserInfo><ConnType>RW</ConnType></UserInfo></Event>
<Event><Action>LOGOUT</Action><Date>28-Feb-2004</Date><Time>22:36:17</Time><SeqNo>3</SeqNo><ConnID>2</ConnID><User>ADMIN</User><Code>0</Code><UserInfo><ConnType>RW</ConnType></UserInfo></Event>
<Event><Action>LOGIN</Action><Date>28-Feb-2004</Date><Time>22:37:05</Time><SeqNo>4</SeqNo><ConnID>3</ConnID><User>ADMIN</User><Code>7331</Code><UserInfo><ConnType>RW</ConnType></UserInfo></Event>
<Event><Action>LOGIN</Action><Date>28-Feb-2004</Date><Time>22:37:11</Time><SeqNo>5</SeqNo><ConnID>4</ConnID><User>ADMIN</User><Code>0</Code><UserInfo><ConnType>RW</ConnType></UserInfo></Event>
<Event><Action>VIEW</Action><Date>28-Feb-2004</Date><Time>22:37:14</Time><SeqNo>6</SeqNo><ConnID>4</ConnID><User>ADMIN</User><Code>0</Code><EntryInfo><EntryName>(1) SCANNED DOCUMENTS</EntryName><EntryID>23</EntryID><ParentID>1</ParentID><EntryType>FOLDER</EntryType></EntryInfo></Event>
<Event><Action>VIEW</Action><Date>28-Feb-2004</Date><Time>22:46:42</Time><SeqNo>7</SeqNo><ConnID>4</ConnID><User>ADMIN</User><Code>0</Code><EntryInfo><EntryName>(1) SCANNED DOCUMENTS</EntryName><EntryID>23</EntryID><ParentID>1</ParentID><EntryType>FOLDER</EntryType></EntryInfo></Event>
|||And the database tables?
Graeme Malcolm
Principal Technologist
Content Master Ltd.
"Mardi Morillo" <mmorillo@.mbcsystems.org> wrote in message
news:1C9D3C7F-C25E-46FC-A35E-2845454D5B5B@.microsoft.com...
> Heres some of the xml file. By the way i converted the dtd to an xsd that
i posted last time:
> <?xml version="1.0" encoding="ISO-8859-1"?><!DOCTYPE LaserFicheAuditTrail
SYSTEM "LFAudit.dtd"><LaserFicheAuditTrail
Version="2.0"><Database>MBC</Database><EventList><Event><Action>LOGIN</Actio
n><Date>28-Feb-2004</Date><Time>22:35:47</Time><SeqNo>1</SeqNo><ConnID>1</Co
nnID><User>ADMIN</User><Code>7331</Code><UserInfo><ConnType>RW</ConnType></U
serInfo></Event><Event><Action>LOGIN</Action><Date>28-Feb-2004</Date><Time>2
2:35:54</Time><SeqNo>2</SeqNo><ConnID>2</ConnID><User>ADMIN</User><Code>0</C
ode><UserInfo><ConnType>RW</ConnType></UserInfo></Event><Event><Action>LOGOU
T</Action><Date>28-Feb-2004</Date><Time>22:36:17</Time><SeqNo>3</SeqNo><Conn
ID>2</ConnID><User>ADMIN</User><Code>0</Code><UserInfo><ConnType>RW</ConnTyp
e></UserInfo></Event><Event><Action>LOGIN</Action><Date>28-Feb-2004</Date><T
ime>22:37:05</Time><SeqNo>4</SeqNo><ConnID>3</ConnID><User>ADMIN</User><Code
>7331</Code><UserInfo><ConnType>RW</ConnType></UserInfo></Event><Event><Acti
on>LOGIN</Action><Date>28-Feb-2004</Date><Time>22:37:11</Time><SeqNo>5</SeqN
o><ConnID>4</ConnID><User>ADMIN</User><Code>0</Code><UserInfo><ConnType>RW</
ConnType></UserInfo></Event><Event><Action>VIEW</Action><Date>28-Feb-2004</D
ate><Time>22:37:14</Time><SeqNo>6</SeqNo><ConnID>4</ConnID><User>ADMIN</User
><Code>0</Code><EntryInfo><EntryName>(1) SCANNED
DOCUMENTS</EntryName><EntryID>23</EntryID><ParentID>1</ParentID><EntryType>F
OLDER</EntryType></EntryInfo></Event><Event><Action>VIEW</Action><Date>28-Fe
b-2004</Date><Time>22:46:42</Time><SeqNo>7</SeqNo><ConnID>4</ConnID><User>AD
MIN</User><Code>0</Code><EntryInfo><EntryName>(1) SCANNED
DOCUMENTS</EntryName><EntryID>23</EntryID><ParentID>1</ParentID><EntryType>F
OLDER</EntryType></EntryInfo></Event>
|||CREATE TABLE AUDIT_XML_IMPORT
([Action]VARCHAR(100) NULL,
[AnnotateInfo]VARCHAR(100) NULL,
[AnnotColor]VARCHAR(100) NULL,
[AnnotHeight]VARCHAR(100) NULL,
[AnnotID]VARCHAR(100) NULL,
[AnnotLinked]VARCHAR(100) NULL,
[AnnotStampID]VARCHAR(100) NULL,
[AnnotStampName]VARCHAR(100) NULL,
[AnnotType]VARCHAR(100) NULL,
[AnnotWidth]VARCHAR(100) NULL,
[AnnotX]VARCHAR(100) NULL,
[AnnotY]VARCHAR(100) NULL,
[BackColor]VARCHAR(100) NULL,
[Code]VARCHAR(100) NULL,
[ConnID]VARCHAR(100) NULL,
[ConnType]VARCHAR(100) NULL,
[CopyInfo]VARCHAR(100) NULL,
[CreateInfo]VARCHAR(100) NULL,
[Database]VARCHAR(100) NULL,
[Date]VARCHAR(100) NULL,
[DestVolumeID]VARCHAR(100) NULL,
[DestVolumeName]VARCHAR(100) NULL,
[EntryID]VARCHAR(100) NULL,
[EntryInfo]VARCHAR(100) NULL,
[EntryName]VARCHAR(100) NULL,
[EntryType]VARCHAR(100) NULL,
[Event]VARCHAR(100) NULL,
[EventList]VARCHAR(100) NULL,
[ExportComment]VARCHAR(100) NULL,
[ExportInfo]VARCHAR(100) NULL,
[ExportMethod]VARCHAR(100) NULL,
[ExportPages]VARCHAR(100) NULL,
[ExportReason]VARCHAR(100) NULL,
[FieldAfter]VARCHAR(100) NULL,
[FieldBefore]VARCHAR(100) NULL,
[FieldChange]VARCHAR(100) NULL,
[FieldList]VARCHAR(100) NULL,
[FieldName]VARCHAR(100) NULL,
[FieldsInfo]VARCHAR(100) NULL,
[FieldType]VARCHAR(100) NULL,
[FuzzyParameter]VARCHAR(100) NULL,
[HiliteInfo]VARCHAR(100) NULL,
[HiliteTextInfo]VARCHAR(100) NULL,
[LaserFicheAuditTrail]VARCHAR(100) NULL,
[MigrateInfo]VARCHAR(100) NULL,
[ModifyType]VARCHAR(100) NULL,
[MoveInfo]VARCHAR(100) NULL,
[NewName]VARCHAR(100) NULL,
[NewParentID]VARCHAR(100) NULL,
[NewRefID]VARCHAR(100) NULL,
[NoteInfo]VARCHAR(100) NULL,
[NoteText]VARCHAR(100) NULL,
[NumPages]VARCHAR(100) NULL,
[Page]VARCHAR(100) NULL,
[ParentID]VARCHAR(100) NULL,
[RedactColor]VARCHAR(100) NULL,
[RedactInfo]VARCHAR(100) NULL,
[RedactTextInfo]VARCHAR(100) NULL,
[RenameInfo]VARCHAR(100) NULL,
[SearchInfo]VARCHAR(100) NULL,
[SearchString]VARCHAR(100) NULL,
[SeqNo]VARCHAR(100) NULL,
[SourceID]VARCHAR(100) NULL,
[SourceVolumeID]VARCHAR(100) NULL,
[SourceVolumeName]VARCHAR(100) NULL,
[StampInfo]VARCHAR(100) NULL,
[StampMasterInfo]VARCHAR(100) NULL,
[StampPersonalInfo]VARCHAR(100) NULL,
[StartPage]VARCHAR(100) NULL,
[TemplateID]VARCHAR(100) NULL,
[TemplateName]VARCHAR(100) NULL,
[TextColor]VARCHAR(100) NULL,
[TextEnd]VARCHAR(100) NULL,
[TextStart]VARCHAR(100) NULL,
[Time]VARCHAR(100) NULL,
[User]VARCHAR(100) NULL,
[UserInfo]VARCHAR(100) NULL,
[Version]VARCHAR(100) NULL,
[VolumeID]VARCHAR(100) NULL,
[VolumeName] VARCHAR(100) NULL)
GO
The problem is that you've only got a single table - the relationship you've declared in the schema is trying to join a table called Event to a table called EventList, neither of which exists in your database.
I'd suggest that the easiest approach would be to "flatten" the data using an XSLT before performing the bulk load. See http://sqlxml.org/faqs.aspx?faq=49.
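A minimal flattening stylesheet for this document, sketched here with just a few of the columns (the rest follow the same pattern; the wrapper element names are assumptions), might look like:

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/LaserFicheAuditTrail">
    <AuditRows>
      <xsl:for-each select="EventList/Event">
        <Row>
          <Action><xsl:value-of select="Action"/></Action>
          <Date><xsl:value-of select="Date"/></Date>
          <SeqNo><xsl:value-of select="SeqNo"/></SeqNo>
          <User><xsl:value-of select="User"/></User>
        </Row>
      </xsl:for-each>
    </AuditRows>
  </xsl:template>
</xsl:stylesheet>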
Graeme Malcolm
Principal Technologist
Content Master Ltd.
"Mardi" <mmorillo@.mbcsystems.org> wrote in message
news:ECC157F3-BD9E-4415-A248-8D85E6EE7A7B@.microsoft.com...
> CREATE TABLE AUDIT_XML_IMPORT
> ([Action]VARCHAR(100) NULL,
> [AnnotateInfo]VARCHAR(100) NULL,
> [AnnotColor]VARCHAR(100) NULL,
> [AnnotHeight]VARCHAR(100) NULL,
> [AnnotID]VARCHAR(100) NULL,
> [AnnotLinked]VARCHAR(100) NULL,
> [AnnotStampID]VARCHAR(100) NULL,
> [AnnotStampName]VARCHAR(100) NULL,
> [AnnotType]VARCHAR(100) NULL,
> [AnnotWidth]VARCHAR(100) NULL,
> [AnnotX]VARCHAR(100) NULL,
> [AnnotY]VARCHAR(100) NULL,
> [BackColor]VARCHAR(100) NULL,
> [Code]VARCHAR(100) NULL,
> [ConnID]VARCHAR(100) NULL,
> [ConnType]VARCHAR(100) NULL,
> [CopyInfo]VARCHAR(100) NULL,
> [CreateInfo]VARCHAR(100) NULL,
> [Database]VARCHAR(100) NULL,
> [Date]VARCHAR(100) NULL,
> [DestVolumeID]VARCHAR(100) NULL,
> [DestVolumeName]VARCHAR(100) NULL,
> [EntryID]VARCHAR(100) NULL,
> [EntryInfo]VARCHAR(100) NULL,
> [EntryName]VARCHAR(100) NULL,
> [EntryType]VARCHAR(100) NULL,
> [Event]VARCHAR(100) NULL,
> [EventList]VARCHAR(100) NULL,
> [ExportComment]VARCHAR(100) NULL,
> [ExportInfo]VARCHAR(100) NULL,
> [ExportMethod]VARCHAR(100) NULL,
> [ExportPages]VARCHAR(100) NULL,
> [ExportReason]VARCHAR(100) NULL,
> [FieldAfter]VARCHAR(100) NULL,
> [FieldBefore]VARCHAR(100) NULL,
> [FieldChange]VARCHAR(100) NULL,
> [FieldList]VARCHAR(100) NULL,
> [FieldName]VARCHAR(100) NULL,
> [FieldsInfo]VARCHAR(100) NULL,
> [FieldType]VARCHAR(100) NULL,
> [FuzzyParameter]VARCHAR(100) NULL,
> [HiliteInfo]VARCHAR(100) NULL,
> [HiliteTextInfo]VARCHAR(100) NULL,
> [LaserFicheAuditTrail]VARCHAR(100) NULL,
> [MigrateInfo]VARCHAR(100) NULL,
> [ModifyType]VARCHAR(100) NULL,
> [MoveInfo]VARCHAR(100) NULL,
> [NewName]VARCHAR(100) NULL,
> [NewParentID]VARCHAR(100) NULL,
> [NewRefID]VARCHAR(100) NULL,
> [NoteInfo]VARCHAR(100) NULL,
> [NoteText]VARCHAR(100) NULL,
> [NumPages]VARCHAR(100) NULL,
> [Page]VARCHAR(100) NULL,
> [ParentID]VARCHAR(100) NULL,
> [RedactColor]VARCHAR(100) NULL,
> [RedactInfo]VARCHAR(100) NULL,
> [RedactTextInfo]VARCHAR(100) NULL,
> [RenameInfo]VARCHAR(100) NULL,
> [SearchInfo]VARCHAR(100) NULL,
> [SearchString]VARCHAR(100) NULL,
> [SeqNo]VARCHAR(100) NULL,
> [SourceID]VARCHAR(100) NULL,
> [SourceVolumeID]VARCHAR(100) NULL,
> [SourceVolumeName]VARCHAR(100) NULL,
> [StampInfo]VARCHAR(100) NULL,
> [StampMasterInfo]VARCHAR(100) NULL,
> [StampPersonalInfo]VARCHAR(100) NULL,
> [StartPage]VARCHAR(100) NULL,
> [TemplateID]VARCHAR(100) NULL,
> [TemplateName]VARCHAR(100) NULL,
> [TextColor]VARCHAR(100) NULL,
> [TextEnd]VARCHAR(100) NULL,
> [TextStart]VARCHAR(100) NULL,
> [Time]VARCHAR(100) NULL,
> [User]VARCHAR(100) NULL,
> [UserInfo]VARCHAR(100) NULL,
> [Version]VARCHAR(100) NULL,
> [VolumeID]VARCHAR(100) NULL,
> [VolumeName]VARCHAR(100) NULL,)
> GO
>

Importing Pervasive SQL into SQL2005 via SSIS

In SQL 2000, I had a working DTS package that would import a Pervasive SQL database into SQL 2000 (There is a good reason, provided on request). The column type definitions came over just fine in SQL 2000 with a few minor changes.

In SQL 2005 (SSIS), I create a Data Source via the Connection Manager (Provider: .Net Providers\Odbc Data Provider) to the Pervasive database (System DSN, <database_odbc>). I then create a Data Destination via the Connection Manager (Provider: Native OLE DB\SQL Native Client) to the SQL database. Both databases reside on the same machine.

I've created a DataSource Reader and used the SQL command "select * from ARCustomer" as an example. The issue is with the "data types" for the columns. They don't come close to resembling the results that I had in SQL 2000 DTS.

Is there another method or parameter setting that will preserve the "data types" for the columns being imported from the Pervasive database.

This has been a real stumbling block and any help would be truly appreciated. Thanks in advance for your assistance ... Bernie

Connect a Data Conversion task before inserting into the destination, or set the metadata validation property to false on both tasks.|||

The issue is still with the Data Conversion. I've been reading many of the forums and it seems that, like myself, there are a lot of frustrated developers out there trying to make simple sql2000 dts packages work in sql2005 SSIS. Almost all of them point to the data conversion issue. What a real pain in the arse. I surely hope that Microsoft does something about this, SOON...

Something simple that would take less than an hour could turn into weeks and weeks of fruitless effort.

The client has a Pervasive SQL database that I did many tests on using SQL 2000 DTS. There are approximately 150 tables. My testing went very well. I recommended to the client that they purchase a new server, Win2003 R2, along with SQL 2005, advising them that my development time in SQL 2005, converting the tables with DTS, would be a much better process with the upgrade (is it really an upgrade when you have all this bickering over Unicode (data conversion) issues?).

I have spent countless hours trying to get this working in sql2005 ssis, with no success.

I wish in retrospect that I'd just had them get Win2000 & SQL2000. I would have been done with the project by now and I would have had a happy client. I feel like a damn fool, recommending a package that I thought would be a significant developer improvement, and have yet to show any results. This does not bode well for others like me at all.

|||

I have been testing and trying for days, almost a week, to create a simple, single-table SSIS package that I created in DTS in minutes. I'm very frustrated with how the whole process gets hung up in the data conversion.

I have a DataReader Source (an ODBC connection (System DSN) to a Pervasive SQL Database). It pulls the table in just fine. I have a DataFlow Task, using a Data Conversion. I have a SQL Server Destination (The table already exists in the SQL 2005 database).|||

Please do this; I did the same thing in my package.

Double-click on the Data Flow task,

then click on the source task and go to its Properties window.

There is a property called ValidateExternalMetadata: set it to False.

Repeat for the destination task properties.

|||

Your recommendation seems to work for Numerical Conversions.

BUT...

Not for string transformations. The strings come in as Unicode DT_WSTR.

In the Data Conversion Transformation Editor, I have to manually create a transformation for each and every instance, from DT_WSTR to DT_STR. Once I've done this, it passes muster when running in Debug. This would still require lots of tedious, manual, repetitive work.

Is there a way to force the DT_WSTR to DT_STR for all of these so i don't have to manually create a Transformation for each?

Or...

Do you have another suggestion on the STR issue that i might try?

Thanks too very much for your assistance!!!


Importing non-hidden rows from Excel?

I have a need to import only non-hidden rows from an excel spreadsheet. When I create the package in SSIS, it imports everything. Due to the use of the data on the spreadsheet, we cannot simply delete the data.

Is there a special setting in the Excel Source or Connection manager that can be set to "only import non-hidden rows"?

Also, how do I go about setting the sheet with an index instead of the actual Sheet name? The user changes the sheet name at random, but I know I only need the first two sheets on the file.

Thanks!

Matt Michuta

No, there is no setting like this. You are at the mercy of the Excel OLE DB Provider here, and it doesn't possess the functionality you are requesting. You will have to filter those rows out in the pipeline.

Not sure about the sheet index problem - I don't ever use the Excel provider. I suspect you can't do that either. Perhaps try the data access forum if no-one here knows: https://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=87&SiteID=1

-Jamie

|||You should be able to do this with a little bit of creative VBA coding in Excel. You can write a macro that will copy all the data to a sheet that has a static name, and link the action to an event or button. Since you can use sheet indexes and row properties in VBA you can get exactly what you need, and since the sheet you're copying to has a static name you can use the Excel source in SSIS to import it.
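A minimal sketch of such a macro (the staging sheet name is an assumption; SpecialCells(xlCellTypeVisible) is what skips the hidden rows):

Sub CopyVisibleRows()
    Dim wsSource As Worksheet, wsTarget As Worksheet
    Set wsSource = Worksheets(1)               ' first sheet, whatever its current name
    Set wsTarget = Worksheets("SSIS_Staging")  ' static name the SSIS package imports
    wsTarget.Cells.Clear
    wsSource.UsedRange.SpecialCells(xlCellTypeVisible).Copy _
        Destination:=wsTarget.Range("A1")
End Sub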

Monday, March 19, 2012

Importing Excel file using DTS use of VB package

Dear All,
I created a package for importing an Excel file.
While I created the package, the data was ported,
and I saved the package.
Now when I try to port the data by executing the package,
the file is not ported.
What could be the problem?
Help me......
Regards, Prem|||Any errors?
Is the file empty, are you trying to import the same file?
Instead of DTS you can also use OPENROWSET or OPENDATASOURCE
Here is an OPENROWSET example
INSERT INTO YourTable
SELECT * FROM OPENROWSET( 'Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\testing.xls','SELECT * FROM [Sheet1$]')
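And the OPENDATASOURCE form of the same load (same assumed path and sheet name) would be:

INSERT INTO YourTable
SELECT * FROM OPENDATASOURCE('Microsoft.Jet.OLEDB.4.0',
'Data Source=C:\testing.xls;Extended Properties=Excel 8.0')...[Sheet1$]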
Denis the SQL Menace
http://sqlservercode.blogspot.com/|||Also refer
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=49926
Madhivanan

Importing excel file into Database

Hi, I am very new to SSIS. I am trying to set up a package that I can schedule to run every hour or so. It will look for all Excel files in a certain folder and import them into a table on a SQL Server.

I managed to get it working but my problem is that my data is survey answers and some columns contain a comment.

I get these files on a weekly basis and some weeks the length of the longest comment makes ssis want to use a different data type for the comment column (sometimes it wants DT_NTEXT, other times it wants DT_WSTR).

As this column is filled out by a human, I guess apostrophes and other characters may affect this as well.

When I made the data flow task, I used the wizard on a file which contained about 8 weeks' worth of data. When I use one week's worth of data where the comment length is very low, the task gives a validation error saying the metadata is out of sync. If I go back and set the data type for that column to DT_WSTR and rerun the task, it works, but when it then tries to process a different week's worth of data it fails again.

here is an example of an error I get when it thinks the data type is wrong.

[Source - 'raw data$' [1]] Error: The output column "Question ID50# (Verbatim)" (439) on the error output has properties that do not match the properties of its corresponding data source column.

I played around with the data types for a while and managed to get it to process the first file and then try to process the secondfile, in the second file it got around the validation but then got this error:

[Source - 'raw data$' [1]] Error: Failed to retrieve long data for column "Question ID3# (Verbatim)".

Is there a way to make it recalculate the data types by itself for each Excel file?

I am stuck trying to figure this one out.

Sorry if I haven't provided enough information; I am not sure which direction to head with this.

First, you have to implement an administrative policy: the same type of Excel file, with the same list of columns, must be used in your activities. Then you can use an SSIS package created with the DT_NTEXT data type (this way it can take small or big text strings) and run it when you need it; you don't have to rebuild this SSIS package periodically or recreate it with the wizard as you describe.

In addition, using a Foreach Loop container you can automatically load a series of Excel files from a folder.

|||

Excel reports the metadata based on the actual data in the column. Unfortunately, the only workaround for the DT_WSTR / DT_NTEXT issue that I am aware of is to create two data flows, one with DT_WSTR and one with DT_NTEXT. Then use a precedence constraint to determine which one to execute.

You might check out this thread for some additional detail.

http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2043158&SiteID=1

Friday, March 9, 2012

Importing data from txt file

I have 30 text files with the same format, and I have to make an SSIS package to populate my permanent tables. Because they are just record-keeping data, we are not very concerned about validating it; we just want to copy it and dump it into the permanent tables, which should mean the least amount of work. I used one file as a sample and did advanced formatting to give column names. Then I did one simple data flow task that takes the flat file and transfers the data to an OLE DB SQL Server database table. Unfortunately, it keeps giving me an error, which I believe is because of NULLs. I want to take care of this without making staging tables and validating the data. Any ideas?

Are you running this manually from within your SSIS package? If so, you should get reasonably good information as to the problem in the Execution Results tab.

Are you sure your incoming ASCII fields are set wide enough to capture the data? If not, truncation errors will kill the process.

Are your table fields set up to accept NULLs?

Seth J Hersh

|||

Well, we couldn't find a set solution, so as a workaround we used a VB.NET script to pad each row to a set width. This seems to have solved the problem for now.
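The script itself isn't shown; inside an SSIS Script Component the padding step could be as small as this sketch (the column name and the width of 200 are assumptions):

' Script Component (VB.NET), row-by-row processing
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    ' pad every line to a fixed width so the downstream columns always exist
    Row.RawLine = Row.RawLine.PadRight(200)
End Sub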

Thank You

Anjali

|||

Ummm, it might help if you posted the error. This is a real simple task you are trying to do. (I think.)

Importing data from oracle to sql losing data after the decimal point

I have created a simple package that uses a SQL command to pull data from an Oracle database and inserts the data into a SQL 2005 table. Some of the data fields that I am pulling contain two digits after the decimal point; however, this data is lost when it gets into SQL. I have even tried putting the data into a flat file, and still the data is lost.

In the package I have an OLE DB source connection, which is the Oracle database, and when I do the preview I see all the data I need. I am very confused and have tried a number of things to get the data into SQL, but none work. Any ideas would be very helpful.

thanks

What is the data type of the SSIS column that you are storing the Oracle results in? Also double check that you have a "scale" defined for that data type.|||

I have tried money, decimal (38,2), float and numeric (18,2),

with no luck. What is really weird is that when I use the flat file destination the same thing happens: I lose all digits after the decimal.

Are you just going source to destination? No other transformations?|||That's correct.|||Do you receive any truncation warnings when you run the package? (Or ANY warnings?)|||

I don't get any warnings if I use numeric (38,2)

I let the package create the table. The package wants to use numeric (38), but I changed it to numeric (38,2) in the create table statement in the destination window.

Importing data from oracle 10g to SQL Server

Hi,

I created a SQL Server package and scheduled the job. SQL Server allows us to connect to different databases, e.g. Oracle, using the "Oracle Provider for OLE DB" to retrieve our data. A link is created between these databases which moves the data to SQL Server. I'm sure there are no issues with the link, as I'm using it to retrieve several Oracle databases which contain both Arabic and English data.

But after the import it is showing junk values. Please advise me what step I should take next.

Regards

Azeem

Did you already pump the data into a table? Did you make sure the table can hold Unicode data? (Data types have to be NVARCHAR / NCHAR or a similar data type supporting Unicode; look in BOL concerning the types which support storing Unicode, with the N at the beginning.)

Jens K. Suessmeyer

http://www.sqlserver2005.de

|||

You may need to alter your collations on the SQL Server side for unicode fields to ensure that they are appropriate for the text you are storing. Books online has more information about collations and unicode data.
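As a sketch (the table, column, and collation names are assumptions, not from the thread), a Unicode-capable target column could be declared like this:

CREATE TABLE dbo.ImportedAccounts
(
    AccountName NVARCHAR(100) COLLATE Arabic_CI_AS NULL
);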

|||

Hi Azeem,

Would you be able to provide more information or a sample of what you mean under "junk values"? Also - the collation and the nchar/nvarchar ideas mentioned above are definitely required prerequisites in this case.

HTH,
Jivko Dobrev - MSFT
--
This posting is provided "AS IS" with no warranties, and confers no rights.

|||

You are getting character conversion issues; you need to use NVARCHAR for the Arabic column and use an Arabic collation. You should also know that the Lam-Alef glyphs of the Arabic code pages do not exist in SQL Server, so you may need to use VS2005's Advanced Save As option. Post again if you still need help.


http://msdn2.microsoft.com/en-us/library/ms144250.aspx


http://msdn2.microsoft.com/en-us/library/ms180175.aspx


Wednesday, March 7, 2012

Importing data

I have created a DTS package for SQL Server, saved this as a VB file and upgraded this to the .NET framework.

It is now saved in a file called Shortages.vb.

What I want to do now is add this to an existing ASP.NET project and be able to call this DTS function by the click of a button.

Is this possible and how can I go about it? Can I just add it to the Click event of a button?|||You could write a simple stored procedure to call your DTS, and then execute your sproc from the web page button.|||I have considered that.

I tried to run it by calling it in the format below:

CREATE PROCEDURE doPublish AS
exec master..xp_cmdshell 'dtsrun /Ssql1 /NPublish /E'

This is what I got when I tried to run the DTS package within SQL Server Query Analyzer.

If I put the Excel file on the desktop it works fine, but it has a problem when it's on the H: drive (a shared drive):

DTSRun: Executing...
DTSRun OnStart: Create Table [GE].[dbo].[tester] Step
DTSRun OnFinish: Create Table [GE].[dbo].[tester] Step
DTSRun OnStart: Copy Data from Results to [GE].[dbo].[tester] Step
DTSRun OnError: Copy Data from Results to [GE].[dbo].[tester] Step, Error = -2147467259 (80004005)
Error string: The Microsoft Jet database engine cannot open the file 'H:\Book1.xls'. It is already opened exclusively by another user, or you need permission to view its data.
Error source: Microsoft JET Database Engine
Help file:
Help context: 5003051

Error Detail Records:

Error: -2147467259 (80004005); Provider Error: -534709256 (E020FBF8)
Error string: The Microsoft Jet database engine cannot open the file 'H:\Book1.xls'. It is already opened exclusively by another user, or you need permission to view its data.
Error source: Microsoft JET Database Engine
Help file:
Help context: 5003051

DTSRun OnFinish: Copy Data from Results to [GE].[dbo].[tester] Step
DTSRun: Package execution complete.
NULL

How can I make it run so that I can read from the H: drive?
It works if I run the package immediately when I create it with the DTS Import/Export Wizard.|||Does anyone have an answer tothis? I get the same error when I try to call my DTS package toaccess an excel file on my network.

Friday, February 24, 2012

Importing CSV file with URL using DTS?

I need to be able to create a DTS package that imports a CSV file which is located at a URL, i.e. HTTP://www.url.com/csv/thefile.xls. I tried copying the URL and pasting it into the file location in the SQL wizard, but I got an error message.

Please help point me to anywhere that I can find some help in doing this? If there is a better way please let me know

Thanks|||I would post this in the SQL Server Integration Services forum instead of this one. Also, to get a quicker response, please detail the exact steps you took to hit the problem, as well as the exact error message you got.

Importing big data with exception.

Hi, there;

I use ASP.NET to create an SSIS package to import data from .dbf files. When I import data from a big file (216,173 KB) my package throws an exception:

An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.
".
The "input "OLE DB Destination Input" (71)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (71)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (58) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "WorkThread0" has exited with error code 0xC0209029.
Thread "SourceThread0" has exited with error code 0xC0047038.

An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "Communication link failure".
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80004005 Description: "TCP Provider: The specified network name is no longer available.
".
The "input "OLE DB Destination Input" (77)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (77)" specifies failure on error. An error occurred on the specified object of the specified component.
The ProcessInput method on component "OLE DB Destination" (64) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Thread "WorkThread0" has exited with error code 0xC0209029.
The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Thread "SourceThread0" has exited with error code 0xC0047038

Does anybody know what the problem is?

Thanks.

Sounds like the connection to your destination database timed out and was closed. Most likely the SQL Server instance severed the connection. Are you performing a large sort or an aggregate in the data flow? You might first try going to a raw file instead of the database table and then create a second data flow that uses the raw file as the source straight into the database table.|||Thanks.

My structure is: my data source files (*.dbf files) are on another machine, and some files are very big (216,173 KB, 283,845 KB, and some even bigger as time goes on). I want to import this data into my new SQL 2005 server, which is empty right now. I create a table before importing the data (that works now). In my source component I set oSrcInstance.SetComponentProperty("SqlCommand", "SELECT * FROM [" + sourceTable + "]"); there is no aggregate operation between source and destination, just a simple pump of all data from source to destination. I can do this with SQL 2000 DTS (in the same ASP.NET project).
It is as simple as Source (*.dbf) --> Destination (SQL 2005).

Thanks.
|||

Are you creating the new table through an Execute SQL task? Is it using the same connection manager as your data flow task? I'm trying to verify that the package is connecting successfully. Also, how long does the package run before you receive this error? Does it occur if you try a smaller file?

You might try setting the OLE DB Destination data access mode property to one of the fast load options, and specify Rows Per Batch as 10,000 or 20,000, to see if you get different results.

|||

No, I don't use an Execute SQL task to create the table; I just use a SqlConnection to create it and then close and dispose the connection.

Actually, I am running the package right now. It looks like it is running slower and slower. I can see the row count increasing in the SQL 2005 database.

I am using OLE DB Destination:

oDestInstace.SetComponentProperty("CommandTimeout", 0);
oDestInstace.SetComponentProperty("OpenRowset", "[dbo].[" + destinationTable + "]");
oDestInstace.SetComponentProperty("OpenRowsetVariable", null);
// oDestInstace.SetComponentProperty("SqlCommand", null);
oDestInstace.SetComponentProperty("DefaultCodePage", 1252);
oDestInstace.SetComponentProperty("AlwaysUseDefaultCodePage", false);
oDestInstace.SetComponentProperty("AccessMode", 0);
oDestInstace.SetComponentProperty("FastLoadKeepIdentity", false);
oDestInstace.SetComponentProperty("FastLoadKeepNulls", false);
// oDestInstace.SetComponentProperty("FastLoadOptions", null);
oDestInstace.SetComponentProperty("FastLoadMaxInsertCommitSize", 0);

The exception happens after the package has run for 30 minutes or more. It doesn't occur with small files.

Thanks

|||

Try setting FastLoadOptions to TABLOCK,CHECK_CONSTRAINTS,ROWS_PER_BATCH = 1000 and FastLoadMaxInsertCommitSize to 1000.
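In terms of the code posted earlier, that suggestion would look something like the sketch below (reusing the poster's own variable name; note that the FastLoad* properties only take effect once AccessMode is switched to a fast-load mode - 3, "OpenRowset using FastLoad", rather than the 0 shown above):

// sketch only: switch the destination to fast load and commit in batches
oDestInstace.SetComponentProperty("AccessMode", 3); // OpenRowset using FastLoad
oDestInstace.SetComponentProperty("FastLoadOptions", "TABLOCK,CHECK_CONSTRAINTS,ROWS_PER_BATCH = 1000");
oDestInstace.SetComponentProperty("FastLoadMaxInsertCommitSize", 1000); // commit every 1,000 rows

With FastLoadMaxInsertCommitSize left at 0 the whole load commits as a single batch at the very end, which is a long time to hold a connection open on a 200 MB file; committing every 1,000 rows keeps each batch short.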

|||

Yes, you are right. I did that and it works perfectly, apart from FastLoadMaxInsertCommitSize = 0.

Thanks.

Importing an Excel Spreadsheet into SQL Server

Hi,
I'm trying to import an Excel file into SQL Server (using an insert statement). I'm creating a DTS package (in Enterprise Manager) and have a VB script. When I parse it I get no errors, but when I run the package it says that it ran successfully and nothing happens; it doesn't insert into the table, even though I tested the insert statement. Can anyone help me? Here's the code:

'**********************************************************************
' Visual Basic ActiveX Script
'**********************************************************************

Function Main()
on error resume next
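' (On Error Resume Next suppresses every runtime error from here on)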
Set objxl = CreateObject("Excel.Application")
objxl.Visible = False


Dim xlFile
xlFile = "C:\Data\file.xls"
Set objWkb = objxl.Workbooks.Open(xlFile)

' Connecting to SQL Server
Set cn = CreateObject("ADODB.Connection")

Dim serverName
serverName = "myserver2"


strCS = "Provider=SQLOLEDB; Data Source=myserver2;Initial
Catalog=mycat; Integrated Security=SSPI"

cn.ConnectionString = strCS
On Error Resume Next
cn.Open
Set objsht = objWkb.Worksheets.Open("Sheet1")
Dim client_name, rb, date_rvd, LOB
Dim sql
Dim row, sequence
row = 2
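' walk down the sheet from row 2 until the first blank client name, building and running one INSERT per row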

client_name = Trim(objsht.Cells(row, 2).Value)
Do While IsNull(client_name) = False And client_name <> ""
'client_name = Trim(objsht.Cells(row, 2))
rb = Trim(objsht.Cells(row, 4).value)
date_rvd = Trim(objsht.Cells(row, 6).value)
LOB = "WCS"

sql = "INSERT INTO TEMP_TEST (CLIENT_NAME, RB, DATE_REVIEWED, LOB) VALUES (" & _
" '" & client_name & "' ,'" & rb & "', " & date_rvd & ", '" & LOB & "');"


row = row + 1
MsgBox (sql)
client_name = Trim(objsht.Cells(row, 2).value)
cn.Execute (sql)

Loop

if err.count = 0 then
Main = DTSTaskExecResult_Success
else
Main = DTSTaskExecResult_Failure
end if
End Function|||You might want to refer to this:
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=49926
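The approach in that link boils down to querying the worksheet as a table straight from T-SQL. A rough sketch (assuming the Jet provider; the path and sheet name come from the code above, the bracketed column names are placeholder header names, and on SQL Server 2005 'Ad Hoc Distributed Queries' may need to be enabled first):

-- set-based import: the worksheet is just another table
INSERT INTO TEMP_TEST (CLIENT_NAME, RB, DATE_REVIEWED, LOB)
SELECT [client_name], [rb], [date_rvd], 'WCS'
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Data\file.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]')
WHERE [client_name] IS NOT NULL AND [client_name] <> ''   -- pick columns and skip rows here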

Please can you post the value of the sql variable? To us this is several magnitudes more important than all that VB.|||Why exactly are you using this script? It appears to be a very inefficient method to import the data (you're doing it line by line!).|||Actually - I have read the code now. I would dump the On Error Resume Next when debugging. Probably in production too, but especially now.

Also - the loop is unnecessary - refer to the link I sent. SQL Server is perfectly capable of dealing with an Excel sheet as a table rather than a row at a time. Quicker and easier. As such you can skip all the ActiveX stuff altogether. DTS & ActiveX -> difficult-to-debug code. If possible, try to do it all in T-SQL and run it from a job. You will find most of the guys here won't use ActiveX for very much, if anything at all, and work with the native T-SQL language.|||Why exactly are you using this script? It appears to be a very inefficient method to import the data (you're doing it line by line!). Damn youngsters and their quick typing skills ;)|||Reminds me of some ActiveX in a DTS my old manager wrote...

It went through every row (46K) and assigned zeroes or blank strings to every NULL value... Needless to say it took aaaaaaaages.

A couple of very tiny T-SQL tweaks and it runs in seconds again ;) No prizes for guessing the function anyhow|||Thanks for the quick response, the sql variable = INSERT INTO TEMP_TEST (CLIENT_NAME, RB, DATE_REVIEWED, LOB) VALUES
( 'Adisseo USA Inc.' ,'Wesley Kent', 2/27/2007, 'WCS');

The reason I'm reading line by line is because I only want certain columns having certain values; it doesn't show that in this code, but eventually I will need to read line by line. Thanks!|||i only want certain columns having certain values
Can you expand on this please?
Rules/logic etc|||How do you say it?|||Why not load all of it into a temporary table, extract what you want from it, and delete the temp table again?|||The rule is: read specific columns (not all), and then insert the values into a temp table. There are four tabs that need to be imported, all with different columns to import. But it needs to be in a DTS package. I guess loading it all would be another option|||Why not load all of it into a temporary table, extract what you want from it, and delete the temp table again -- with a DTS package?|||I take it you did not bother looking at the link then? You can specify columns & where clauses using that technique.|||Gotta ask - what is everyone's obsession with using DTS packages? Why stipulate that the answer, whatever it is, must be in a DTS package?

In any event - you can call T-SQL from a DTS package too.|||I got it: the problem was with the Excel object "Trim(objsht.Cells(row, 2).Value)";
instead I used Trim(objxl.Cells(row, 2).Value). Thanks for all your suggestions, guys!|||I did look at the link, pootle, but in some cases I would need to skip lines, and I didn't know how to do that with that link|||Gotta ask - what is everyone's obsession with using DTS packages? ...

ummm... because I haven't learned the other way(s?) yet.
Besides, the graphical interface makes it so easy to follow job flow.

Sunday, February 19, 2012

Importing a SQL File from the Internet...

Hello...

I am trying to create a small SSIS package that should do the following things:

1. Fetch a file from a webserver (later, several webservers)

2. Parse the file so the format is "more acceptable"

3. "Normalize" the Data from the file into 3 Tables for each row (Each row contais a "Unit" , a "Location" and a "Owner" - For each of these Items i have to check for "existance" and if they do exist i need to update IF the Values changed). Also if a row is NOT present i need to remove it from my DB...

So, since I am stuck on step one... Is there a "premade" task that will allow me to fetch this/these files? I know that I could just write a script block to fetch these files, but since there is an FTP task I thought there would also be an HTTP one? Also, some suggestions about the best way to "normalize" the data would be welcome...

[edit]

I just found the HTTP Task in the wishlist ;)

Ash Sharma has described a method for doing this: http://sqljunkies.com/WebLog/ashvinis/archive/2005/05/25/15653.aspx
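If you do end up writing that script block yourself, the download step is only a few lines. A minimal sketch, shown in C# for brevity (an SSIS 2005 Script Task would make the same System.Net call from VB.NET; the URL and local path below are placeholders):

using System.Net;

public static class FileFetcher
{
    // downloads one file over HTTP; DownloadFile blocks until the
    // whole file is on disk, so a following task can safely read it
    public static void Fetch(string url, string localPath)
    {
        using (WebClient client = new WebClient())
        {
            client.DownloadFile(url, localPath);
        }
    }
}

// e.g. FileFetcher.Fetch("http://www.example.com/data.csv", @"C:\Temp\data.csv");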

Parsing the data would best be done with a script task/component or a custom task/component. Donald Farmer has an example of importing a semi-structured file in his book (http://www.amazon.co.uk/exec/obidos/ASIN/1932577211/qid=1139167425/sr=8-1/ref=sr_8_xs_ap_i1_xgl/203-1655415-8887134) which you will find useful.
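On the normalizing step: SQL Server 2005 has no MERGE statement, so the usual pattern is to land the parsed rows in a staging table and run an update, an insert and a delete against each of the three targets. A rough T-SQL sketch for one target (Staging, Unit, UnitCode and Name are all hypothetical names):

-- update rows whose values changed
UPDATE u
SET    u.Name = s.Name
FROM   dbo.Unit u
JOIN   dbo.Staging s ON s.UnitCode = u.UnitCode
WHERE  u.Name <> s.Name;

-- insert rows that do not exist yet
INSERT INTO dbo.Unit (UnitCode, Name)
SELECT s.UnitCode, s.Name
FROM   dbo.Staging s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Unit u WHERE u.UnitCode = s.UnitCode);

-- remove rows that are no longer present in the file
DELETE u
FROM   dbo.Unit u
WHERE  NOT EXISTS (SELECT 1 FROM dbo.Staging s WHERE s.UnitCode = u.UnitCode);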

-Jamie