Friday, March 30, 2012
Importing an XML document into multiple related tables with identity columns.
I am new to XML and looking for some information and suggestions. I need to
import XML documents into related tables that have identity columns; could
anyone let me know the best way to do this? Detailed information:
Suppose I have 3 XML documents called a.xml, b.xml and c.xml. I need to
bring the information from these XML documents into 3 different tables which
are related by an identity key (a primary identity key).
Table A: has an identity key (A1), which is also the primary key, plus other
fields.
Table B: has an identity key (B1), which is the primary key, column A1 which
is a FK, and other fields.
Table C: has an identity key (C1), which is the primary key, column A1 which
is a FK, and other fields.
Document a.xml contains the information (records) for table A, b.xml
contains the records for table B and c.xml for table C.
First I would like to bring the information from a.xml into table A,
without an identity key in the XML; it will be generated by SQL Server
(this is only one row of data), and the identity generated will be the max
identity. I would then like to bring this max identity (the identity value
just generated) along with the other information from b.xml into table B,
and similarly for table C.
Could anyone let me know the best way to solve this problem?
Thanks in advance.
Indra.
Can you post DDL (CREATE TABLE statements) for your tables, an example of
the XML document you're trying to import, and a list of which fields from
the XML correspond to which columns in the tables?
Adam Machanic
SQL Server MVP
http://www.sqljunkies.com/weblog/amachanic
--
<itimilsina@.savannaenergy.com> wrote in message
news:1105992192.804216.302410@.z14g2000cwz.googlegroups.com...
> [original question quoted in full - snipped]
|||You can create a stored procedure which takes these XMLs as ntext parameters.
Load the XMLs into XML documents using sp_xml_preparedocument to read the XML
contents.
You can check how to use this extended stored procedure here:
http://msdn.microsoft.com/library/de...p_xml_267o.asp
First prepare the insert statement to insert into table A. Get the identity
value with SELECT SCOPE_IDENTITY().
Use this to insert into table B and table C by preparing the insert
statements using XML B and C.
You will have to loop through the records. The best way would be to insert
all records into table A, then write the statements that insert into
tables B and C. It can be done with just 3 insert statements, but you
need to map the table A records to B and C.
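For illustration only, here is a minimal sketch of the single-row flow described above, with made-up table, column and parameter names; as the follow-up below points out, SCOPE_IDENTITY() only covers the case where a.xml produces exactly one row in table A.

-- Assumes a hypothetical TableA(A1 int identity primary key, OtherField varchar(50)).
DECLARE @xmlA varchar(8000), @hdocA int, @newA1 int
SET @xmlA = '<root><rowA OtherField="some value"/></root>'   -- a.xml passed in as text

EXEC sp_xml_preparedocument @hdocA OUTPUT, @xmlA

-- Insert the single row for table A; the identity value A1 is generated here.
INSERT INTO TableA (OtherField)
SELECT OtherField
FROM OPENXML(@hdocA, '/root/rowA')
WITH (OtherField varchar(50) '@OtherField')

SET @newA1 = SCOPE_IDENTITY()        -- the A1 value just generated

EXEC sp_xml_removedocument @hdocA

-- Repeat the OPENXML step for b.xml and c.xml, supplying @newA1 as the FK, e.g.:
-- INSERT INTO TableB (A1, OtherField) SELECT @newA1, OtherField FROM OPENXML(...) WITH (...)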
--
Av.
http://dotnetjunkies.com/WebLog/avnrao
http://www28.brinkster.com/avdotnet
<itimilsina@.savannaenergy.com> wrote in message
news:1105992192.804216.302410@.z14g2000cwz.googlegroups.com...
> [original question quoted in full - snipped]
|||"avnrao" <avn@.newsgroups.com> wrote in message
news:uTIecHT$EHA.612@.TK2MSFTNGP09.phx.gbl...
> First prepare the insert statement to Insert Into Table A. Get identity
> column with Select Scope_Identity()
> use this to insert into Table B and Table C by preparing the insert
> statements using XML B and C.
> you will have to loop through the records. The best way would be to Insert
> all records into Table A. Then write statements insert into
> TabeB and C. it can be done simply with just 3 insert statements. but you
> need to map tTable A records with B and C.
If there are multiple rows inserted, SCOPE_IDENTITY() will not do the
job. And there is no reason to do a loop of any kind. The entire thing
should be able to be done in a set-based manner.|||Hit send too soon on the last message.
Something like this... no loops necessary:
use tempdb
go
create table abc(somedata char(10), abcID int identity(1,1) primary key)
go
create table def(somedata char(10), abcID int references abc(abcid))
go
declare @xmldoc varchar(8000)
set @xmldoc =
'<anode>
<another somedata="something">
<blah someother="else"/>
</another>
<another somedata="something2">
<blah someother="else2"/>
</another>
</anode>'
DECLARE @hdoc int
EXEC sp_xml_preparedocument @hdoc OUTPUT, @xmldoc
-- parent rows: shred the "another" elements and let the identity be generated
insert abc (somedata)
SELECT somedata
FROM OPENXML (@hdoc, '/anode/another')
with (somedata char(10) '@somedata')
-- child rows: shred again and join back on the natural key (somedata)
-- to pick up the identity value that was just generated
insert def (somedata, abcid)
SELECT x.someother, abc.abcid
FROM
(SELECT somedata, someother
FROM OPENXML (@hdoc, '/anode/another')
with (
somedata char(10) '@somedata',
someother char(10) 'blah/@someother')) x (somedata, someother)
JOIN abc ON abc.somedata = x.somedata
EXEC sp_xml_removedocument @hdoc
go
select *
from abc
go
select *
from def
go
drop table def
drop table abc
Adam Machanic
SQL Server MVP
http://www.sqljunkies.com/weblog/amachanic
Monday, March 26, 2012
Importing Table/Column Comments for a Data Dictionary
I need to create a Data Dictionary with the following contents:
Table Name
Column Name
Data Type
Comments for each Table
Comment for each column for each table
The first three are easy by using information_schema.columns. However,
there are no comments in our database right now.
Is there an easier way to import comments to the database besides using
sp_addextendedproperty? I do have the Erwin tool at my disposal to use.
Are there any other third-party tools that might help here?
The issue here is that I have over 500 tables and over 2500 columns to
add comments for and I am looking for the easiest and fastest way to
do this.
Thanks in advance for any help.
Larry
For extended properties, you may want to take a look at AGS
SQL Server Extended Property Editor.
http://www.ag-software.com/ags_SSEPE_index.aspx
-Sue
On 26 Mar 2004 11:33:09 -0800, lschwei@.oaot.com (Larry)
wrote:
>[original question quoted above - snipped]
>Larry
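One way to avoid typing 2,500 sp_addextendedproperty calls by hand is to stage the comments in a table and generate the calls from it. A rough sketch, with an assumed staging table and the conventional MS_Description property name (on SQL Server 2000 the level-0 type is 'user'; on 2005 and later it is 'schema'):

-- Assumed staging table: one row per table or column comment.
CREATE TABLE dbo.ColumnComments (
    TableName  sysname NOT NULL,
    ColumnName sysname NULL,               -- NULL means the comment is for the table itself
    Comment    varchar(7500) NOT NULL
)

-- Generate one EXEC per row; copy the result set into a query window and run it.
SELECT 'EXEC sp_addextendedproperty ''MS_Description'', '''
       + REPLACE(Comment, '''', '''''') + ''', ''user'', ''dbo'', ''table'', '''
       + TableName + ''''
       + CASE WHEN ColumnName IS NULL THEN ''
              ELSE ', ''column'', ''' + ColumnName + '''' END
FROM dbo.ColumnComments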
Importing Table Data Into an XML Column
I've been reading through BOL and various forums and am even more confused now as to what the best option is for what I need to accomplish...here's the scenario.
We have a staging table comprised of the following columns:
CN_CUST_KEY
ITM_SUF_NO
Model
Serial
SaleYear
SaleDate
prd_itm_no
LastName
MiddleName
FirstName
StreetAddress
...etc.
We need to load the data into a table with a similar structure, except that the first seven columns will be inserted into a column called CustomColumns, which has an XML datatype. The other columns will be inserted into their correlating columns.
I've already created the XSD and bound it to the XML column in my table:
Code Snippet
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<xsd:element name="CustomColumns">
<xsd:complexType>
<xsd:complexContent>
<xsd:restriction base="xsd:anyType">
<xsd:sequence>
<xsd:element name="CN_CUST_KEY">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="ITM_SUF_NO">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="Model">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="Serial">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="SaleYear">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="SaleDate">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="prd_itm_no">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
</xsd:sequence>
</xsd:restriction>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
</xsd:schema>
So at the end of the day, the contents of any one row's CustomColumns column would resemble:
Code Snippet
<CustomColumns>
<Lender></Lender>
<Dealership></Dealership>
<PayoffDate></PayoffDate>
<ConsentSigFlag></ConsentSigFlag>
<ConsentDateFlag></ConsentDateFlag>
<ConsentLenderFlag></ConsentLenderFlag>
<ContactCRAFalg></ContactCRAFalg>
<OwnerSigFlag></OwnerSigFlag>
<OwnerCompanyFlag></OwnerCompanyFlag>
<CertificateNo>QA095407</CertificateNo>
<AgentName>COLONIAL HARLEY DAVIDSON</AgentName>
<InsEFfDate>07/08/2006</InsEFfDate>
<HouseHoldClmQty>0</HouseHoldClmQty>
<HouseHoldClmSeq>0</HouseHoldClmSeq>
</CustomColumns>
...except of course with the appropriate column names.
One of the requirements of the process is that we're able to pop it into an SSIS package for repeatable execution and modification whenever there is a new set of data points.
If anyone can provide any guidance, it would be greatly appreciated.
Thanks!!
I think I'm close....
Here's the query I ran:
Code Snippet
INSERT INTO [clmnt]
(
name1,
addr1,
addr2,
city,
[state],
zip5,
zip4,
CustomColumns)
SELECT
LastName,
StreetAddress1,
StreetAddress2,
City,
[State],
Postal5,
Postal4,
(SELECT CN_CUST_KEY,
ITM_SUF_NO,
Model,
Serial,
SaleYear,
SaleDate,
prd_itm_no
FROM TblClass_Customers FOR XML PATH, ELEMENTS, ROOT('CustomColumns'), TYPE)
FROM TblClass_Customers;
...and here are the results:
Msg 6965, Level 16, State 1, Line 1
XML Validation: Invalid content. Expected element(s):CN_CUST_KEY where element 'row' was specified. Location: /*:CustomColumns[1]/*:row[1]
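The "Importing Data Into XML Column - UPDATED" thread further down this page resolves exactly this error by correlating the inner subquery to the outer row and naming the PATH element, so each row gets its own CustomColumns fragment instead of a whole rowset wrapped in <row> elements. A sketch along those lines (assuming CN_CUST_KEY identifies a source row):

INSERT INTO [clmnt]
    (name1, addr1, addr2, city, [state], zip5, zip4, CustomColumns)
SELECT
    t2.LastName, t2.StreetAddress1, t2.StreetAddress2, t2.City, t2.[State],
    t2.Postal5, t2.Postal4,
    (SELECT t1.CN_CUST_KEY, t1.ITM_SUF_NO, t1.Model, t1.Serial,
            t1.SaleYear, t1.SaleDate, t1.prd_itm_no
     FROM TblClass_Customers t1
     WHERE t1.CN_CUST_KEY = t2.CN_CUST_KEY              -- correlate: one fragment per row
     FOR XML PATH('CustomColumns'), ELEMENTS, TYPE)     -- no default <row> wrapper, no ROOT
FROM TblClass_Customers t2;

Note that the bound schema also expects a Serial element even when the value is NULL; the XSINIL discussion in the later thread covers that part.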
Friday, March 23, 2012
Importing multiple Excel files.
Hello, I have a SQL Server 2000 database that I want to load with Excel files. All the Excel files are in one folder and have the same column names.
What is the best way to get them into a table in the database? While looping through the folder, can you just do a bulk insert on the Excel files?
"Don" <Don@.discussions.microsoft.com> wrote in message
news:510DB3BC-CAFD-42BC-8262-9D43B577A82D@.microsoft.com...
> Hello, I have a SQL Server 2000 database that I want to load with Excel
> files. All the Excel files are in one folder and have the same column
> names.
> What is the best way to get them into a table in the database? While
> looping though the folder can you just do a bulk insert on the excel
> files?
Use DTS. The following example uses a text file but it could equally be
Excel:
http://www.sqldts.com/default.aspx?246
David Portas, SQL Server MVP
Whenever possible please post enough code to reproduce your problem.
Including CREATE TABLE and INSERT statements usually helps.
State what version of SQL Server you are using and specify the content
of any error messages.
SQL Server Books Online:
http://msdn2.microsoft.com/library/ms130214(en-US,SQL.90).aspx
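If you want to stay in T-SQL rather than DTS, one workbook at a time can also be pulled in with OPENROWSET and the Jet provider; a rough sketch only (file path, sheet name and target table are assumptions), with the loop over the folder still left to DTS or dynamic SQL:

-- Load one workbook's first sheet into a staging table (SQL Server 2000, Jet 4.0 provider).
INSERT INTO dbo.ExcelStaging
SELECT *
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Import\Book1.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]')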
Wednesday, March 21, 2012
Importing from xls that has all data in first column
I have a spreadsheet that has all of the data in the first column. The data is delimited by asterisks and there are no column headings. My goal is to import the data into a SQL table. I get the error message "External table is not in the expected format" when I try to select the name of the Excel sheet in the Excel Source Editor. I tried reading using a flat file connection instead of Excel connection, but that didn't work.
In Excel, I can use the Text to Columns feature to convert the data to columns in the spreadsheet. But I have to go into the file manually to do that. Is there a way in SSIS to do the same thing automatically? Or is there a way to save the XLS as a TXT file? Then, I could just use Flat File Connection to read the delimited data.
Thanks.
cpat
cpat wrote:
I have a spreadsheet that has all of the data in the first column. The data is delimited by asterisks and there are no column headings. My goal is to import the data into a SQL table. I get the error message "External table is not in the expected format" when I try to select the name of the Excel sheet in the Excel Source Editor. I tried reading using a flat file connection instead of Excel connection, but that didn't work.
In Excel, I can use the Text to Columns feature to convert the data to columns in the spreadsheet. But I have to go into the file manually to do that. Is there a way in SSIS to do the same thing automatically? Or is there a way to save the XLS as a TXT file? Then, I could just use Flat File Connection to read the delimited data.
Thanks.
cpat
You should read the data in as a single column and then use the Derived Column component to split that column up into all the required columns.
-Jamie
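Not the Derived Column approach Jamie describes, but if the single asterisk-delimited column is first staged in SQL Server, the same split can be sketched in T-SQL for a known, small number of fields (table, column and field names are assumptions):

-- Assumed staging table holding the spreadsheet's single column.
CREATE TABLE dbo.RawRows (RawLine varchar(1000))

-- Split 'a*b*c' into three columns using the positions of the first two asterisks.
SELECT
    LEFT(RawLine, CHARINDEX('*', RawLine) - 1)                              AS Field1,
    SUBSTRING(RawLine,
              CHARINDEX('*', RawLine) + 1,
              CHARINDEX('*', RawLine, CHARINDEX('*', RawLine) + 1)
                - CHARINDEX('*', RawLine) - 1)                              AS Field2,
    SUBSTRING(RawLine,
              CHARINDEX('*', RawLine, CHARINDEX('*', RawLine) + 1) + 1,
              LEN(RawLine))                                                 AS Field3
FROM dbo.RawRows
WHERE RawLine LIKE '%*%*%'          -- expects at least two delimiters per row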
Importing from Excel problems
What do others do if you need to import Excel files into SQL Server?
My main problems are:
1) zipcode formatting issues. If the column is a mix of zip and zip+4,
I have problems retrieving all zipcodes.
2) If the last column contains NULL no information is imported.
All this with using the Management console using Import data in SQL
Server 2005. I am simply trying to import the data into NEW databases.
The excel files vary in structure. Right now I am working on case by
case basis.
Does anyone see these types of problems?
What I am doing now is converting the excel file to a tab delimited
file and that seems to work.
TIA.
On Dec 11, 12:32 pm, scoots987 <scoots...@.gmail.com> wrote:
Quote:
Originally Posted by
[original question quoted above - snipped]
Hi scoots987,
I usually use a dummy row in my excel files to force the correct data
types and column mappings (typically I import everything as text and
convert it downstream). One of the big problems with importing data
from an Excel file is that SQL Server (well... I think it's
actually the OLEDB driver) only looks at the first 8 rows of data to
determine what data types to use. To force it to look at more rows,
you need to change a couple registry settings, which in my experience
is usually off-limits in a managed production environment (check out
http://blog.lab49.com/?p=196 for info .. it's not a SQL blog, but it
explains the issue well).
Good luck!
J
|||I've recently had great experience importing Excel into the DB using
SSIS (SQL Server Integration Services) and the OLE DB Excel Connection.
It has transforms and all sorts of goodies to make the import easy and
complete.
HTH
aj
scoots987 wrote:
Quote:
Originally Posted by
[original question quoted above - snipped]
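To make the "import everything as text and convert it downstream" suggestion above concrete for the ZIP problem, here is a hedged T-SQL sketch; the staging table name and the assumption that the raw values look like '12345', '12345-6789' or '123456789' are mine:

SELECT
    LEFT(ZipRaw, 5) AS Zip5,                                  -- first five digits in every format
    CASE WHEN LEN(REPLACE(ZipRaw, '-', '')) >= 9
         THEN RIGHT(REPLACE(ZipRaw, '-', ''), 4)              -- plus-four part when present
         ELSE NULL
    END AS Zip4
FROM dbo.ExcelStaging
WHERE ZipRaw IS NOT NULL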
Importing from Excel
I have read Allan Mitchell's article about using IMEX=1 (http://www.sqldts.com/default.aspx?254). I added IMEX=1 and modified the registry setting, but when I do that NO data at all is imported.
Any ideas? My code is very long, so I'll only post it if you'd really like to see it.
Much thanks in advance!
Bill
I would export the Excel file into a delimited file - the MS driver for Excel causes too many headaches. Also, did you modify the TypeGuessRows setting in your registry?|||Originally posted by rnealejr
I would export the excel file into a delimited file - the ms driver for excel causes too many headaches. Also, did you modify the TypeGuessRows setting in your registry ?
I did modify the TypeGuessRows setting to 0. Is there a way to programmatically export the Excel file to a CSV file?
Thanks,
Bill|||It has been a while since I used the TypeGuessRows - but I believe setting it to 0 will not work - you have to explicitly supply a value (something big enough to sample your data).
Programmatically - Do you mean from within sql server or any executable ?|||Originally posted by rnealejr
It has been a while since I used the TypeGuessRows - but I believe setting it to 0 will not work - you have to explicitly supply a value (something big enough to sample your data).
Programmatically - Do you mean from within sql server or any executable ?
I read somewhere to set it to 0 and use IMEX=1. I just tried setting it to 16 and 8, but neither of those worked either.
Actually, when I said programmatically I meant VBScript, but I guess that's for the VBScript forum!
Thanks,
Bill|||You can create an excel object - open the workbook - use the saveas method for the workbook.|||Originally posted by rnealejr
You can create an excel object - open the workbook - use the saveas method for the workbook.
Where can I find information on using the excel object's saveas function? I'm sure there are a number of options, and I'd like to take a look at them.
Thanks!
Bill|||Here is the snapshot from the help (Also, look for a vbaxl9.chm file on your computer):
Saves changes to the sheet (Syntax 1) or workbook (Syntax 2) in a different file.
Syntax 1
expression.SaveAs(Filename, FileFormat, Password, WriteResPassword, ReadOnlyRecommended, CreateBackup, AddToMru, TextCodePage, TextVisualLayout)
Syntax 2
expression.SaveAs(Filename, FileFormat, Password, WriteResPassword, ReadOnlyRecommended, CreateBackup, AccessMode, ConflictResolution, AddToMru, TextCodePage, TextVisualLayout)
expression Required. An expression that returns a Chart or Worksheet object (Syntax 1) or a Workbook object (Syntax 2).
Filename Optional Variant. A string that indicates the name of the file to be saved. You can include a full path; if you don't, Microsoft Excel saves the file in the current folder.
FileFormat Optional Variant. The file format to use when you save the file. For a list of valid choices, see the FileFormat property. For an existing file, the default format is the last file format specified; for a new file, the default is the format of the version of Excel being used.
Password Optional Variant. A case-sensitive string (no more than 15 characters) that indicates the protection password to be given to the file.
WriteResPassword Optional Variant. A string that indicates the write-reservation password for this file. If a file is saved with the password and the password isn't supplied when the file is opened, the file is opened as read-only.
ReadOnlyRecommended Optional Variant. True to display a message when the file is opened, recommending that the file be opened as read-only.
CreateBackup Optional Variant. True to create a backup file.
AccessMode Optional Variant. The workbook access mode. Can be one of the following XlSaveAsAccessMode constants: xlShared (shared list), xlExclusive (exclusive mode), or xlNoChange (don't change the access mode). If this argument is omitted, the access mode isn't changed. This argument is ignored if you save a shared list without changing the file name. To change the access mode, use the ExclusiveAccess method.
ConflictResolution Optional Variant. Specifies the way change conflicts are resolved if the workbook is a shared list. Can be one of the following XlSaveConflictResolution constants: xlUserResolution (display the conflict-resolution dialog box), xlLocalSessionChanges (automatically accept the local user's changes), or xlOtherSessionChanges (accept other changes instead of the local user's changes). If this argument is omitted, the conflict-resolution dialog box is displayed.
AddToMru Optional Variant. True to add this workbook to the list of recently used files. The default value is False.
TextCodePage Optional Variant. Not used in U.S. English Microsoft Excel.
TextVisualLayout Optional Variant. Not used in U.S. English Microsoft Excel.
The following are the available file formats:
xlAddIn
xlCSV
xlCSVMac
xlCSVMSDOS
xlCSVWindows
xlCurrentPlatformText
xlDBF2
xlDBF3
xlDBF4
xlDIF
xlExcel2
xlExcel2FarEast
xlExcel3
xlExcel4
xlExcel4Workbook
xlExcel5
xlExcel7
xlExcel9795
xlHTML
xlIntlAddIn
xlIntlMacro
xlSYLK
xlTemplate
xlTextMac
xlTextMSDOS
xlTextPrinter
xlTextWindows
xlUnicodeText
xlWJ2WD1
xlWK1
xlWK1ALL
xlWK1FMT
xlWK3
xlWK4
xlWK3FM3
xlWKS
xlWorkbookNormal
xlWorks2FarEast
xlWQ1
xlWJ3
xlWJ3FJ3|||Awesome, thanks!
So if I do this:
<code>
set xlApp = CreateObject("excel.application")
xlApp.Workbooks.Open <filename>
xlApp.ActiveWorkbook.SaveAs <newFileName>, xlCSVWindows
xlApp.ActiveWorkbook.Close
xlApp.Quit
Set xlApp = Nothing
</code>
It would save it as a CSV file?
Thanks!
Bill|||Yes, that looks good. Let me know if there are any problems.|||Originally posted by rnealejr
Yes, that looks good. Let me know if there are any problems.
Thanks, I'm going to try to give it a test today!|||Originally posted by rnealejr
Yes, that looks good. Let me know if there are any problems.
Okay, I tried that code and I'm getting the following error message:
Microsoft Excel error '800a03ec'
SaveAs method of Workbook class failed
Any ideas?
Thanks!
Bill|||Originally posted by rnealejr
Yes, that looks good. Let me know if there are any problems.
Okay, I may have found the issue, but I'm not sure. I saved the current workbook file format value to a variable and printed it to the screen. The value was -4143. Is the fileformat value supposed to be a number or the values from the help file (i.e., xlCSV)?
Thanks for all your help!
Bill|||I do not have to vb at the moment to confirm but the xlCSV value should be and the xlWindowsCSV is 23.|||Also, sorry for the delay - your post got buried.|||Originally posted by rnealejr
I do not have to vb at the moment to confirm but the xlCSV value should be and the xlWindowsCSV is 23. \
No problem on the delay, I really appreciate the help - is there a list of these values anywhere? I've been looking, but I must not be using the right terminology.
Thanks!
Bill|||This link might be helpful:
link (http://techsupt.windowware.com/TS/T000001033005F9.html)
Importing Fixed-width txt file - problem
Hi,
I was trying to import a fixed-width file to a sql 2005 table.
The total record length is 1500. I was trying to import it to a single column.
The strange thing that's happening is: SSIS is inserting only the first 32 chars of the record and the rest are gone. I tried using nvarchar(max) and varchar(max) but to no avail.
I think something somewhere is going wrong but I was unable to figure it out. Earlier I was able to load a similar file into a single column table.
My Header row delimiter is {CR}{LF}
The preview pane shows the complete record but when it transfers to the table, I'm getting 32 chars only.
Can anybody suggest any ideas to figure this out?
Thanks,
Siva.
Your other alternative is to read in the data as binary using a script task, searching for NULLs and replacing them with spaces (or some other valid string character of your choosing). If you write this code, let me know. I need to do just this and haven't had the time to write it.
Monday, March 19, 2012
Importing Excel
Looking for a way to import Excel with multiple worksheets.
1. The table that it is going into needs to have an additional column which is automatically populated with the name of the Excel file.
2. The other table it is going into needs to have an additional column which is automatically populated with the name of the worksheet.
Help. Thanks
--
Harry
Try my column at:
http://www.sqlservercentral.com/col...ngxpexcelxmlandopenxmlfordataimports.asp
"Harry" <harry.shambaugh@.sprint.com> wrote in message
news:D6342430-B70A-4B25-9DDD-89CFBEBA482A@.microsoft.com...
> Looking for a way to import Excell with mulitple worksheets.
> 1. table that it is going into needs to have an additional column which is
> automatically populated with the name of excell file
> 2. the other table it is going to needs to have an additonal column which
is
> to be automatically populated with the name of the worksheet.
> Help. Thanks
> --
> Harry|||The link provided is dead.
I have signed up for an account and retried...same results.
Do you have the path?
--
Harry
"sloan" wrote:
> Try my column at:
> http://www.sqlservercentral.com/columnists/sholliday/leveragingxpexcelxmlandopenxmlfordataimports.asp
>
>
> "Harry" <harry.shambaugh@.sprint.com> wrote in message
> news:D6342430-B70A-4B25-9DDD-89CFBEBA482A@.microsoft.com...
> is
>
>|||The article is there - the link wrapped around the line.
http://www.sqlservercentral.com/col...dataimports.asp
If that link comes up wrong, go to http://www.sqlservercentral.com and do a
site search for: "OPENXML for Data Imports". The article is the first link
in the list of results.
"Harry" <harry.shambaugh@.sprint.com> wrote in message
news:D861EC08-9602-45F4-8962-756FAFE14DDA@.microsoft.com...
> The link provided is dead.
> I have signed up for an account and retried...same results.
> Do you have the path?
> --
> Harry
>
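For what the question above asks (the workbook name stored alongside the rows of one table, and the worksheet name alongside the rows of the other), a per-sheet OPENROWSET insert can supply both as constant columns; a sketch only, with the path, sheet, table and column names all assumed, and the loop over files and sheets left to the calling code:

INSERT INTO dbo.WorksheetData (SourceFile, SourceSheet, Col1, Col2)
SELECT
    'C:\Import\Book1.xls',          -- name of the Excel file for this load
    'Sheet1',                       -- name of the worksheet for this load
    x.Col1, x.Col2
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                'Excel 8.0;Database=C:\Import\Book1.xls;HDR=YES',
                'SELECT * FROM [Sheet1$]') AS x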
importing excel column with multiple values separated by '/'
I would like to import two columns from an Excel file into a SQL Server table as an area code - time zone lookup.
The area code column sometimes has multiple area codes in the area code cell, e.g. 207/208/209.
What is a good way to import those two columns so that 3 table rows are
created for each of those Excel rows that contain these multiple values
separated by the '/' character?
Thank you,
Greg
On Wed, 1 Mar 2006 07:32:52 -0800, hazz wrote:
>I would like to import two columns from an excel file into a sql server
>table as an area code - time zone look up.
>The area code column sometimes has multiple area codes in the area code
>cell. eg. 207/208/209.
>What is a good way to import those two columns so that 3 table rows are
>created for each of those Excel rows that contain these multiple values
>separated by the '/' character?
Hi Greg,
Some useful techniques are discussed at
http://www.sommarskog.se/arrays-in-sql.html
Hugo Kornelis, SQL Server MVP|||Thank you Hugo, I'll take a look !
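One of the techniques covered in the article Hugo links is a numbers-table split; applied to the area-code column it turns '207/208/209' into three rows. A sketch with assumed table names (dbo.Numbers holds integers 1..n, at least as long as the longest area-code string):

SELECT
    a.TimeZone,
    SUBSTRING(a.AreaCodes, n.Number,
              CHARINDEX('/', a.AreaCodes + '/', n.Number) - n.Number) AS AreaCode
FROM dbo.AreaCodeStaging AS a
JOIN dbo.Numbers AS n
  ON n.Number <= LEN(a.AreaCodes)
 AND SUBSTRING('/' + a.AreaCodes, n.Number, 1) = '/'   -- start of each code (position after a '/')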
"Hugo Kornelis" <hugo@.perFact.REMOVETHIS.info.INVALID> wrote in message
news:spdc0210aq8daomldhm1lmbpviqf7hp4gi@.4ax.com...
> On Wed, 1 Mar 2006 07:32:52 -0800, hazz wrote:
>
> Hi Greg,
> Some useful techniques are disccussed at
> http://www.sommarskog.se/arrays-in-sql.html
> --
> Hugo Kornelis, SQL Server MVP
Monday, March 12, 2012
Importing Dates using derived column transformation
Hi All
I've got a flatfile which contains a column SALEDATE with this data
20070802
''
20070804
''
20070805
20070901
I've got a table with a column SALEDATE datetime.
I use a derived column with this expression
SALEDATE <add as new column> (DT_DBDATE)(SUBSTRING(SaleDate,1,4) + "-" + SUBSTRING(SaleDate,5,2) + "-" + SUBSTRING(SaleDate,7,2)) database date [DT_DBDATE]
This expression fails whenever it comes across a blank field.
How can I evaluate my expression to insert a null if it comes across a blank field?
Thank You
SaleDate == "" || ISNULL(SaleDate) ? NULL(DT_DBDATE) : (DT_DBDATE)(SUBSTRING(SaleDate,1,4) + "-" + SUBSTRING(SaleDate,5,2) + "-" + SUBSTRING(SaleDate,7,2))|||Phil thank you for replying.
There are some records that don't even have anything for SALEDATE
20070801
''
20070802
20070901
20070902
The package fails at the row 627 when the empty field comes across.
How should i modify the expression?
|||The package doesn't fail because of the expression -- check your logs -- it likely fails because of an invalid row (missing delimiter?). SSIS expects the metadata to be the same across the whole file.|||You're right, 627 is the last row in the flatfile. It fails on that row.
I checked the file the last record contains data 20070831
We get flatfiles in a RaggedRight format.
There are no text qualifiers and Header row delimiter is set to {CR}{LF}
Here is the error log:
[Transform Columns [1]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "component "Transform Columns" (1)" failed because error code 0xC0049063 occurred, and the error row disposition on "output column "SALEDATE" (4334)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
[DTS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Transform Columns" (1) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
[DTS.Pipeline] Error: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0209029. There may be error messages posted before this with more information on why the thread has exited.
I'll try different options and see what i can come up with.
Thanks for you help.
|||Sounds like a short row and it is missing its row terminator|||I can't seem to get any rows into a table.
Still getting the same error.
I tried changing the (DT_DBDATE) to (DT_STR) but it turns red.
OnSaleDate == "" || ISNULL(OnSaleDate) ? NULL(DT_DBDATE) : (DT_DBDATE)(SUBSTRING(OnSaleDate,1,4) + "/" + SUBSTRING(OnSaleDate,5,2) + "/" + SUBSTRING(OnSaleDate,7,2))
Did anyone come across this problem?
|||Please share all of your errors.There's nothing wrong with that expression, syntactically.|||
I had the same problem and I fixed it by cleaning garbage data in the column that caused the error. E.g., I expected a number and got a "-" character.
Good luck
|||Here is how I solved it.
LEN(TRIM(OnSaleDate)) > 1 ? (DT_DBDATE)(SUBSTRING(OnSaleDate,1,4) + "-" + SUBSTRING(OnSaleDate,5,2) + "-" + SUBSTRING(OnSaleDate,7,2)) : NULL(DT_DATE)
Importing DATE with Timestamp(In a Flat file) Column using SSIS
SSIS is brand new for me... I've been playing with it for a few hours.
I am trying to import a flat file into the SQL Server DB using SSIS.
One of the columns is in this format -- "YYYYMMDDHH24MISS".
How do I get around this to import the data in a readable fashion into the destination?
Thanks!
MKR
Hi MKR,
What data type are you wanting the result to be?
You can use a derived column component to parse the format of the column and create anything you like -- a DT_DBTIMESTAMP, a string with your own format, etc...
You could turn the string in the format you have above into a string with this format: "YYYY-MM-DD HH:MM:SS" with an expression like this in derived column (where I am assuming the string is in a column called 'Col'):
SUBSTRING(Col, 1, 4) + "-" + SUBSTRING(Col, 5, 2) + "-" + SUBSTRING(Col, 7, 2) + " " + SUBSTRING(Col, 9, 2) + ":" + SUBSTRING(Col, 11, 2) + ":" + SUBSTRING(Col, 13, 2)
Is that the sort of thing you are looking for?
Thanks
Mark
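For comparison, if the column were staged in a SQL Server table as text instead of being converted inside the data flow, the same parse can be sketched in T-SQL (staging table and column name are assumptions, NULL returned for short or blank values):

SELECT CASE WHEN LEN(RTRIM(Col)) >= 14
            THEN CONVERT(datetime,
                         SUBSTRING(Col, 1, 8) + ' ' +    -- YYYYMMDD
                         SUBSTRING(Col, 9, 2) + ':' +    -- HH24
                         SUBSTRING(Col, 11, 2) + ':' +   -- MI
                         SUBSTRING(Col, 13, 2))          -- SS
            ELSE NULL
       END AS ParsedDate
FROM dbo.StagedFlatFile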
But as I was telling you earlier, my knowledge of SSIS is very limited.
Now that I know we can manipulate the string, where do I do this -- I mean, where do I add this SUBSTRING manipulation?
|||You want to add it in the data flow, using a derived column transformation.|||Thanks! Welch n Mark
Importing data to an existing database column from an .xls file or .csv file
Good morning,
I want to load data that I receive every day from my customers in .xls (Excel) or .csv file format into the database that I have created for this purpose. But when trying to do that with SSIS, I got an error message saying that I can't import redundant data into my database column.
Best regards.
Can you please post the exact message?
Friday, March 9, 2012
Importing Data Into XML Column - UPDATED
Okay...I was actually able to get fairly far in my attempts to copy data from a SQL Server table to an XML column. Here is the XSD I created:
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<xsd:element name="CustomColumns">
<xsd:complexType>
<xsd:complexContent>
<xsd:restriction base="xsd:anyType">
<xsd:sequence>
<xsd:element name="CN_CUST_KEY" nillable="true">
<xsd:simpleType>
<xsd:restriction base="xsd:float" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="ITM_SUF_NO" nillable="true">
<xsd:simpleType>
<xsd:restriction base="xsd:float" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="Model" nillable="true">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="Serial" nillable="true">
<xsd:simpleType>
<xsd:restriction base="xsd:string" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="SaleYear" nillable="true">
<xsd:simpleType>
<xsd:restriction base="xsd:int" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="SaleDate" nillable="true">
<xsd:simpleType>
<xsd:restriction base="xsd:dateTime" />
</xsd:simpleType>
</xsd:element>
<xsd:element name="prd_itm_no" nillable="true">
<xsd:simpleType>
<xsd:restriction base="xsd:int" />
</xsd:simpleType>
</xsd:element>
</xsd:sequence>
</xsd:restriction>
</xsd:complexContent>
</xsd:complexType>
</xsd:element>
</xsd:schema>
This is the SQL query I'm running against the source table:
INSERT INTO [clmnt]
(
name1,
addr1,
addr2,
city,
[state],
zip5,
zip4,
CustomColumns)
SELECT
LastName,
StreetAddress1,
StreetAddress2,
City,
[State],
Postal5,
Postal4,
(SELECT CN_CUST_KEY,
ITM_SUF_NO,
Model,
Serial,
SaleYear,
SaleDate,
prd_itm_no
FROM TblClass_Customers FOR XML PATH (''), ELEMENTS, ROOT('CustomColumns'), TYPE)
FROM TblClass_Customers;
...and here is the error I'm getting:
Msg 6965, Level 16, State 1, Line 1
XML Validation: Invalid content. Expected element(s): Serial where element 'SaleYear' was specified. Location: /*:CustomColumns[1]/*:SaleYear[1]
From what I can see, the SELECT statement matches up with the XSD I'm using...the datatypes coincide...
Kind of at a loss here...
This is what the contents of the target xml column should look like:
<CustomColumns>
<CN_CUST_KEY>XXXYYYZZZZ</CN_CUST_KEY>
<ITM_SUF_NO>45</ITM_SUF_NO>
<Model>alpha</Model>
<Serial>djdjxh3455skx</Serial>
<SaleYear>2005</SaleYear>
<SaleDate>10/25</SaleDate>
<prd_itm_no>df345f</prd_itm_no>
</CustomColumns>
One thing I did find...if I just run the portion of the query that selects the columns for my XML column, the query runs, but doesn't give the expected results. It concatenates everything into one long string...
<CustomColumns><row><CN_CUST_KEY>1000422</CN_CUST_KEY><ITM_SUF_NO>9</ITM_SUF_NO><Model>11073954200</Model><SaleYear>2003</SaleYear><SaleDate>2003-08-23 00:00:00</SaleDate><prd_itm_no>73954</prd_itm_no></row><row><CN_CUST_KEY>1000812</CN_CUST_KEY><ITM_SUF_NO>13</ITM_SUF_NO><Model>11063932101</Model><Serial>MM2610444</Serial><SaleYear>2002</SaleYear><SaleDate>2002-08-26 00:00:00</SaleDate><prd_itm_no>63932</prd_itm_no></row><row><CN_CUST_KEY>1001610</CN_CUST_KEY><ITM_SUF_NO>14</ITM_SUF_NO><Model>11062952100</Model><SaleYear>2002</SaleYear><SaleDate>2002-02-03 00:00:00</SaleDate><prd_itm_no>62952</prd_itm_no></row><row><CN_CUST_KEY>1004682</CN_CUST_KEY><ITM_SUF_NO>15</ITM_SUF_NO><Model>11072932100</Model><Serial>ML140325</Serial><SaleYear>2001</SaleYear><SaleDate>2001-04-21 00:00:00</SaleDate><prd_itm_no>72932</prd_itm_no></row><row><CN_CUST_KEY>1004867</CN_CUST_KEY><ITM_SUF_NO>12</ITM_SUF_NO><Model>11073952200</Model><SaleYear>2003</SaleYear><SaleDate>2003-04-08 00:00:00</SaleDate><prd_itm_no>73952</prd_itm_no></row><row><CN_CUST_KEY>1005117</CN_CUST_KEY><ITM_SUF_NO>7</ITM_SUF_NO><Model>11072972100</Model><SaleYear>2002</SaleYear><SaleDate>2002-02-18 00:00:00</SaleDate><prd_itm_no>72972</prd_itm_no></row><row><CN_CUST_KEY>1005320</CN_CUST_KEY><ITM_SUF_NO>27</ITM_SUF_NO><Model>11072972100</Model><SaleYear>2001</SaleYear><SaleDate>2001-08-28 -07-01 00:00:00</SaleDate><prd_itm_no>73942</prd_itm_no></row><row><CN_CUST_KEY>5806620</CN_CUST_KEY><ITM_SUF_NO>2</ITM_SUF_NO><Model>11062952100</Model><Serial>ML4334523</Serial><SaleYear>2002</SaleYear><SaleDate>2002-02-17
It also is inserting the <row> tag as opposed to actually creating an entirely new row. And lastly, the <CustomColumns> tag should encapsulate/wrap each record. When I run the select, every single row is contained within <CustomColumns>...
|||You should have a PK or unique columns in the table tblClass_Customers, right? Let's assume it's CustomerID. Try changing to this:
.....
FROM TblClass_Customers T1 WHERE T1.CustomerID=T2.CustomerID
FOR XML PATH ('CustomColumns'), ELEMENTS, TYPE)
FROM TblClass_Customers T2;
|||Thanks a million!!!
Do you think you could just real quick explain to me what it is that made your changes do the trick?
I'm learning from this whole XML experience...
Thanks again!!
|||Phe:
The SELECT alone works fine, but when coupled with the INSERT it blows up:
Code Snippet
INSERT INTO [clmnt]
(
name1,
addr1,
addr2,
city,
[state],
zip5,
zip4,
CustomColumns
)
SELECT
LastName,
StreetAddress1,
StreetAddress2,
City,
[State],
Postal5,
Postal4,
(SELECT CN_CUST_KEY,
ITM_SUF_NO,
Model,
Serial,
SaleYear,
SaleDate,
prd_itm_no
FROM TblClass_Customers2 T1 WHERE T1.CN_CUST_KEY = T2.CN_CUST_KEY
FOR XML PATH ('CustomColumns'), ELEMENTS, TYPE)
FROM TblClass_Customers2 T2
The validation error again:
Msg 6965, Level 16, State 1, Line 1
XML Validation: Invalid content. Expected element(s): Serial where element 'SaleYear' was specified. Location: /*:CustomColumns[1]/*:SaleYear[1]
I figured out what the issue is, but I'm not sure how to address it...
If a column that I'm using to generate my XML is NULL, the element is ignored...that's causing the XML to fail validation.
How would you tell the query to select the column/element even if there is no value or NULL in the source column?
I know there's the whole ELEMENTS XSINIL directive, but I don't want to get all the extra "stuff" it generates...
In your schema, you already set nillable to true for certain columns, so you can just include XSINIL after the ELEMENTS keyword in the query.
If you don't want to include the element with NULL, change the schema to include minOccurs="0" and remove the nillable attribute.
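A sketch of the first option, using the element names from this thread: keep nillable="true" in the schema and add XSINIL so that NULL columns still produce an empty, nil-flagged element that satisfies the sequence:

SELECT CN_CUST_KEY, ITM_SUF_NO, Model, Serial, SaleYear, SaleDate, prd_itm_no
FROM TblClass_Customers2
WHERE CN_CUST_KEY = 1000422        -- one source row, as in the INSERT's correlated subquery
FOR XML PATH('CustomColumns'), ELEMENTS XSINIL, TYPE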
|||
Phe:
Thanks for the reply....
I've actually tried the XSINIL keyword and saw it did bring in the NULL element. But it also brought along some other stuff - attributes and the header:
Code Snippet
<CustomColumns xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<CN_CUST_KEY>1000422</CN_CUST_KEY>
<ITM_SUF_NO>9</ITM_SUF_NO>
<Model>11073954200</Model>
<Serial xsi:nil="true" />
<SaleYear>2003</SaleYear>
<SaleDate>2003-08-23 00:00:00</SaleDate>
<prd_itm_no>73954</prd_itm_no>
</CustomColumns>
Is there any other way to accomplish this without having to have the attribute and the header there? Or is that the only way? I'm going to assume even if the attribute is there, I would just need to modify my queries to account for that particular attribute...
I've never used XQUERY before, and I've been reading through BOL, but again, it's all Greek to me. How would you formulate a basic query against the XML above? For example, I just want to select all rows where the <Serial> value is NULL (NIL)...?
Am I making sense...? I hope so!!
Thanks again...
|||You can check e.g.
Code Snippet
SELECT @x.query(
'CustomColumns[Serial/@xsi:nil = true()]'
);
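For reference, a self-contained variant of that check against an untyped xml variable, with the xsi prefix declared in the XQuery prolog (the sample value is made up):

DECLARE @x xml
SET @x = N'<CustomColumns xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
             <Serial xsi:nil="true" />
           </CustomColumns>'

SELECT @x.query('
    declare namespace xsi = "http://www.w3.org/2001/XMLSchema-instance";
    /CustomColumns[Serial/@xsi:nil = "true"]')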
|||Thanks Martin:
But all that query returned was a NULL - and I'm sure there are at least 1000 rows that have a NULL/NIL element
Last question I swear....!
So what are all the namespace declarations I see in all the examples in BOL? Are they necessary, and what purpose do they serve?
For example, I want to return all rows from the CLMNT table where the value of the <serial> element in the CustomColumns XML column is 140325. Here is the query I concocted:
Code Snippet
SELECT clmnt.query('
declare default namespace "http://schemas.microsoft.com/sqlserver/2004/07/EVEN/CustomColumns";
/CustomColumns/Serial
') as Result
FROM [CustomColumns].[Serial]
WHERE Serial = '140325'
I'm sure it's completely way off the mark, but if someone could give some insight that would be great.
Also, does anyone know of some really good very-beginner XQUERY documentation/tutorials? I've been through BOL and frankly it didn't help...
Thanks for tolerating an XML noob...