
Friday, March 30, 2012

Importing XML in SQL Server 2005 in .Net 2.0

Yes, I have a schema. I have also created the database structure from the XML
schema. The XML schema is written in such a way that it does not have annotations
suitable for SQLXML.
You mentioned loading XML into tables, but how do I update the table back
into the database (I assume you mean a table data structure in ADO.NET)?
Thanks
Neo
Hello Neo,
Okay, so let me ask this: are you simply interested in loading the XML into
the database in some batch operation, or do you both load the original data
into SQL Server and modify stored values from XML?
If this is the simple ETL (former) case, SSIS would be my tool of choice.
If you're looking to both load and update from XML, SSIS is still usable,
but the solution might look a little different.
If you're simply putting the XML into an XML-typed column, consider using
OPENROWSET(BULK) instead.
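For reference, a minimal sketch of the OPENROWSET(BULK) load (the staging table
and file path are hypothetical; SINGLE_BLOB avoids code-page conversion of the XML):
-- Hypothetical table dbo.XmlStaging with an xml column named XmlData.
INSERT INTO dbo.XmlStaging (XmlData)
SELECT CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\data\input.xml', SINGLE_BLOB) AS src;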
More details would be helpful. :)
Thanks!
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/

Importing XML for Newbies

I need to store log files that are created in XML format so I can query some
of the fields. I have only minor knowledge of XML, but have been a SQL DBA for
a while. I just have not had a need to use XML in the SQL environment. We
are using SQL 2000. What is the best way to do the following:
1. Create a DB for storing the log files? Some variation on the usual way?
2. Import the individual files into the DB. Eventually I can write a DTS or
something to import?
3. Query the DB. Create some reports on usage.
If someone could post a link to get me started it would be very helpful.
DaveK wrote:
> I need to store log files that are created in XML format so I can query some
> of the fields.
Why not just use an XQuery implementation to query them direct? I see no
requirement here to involve a database at all.
///Peter

> I have only minor knowledge of XML, but have been SQL DBA for
> a while. I just have not had a need to use XML in the SQL environment. We
> are using SQL2000. What is the best way to do the following:
> 1. Create a DB for storing the log files? Some variation on the usual way?
> 2. Import the individual files into the DB. Eventually I can write a DTS or
> something to import?
> 3. Query the DB. Create some reports on usage.|||After a five minute web search and SQL help search I don't see how I can
avoid using SQL to build some sort of database just so I can address backup,
security, etc. It looks like XPath and XQuery can do some searching, but the
other items that I need to cover are not really addressed. This is an
example of one event log item I want to store.
<CreateDate>7/31/2007</CreateDate>
<CreateTime>10:19:25</CreateTime>
<Logger>xxxxd7yrrt11</Logger>
<Events>
<Event>
<EventID>8001</EventID>
<Description>Login</Description>
<Category>Audit</Category>
<Source>LAN Client</Source>
<SubSource>xxxx_8</SubSource>
<UserName>test.name</UserName>
<UserID>347</UserID>
<Computer>xxx7YRRT11</Computer>
<Date>07/31/2007</Date>
<Time>10:19:49</Time>
<ObjectType>User</ObjectType>
<Details></Details>
</Event>
I have to admit I am not a convert to the XML world, but as I said, I am a
newbie to it as well. In this case it just seems like a fancy way to
eliminate delimited importing. The format is not likely to change.|||DaveK wrote:
> After a five minute web search and SQL help search I don't see how I can
> avoid using SQL to build some sort of datbase just so I can address backup,
> security, etc. It looks like XPath and XQuery can do some searching, but the
> other items that I need to cover are not really addressed. This is an
> example of one event log item I want to store.
> <CreateDate>7/31/2007</CreateDate>
> <CreateTime>10:19:25</CreateTime>
> <Logger>xxxxd7yrrt11</Logger>
> <Events>
> <Event>
> <EventID>8001</EventID>
> <Description>Login</Description>
> <Category>Audit</Category>
> <Source>LAN Client</Source>
> <SubSource>xxxx_8</SubSource>
> <UserName>test.name</UserName>
> <UserID>347</UserID>
> <Computer>xxx7YRRT11</Computer>
> <Date>07/31/2007</Date>
> <Time>10:19:49</Time>
> <ObjectType>User</ObjectType>
> <Details></Details>
> </Event>
> I have to admit I am not a convert to the XML world, but as I said, I am a
> newbie to it as well. In this case it just seems like a fancy way to
> eliminate delimited importing. The format is not likely to change.
AFAIK all database systems now offer some kind of "Import XML" plugin.
If yours doesn't, you'll need to turn the XML into CSV or whatever your
system consumes. The easiest way to do this is to write an XSLT script,
and I think there are several quoted or linked in Dave Pawson's XSL FAQ
at http://www.dpawson.co.uk/xsl/
XML is just a fancy way of identifying information: you can see from the
above example that it is much clearer in naming items and positioning
them in the hierarchy than (for example) CSV. If the format is stable,
then a little routine to run XSLT over the data and spit out CSV should
do you just fine.
The following appears to work for the sample above (with the addition of
the missing </Events> end-tag and the enclosing root element
<data>...</data> ):
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                version="1.0">
  <xsl:output method="text"/>
  <xsl:strip-space elements="*"/>
  <xsl:template match="/">
    <xsl:apply-templates select="data/CreateDate"/>
    <xsl:apply-templates select="data/CreateTime"/>
    <xsl:apply-templates select="data/Logger"/>
    <xsl:apply-templates select="data/Events/Event/*"/>
  </xsl:template>
  <!-- fields that start a record -->
  <xsl:template match="CreateDate|EventID">
    <xsl:text>"</xsl:text>
    <xsl:value-of select="."/>
  </xsl:template>
  <!-- fields that occur in mid-record -->
  <xsl:template match="*">
    <xsl:text>,"</xsl:text>
    <xsl:value-of select="."/>
    <xsl:text>"</xsl:text>
  </xsl:template>
  <!-- fields that end a record -->
  <xsl:template match="Logger|Details">
    <xsl:text>,"</xsl:text>
    <xsl:value-of select="."/>
    <xsl:text>" </xsl:text>
  </xsl:template>
</xsl:stylesheet>
$ java -jar /usr/local/saxon/b8.5/saxon8.jar -o test.csv test.xml test.xsl
$ cat test.csv
"7/31/2007,"10:19:25","xxxxd7yrrt11"
"8001,"Login","Audit","LAN
Client","xxxx_8","test.name","347","xxx7YRRT11","07/31/2007","10:19:49","Use
r",""
$
This makes the assumption that your import routine can do something
different with record #1...
///Peter
--
XML FAQ: http://xml.silmaril.ie/|||Thank you. This is very helpful. I did not paste in the whole log, so I missed
the ending tags.


Wednesday, March 28, 2012

Importing txt files with sql server 2000

HI!

I am importing .txt files. How can I check for errors? I have created a
log file, but the problem is that I lose some characters.
I import, for example:

Code
ABC
FZH
JHN

from a text file, but sometimes Code can be 4 characters long.
I import it as 3 characters long now. When I add the same structured
text file with some rows of length 4, it skips the last character, but I
get nothing in the log file.

please help
xgirl

So you already have a table after importing items with 3 characters,
and then later you import items with 4 characters?

Perhaps it set the field width to 3 based on what it found in the first
import.

Sounds like you may need to manually widen the field. Open the table
in design view and make sure the field width is large enough for 4
characters.
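For what it's worth, a hedged T-SQL sketch of widening the column (the table and
column names are hypothetical, and the nullability should be restated to match the
existing column):
-- Widen the Code column so 4-character values are no longer truncated.
ALTER TABLE dbo.ImportTable ALTER COLUMN Code varchar(10) NULL;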
|||If you are using a DTS package for the import, you may want to check the
Transform Data Task. If you are using fixed-width delimiting, then that
may be the problem.|||I changed from varchar to nvarchar, if that is what you mean, but I still get no
errors about lost characters.

The problem is I have a lot of .txt files, and in the future I will get
the same files with different data. If I automatically import every
file, how can I be sure the data is not longer and I haven't lost some
characters?

thank you
xgirl

Friday, March 23, 2012

Importing RDL from Server to VS

One of my staff members created a report and published it to the server. The
original Visual Studio project has been lost. Is there a way to import the
report definition from the server into Visual Studio for editing?
I realize that we could hand-edit the RDL. I just want to try and avoid this.

David,
I've found myself in the same situation in the past couple days.
Here's what I did (it may not be the best, most efficient way, but it
worked).
In Report Manager, click Edit on the Properties>General tab for the
given report. You should get a File Download window; select Save and
put the rdl in the folder where you want it.
In VS, create a new report project. Open Solution Explorer and right
click Reports. Select Add>Add new item then pick your rdl. You should
then be able to open it in Designer.
HTH
toolman
David wrote:
> One of my staff members created a report and published it to the server. The
> original Visual Studio project has been lost. Is there a way to import the
> report definition from the server into Visual Studio for editing?
> I realize that we could hand edit the RDL. I just want to try and avoid this.

Importing old data into asp_tables...?

I have created an application that uses the login, create, etc. login components in .NET. How hard is it to convert all my old users, passwords, and user types into the new tables? It almost looks like I have to do them by hand and create a new GUID (UserId), along with the same GUID in aspnet_UsersInRoles and aspnet_Membership. Is there a script to do this programmatically?

Hi,

You can use a T-SQL cursor to go record by record through your old table, inserting
into the ASP.NET tables. The difficult part is passwords. I think it would be too
difficult to decode and re-encode passwords in T-SQL.

So probably import the data first and then use ASP.NET to re-encode the passwords.
As far as I know, there are no utilities to do it automatically.
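A minimal sketch of that cursor approach, assuming a hypothetical legacy table
dbo.OldUsers(UserName, UserType); the INSERTs into the aspnet_* tables are left as
comments because their full column lists depend on your membership schema:
-- Walk the old users one by one, generating a single GUID per user that is
-- reused across aspnet_Users, aspnet_Membership and aspnet_UsersInRoles.
DECLARE @UserName nvarchar(256), @UserType nvarchar(256), @UserId uniqueidentifier;
DECLARE user_cur CURSOR FOR
    SELECT UserName, UserType FROM dbo.OldUsers;
OPEN user_cur;
FETCH NEXT FROM user_cur INTO @UserName, @UserType;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @UserId = NEWID();
    -- INSERT INTO dbo.aspnet_Users (..., UserId, UserName, ...) VALUES (..., @UserId, @UserName, ...);
    -- INSERT INTO dbo.aspnet_Membership (...) -- password re-encoded later from ASP.NET;
    -- INSERT INTO dbo.aspnet_UsersInRoles (UserId, RoleId) -- map @UserType to the matching role;
    FETCH NEXT FROM user_cur INTO @UserName, @UserType;
END
CLOSE user_cur;
DEALLOCATE user_cur;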

Wednesday, March 21, 2012

Importing from Access question

Hello,
I have a SQL Server database that was created by importing an MS Access
database.
In one of my SQL tables I have added several fields, and have populated some
of these fields with data.
Is there any way to create a DTS package, or do a straight import, where I
pull all records from the original table in Access, but without losing the
data in the new fields in SQL?
Basically I want to rewrite all my SQL data, but retain the new values...
For ex. if I have a record with ID 1000 in both database tables, but in my
SQL I have an extra field, with data...can I import all records from Access,
but for record with ID 1000, retain my new data?
Thanks in advance!
Amber

Yes. In Access, create a linked table (File > Get External Data > Link
Tables) that points to the SQL Server table.
Then, in Access, create an Update Query that updates the linked table's values
for the columns that exist in Access, based on a join between the
Access table and the SQL table on whatever the PK is on both sides.
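A rough T-SQL alternative to the Access Update Query (not from the reply above),
assuming a hypothetical Access file C:\data\source.mdb with a Customers table keyed
on ID; only the columns that also exist in Access are overwritten, so the SQL-only
columns keep their values:
-- Name, Address and ID are placeholder column names.
UPDATE t
SET    t.Name    = a.Name,
       t.Address = a.Address
FROM   dbo.Customers AS t
JOIN   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'C:\data\source.mdb'; 'Admin'; '',
                  Customers) AS a
       ON a.ID = t.ID;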
"amber" wrote:

> Hello,
> I have a SQL Server database that was created by importing an MS Access
> database.
> In one of my SQL tables I have added several fields, and have populated some
> of these fields with data.
> Is there any way to create a DTS package, or do a straight import, where I
> pull all records from the original table in Access, but without losing the
> data in the new fields in SQL?
> Basically I want to rewrite all my SQL data, but retain the new values...
> For ex. if I have a record with ID 1000 in both database tables, but in my
> SQL I have an extra field, with data...can I import all records from Access,
> but for record with ID 1000, retain my new data?
> Thanks in advance!
> Amber

importing from access error

Let me describe my situation

I am running SQL Server 2005. I created all the tables/columns/keys I needed and exported the data to Access.

In Access, I add all the data for the tables. Then I use SSIS to import-append the data back into SQL Server 2005.

I get this error:
-
- Pre-execute (Error)
Messages
Error 0xc0202009: Data Flow Task: An OLE DB error has occurred. Error code: 0x80040E21.
An OLE DB record is available. Source: "Microsoft SQL Native Client" Hresult: 0x80040E21 Description: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.".
(SQL Server Import and Export Wizard)
Error 0xc0202025: Data Flow Task: Cannot create an OLE DB accessor. Verify that the column metadata is valid.
(SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task: component "Destination - tbl_name" (19) failed the pre-execute phase and returned error code 0xC0202025.
(SQL Server Import and Export Wizard)
--

I can use the import tool (SSIS) to create new tables, but not append the data. Creating new tables makes all the tables lose information about field lengths etc., so this is not an option.

any help would be greatly appreciated.

Thanks!

Could you verify that the metadata of your existing tables matches the tables you want to import? The error message hints at a problem in that area.

Thanks.

|||Yeah... turns out the metadata was in error... but the Access DB's metadata is the result of an export of the SQL Server DB.

Why doesn't Microsoft let you import what you export? It just seems to make sense to me.
|||

It is because the data type spectrum is much richer for SQL Server than for Access. By transferring data from SQL Server to Access you are basically downgrading the richness of your data, which makes the reverse process difficult without user intervention.

Even for equally rich data type spectrums across heterogeneous data sources, you will end up having to map more than one source type to a single destination type. When going in the opposite direction you will have to pick one of the possible types as a best match, and the best match will not always be the proper one.

HTH.

Monday, March 19, 2012

Importing existing data

Hi
In VS2005 Beta 2 I have created a SQL Server data file from within VS. Is there
any way to import existing Access tables and data into the SQL Server
data file? I have tried installing SQL Server 2004 Developer from MSDN but did
not see any Enterprise Manager-type tools like those that come with SQL Server 2000.
What am I missing?
Thanks
Regards
Hi,
Looks like you have MSDE installed. MSDE does not come with any GUI tools
like Query Analyzer / Enterprise Manager / DTS. You have two options to
load the data from MS Access:
1. Install the SQL 2000 tools and use DTS to load the data from Access,
but you need to purchase a SQL Server license.
2. From Access, export the data to text files and in SQL Server (via OSQL) use
BULK INSERT to load the data into the SQL Server database. You could also
use bcp in to load the data; a sketch follows below.
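A minimal BULK INSERT sketch for option 2, assuming a hypothetical comma-delimited
export C:\export\Customers.txt and an existing table dbo.Customers with matching columns:
-- Field and row terminators must match however Access exported the text file.
BULK INSERT dbo.Customers
FROM 'C:\export\Customers.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');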
Thanks
Hari
SQL Server MVP
"John" <John@.nospam.infovis.co.uk> wrote in message
news:emoxENaTFHA.1404@.TK2MSFTNGP09.phx.gbl...
> Hi
> In vs2005 beta2 I have created a sql server data file from within vs. Is
> there
> any way to import existing access tables & data into the sql server
> datafile? I have tried installed sql server 2004 developer from msdn but did
> not see any enterprise manager type tools that come with sql server 2000.
> What am I missing?
> Thanks
> Regards
>
>


Importing Excel file using DTS use of VB package

Dear All,
I had created a package for importing an Excel file.
While I created the package, the data was ported,
and I had saved the package.
Now, when I try to port the data by executing the package,
the file is not ported.
What could be the problem?
Please help me.
Regards, _Prem

Any errors? Is the file empty? Are you trying to import the same file?
Instead of DTS you can also use OPENROWSET or OPENDATASOURCE
Here is an OPENROWSET example
INSERT INTO YourTable
SELECT * FROM OPENROWSET( 'Microsoft.Jet.OLEDB.4.0',
'Excel 8.0;Database=C:\testing.xls','SELECT * FROM [Sheet1$]')
Denis the SQL Menace
http://sqlservercode.blogspot.com/|||Also refer
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=49926
Madhivanan

Importing DB4 files into SQL Server 2005

I'm not sure this is the right forum for this, but here goes,

I have some DB4-format files created by the Borland Database Engine.

I need to import these into SQL Server 2005.

I can open them with Access 2000 & 2002, but not 2003.

The Import Wizard also fails with the DB4 format, but can read the Access 2000 and 2002 converted files.

The Import Wizard indicates success, but no tables appear in the Server Management Studio.

There must be a way. HELP!!!!

Try getting the FoxPro driver and use DTS to recognise the DB4 files.
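As an alternative sketch (not from the reply above), the Jet provider's dBASE ISAM can
sometimes read DB4 (dBASE IV) files directly; the folder name below is hypothetical and
the extended-properties string may need adjusting for your files:
-- DATABASE points at the folder; the table name is the .dbf file name (extension often omitted).
SELECT *
INTO   dbo.ImportedDb4
FROM   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                  'dBase IV;DATABASE=C:\db4files\',
                  'SELECT * FROM mytable');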

Monday, March 12, 2012

Importing database

I have two databases on server comp1: ayush and test. I have created some tables in the ayush database. No tables have been created in test; it has only the default tables and other defaults. I want to transfer all tables with data from ayush to test completely. I have tried the Import and Export Wizard, but from the programming language it gives an error that the database cannot fetch in the forward direction. Please help me solve this problem.

Quote:

Originally Posted by shilabhadra

I have two database in server name comp1,such as : ayush and test. I have created some tables in ayush database. No tables done at test,having only default tables and other defaults. I want to transfer all tables with data from ayush to test completely. I have tried in import and export wizard,but in programming language it gives error, that database cannot fetch in forward direction. Please help to solve this problem.


Option 1: use Enterprise Manager and either export from ayush or import into test.

Option 2: back up the entire ayush DB and restore it as test (sketched below).
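A minimal sketch of option 2 in T-SQL, with hypothetical file paths; the logical file
names must match what RESTORE FILELISTONLY reports for the backup:
BACKUP DATABASE ayush TO DISK = 'C:\backup\ayush.bak';

RESTORE DATABASE test
FROM DISK = 'C:\backup\ayush.bak'
WITH REPLACE,
     MOVE 'ayush_Data' TO 'C:\data\test.mdf',
     MOVE 'ayush_Log'  TO 'C:\data\test_log.ldf';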

importing data to sqlserver express

I would like to import data into a SQL Server Express database. The database I am interested in importing was created in MSDE. I tried backing it up and using the restore function in SQL Server Express, but with no success. I have not been able to find an import function in the Management Studio Express interface. Is it possible to import this information, or should I get busy with my data entry?

Restoring the MSDE database should be fine. Which error do you get?

Jens K. Suessmeyer

http://www.sqlserver2005.de

|||

Thanks for the reply. The error tells me the restore failed for the following reason:

System.Data.SqlClient.SqlError: The backup set holds a backup of a database other than the existing 'TestData' database.

In considering this, I realized that I failed to create all of the tables located in the original database. Could that be the problem?

|||I solved my own problem. In thinking about my options, I remembered this from a long-ago project. I moved the .mdf and .ldf files into the MSSQL\Data folder on the computer with SQL Server Express. Then I attached the database by pointing at the .mdf file. I now have my complete database on SQL Server Express. Thanks.
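For reference, a hedged T-SQL sketch of that attach step (paths are hypothetical; on
SQL Server 2005 Express, CREATE DATABASE ... FOR ATTACH is the documented replacement
for sp_attach_db):
-- Attach the copied data and log files under the name used in the thread.
CREATE DATABASE TestData
ON (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\TestData.mdf'),
   (FILENAME = 'C:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\TestData_log.ldf')
FOR ATTACH;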

Friday, March 9, 2012

importing data issue

Hello,
We are moving our data from one SQL 2000 server to another. The initial DB has all
of the tables in a primary filegroup. We have created multiple filegroups
for the new server from scripts. When we try to import the data from the old
SQL Server to the new one, we get a bunch of FK and PK violations. Is there a way to shut
all of these off for the import and then turn them all back on, or do we have
to do each table separately? Thanks in advance.
John

The PK violations imply that there is something wrong
with the import. You need to make sure the data added is
correct. Something like a different collation could cause
this.
FK - you just need to import the tables in the correct
order. Another option is to remove all the FKs and add
them after the import.
Given that you seem to have problems with the data, I
would make sure the data is checked when the FKs are
added.
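A rough sketch of a variant of that idea: instead of dropping the FKs, disable
constraint checking around the import and re-enable it afterwards (sp_MSforeachtable
is undocumented but widely used):
EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL';

-- ... run the import here ...

-- WITH CHECK re-validates the data, so bad rows surface when the constraints come back.
EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL';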

Importing Data Into XML Column - UPDATED

Okay... I was actually able to get fairly far in my attempts to copy data from a SQL Server table to an XML column. Here is the XSD I created:

<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:element name="CustomColumns">
    <xsd:complexType>
      <xsd:complexContent>
        <xsd:restriction base="xsd:anyType">
          <xsd:sequence>
            <xsd:element name="CN_CUST_KEY" nillable="true">
              <xsd:simpleType>
                <xsd:restriction base="xsd:float" />
              </xsd:simpleType>
            </xsd:element>
            <xsd:element name="ITM_SUF_NO" nillable="true">
              <xsd:simpleType>
                <xsd:restriction base="xsd:float" />
              </xsd:simpleType>
            </xsd:element>
            <xsd:element name="Model" nillable="true">
              <xsd:simpleType>
                <xsd:restriction base="xsd:string" />
              </xsd:simpleType>
            </xsd:element>
            <xsd:element name="Serial" nillable="true">
              <xsd:simpleType>
                <xsd:restriction base="xsd:string" />
              </xsd:simpleType>
            </xsd:element>
            <xsd:element name="SaleYear" nillable="true">
              <xsd:simpleType>
                <xsd:restriction base="xsd:int" />
              </xsd:simpleType>
            </xsd:element>
            <xsd:element name="SaleDate" nillable="true">
              <xsd:simpleType>
                <xsd:restriction base="xsd:dateTime" />
              </xsd:simpleType>
            </xsd:element>
            <xsd:element name="prd_itm_no" nillable="true">
              <xsd:simpleType>
                <xsd:restriction base="xsd:int" />
              </xsd:simpleType>
            </xsd:element>
          </xsd:sequence>
        </xsd:restriction>
      </xsd:complexContent>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>

This is the SQL query I'm running against the source table:

INSERT INTO [clmnt]
    (name1, addr1, addr2, city, [state], zip5, zip4, CustomColumns)
SELECT
    LastName,
    StreetAddress1,
    StreetAddress2,
    City,
    [State],
    Postal5,
    Postal4,
    (SELECT CN_CUST_KEY, ITM_SUF_NO, Model, Serial, SaleYear, SaleDate, prd_itm_no
     FROM TblClass_Customers
     FOR XML PATH (''), ELEMENTS, ROOT('CustomColumns'), TYPE)
FROM TblClass_Customers;

...and here is the error I'm getting:

Msg 6965, Level 16, State 1, Line 1

XML Validation: Invalid content. Expected element(s): Serial where element 'SaleYear' was specified. Location: /*:CustomColumns[1]/*:SaleYear[1]

From what I can see, the SELECT statement matches up with the XSD I'm using...the datatypes coincide...

Kind of at a loss here...

This is what the contents of the target xml column should look like:

<CustomColumns>
  <CN_CUST_KEY>XXXYYYZZZZ</CN_CUST_KEY>
  <ITM_SUF_NO>45</ITM_SUF_NO>
  <Model>alpha</Model>
  <Serial>djdjxh3455skx</Serial>
  <SaleYear>2005</SaleYear>
  <SaleDate>10/25</SaleDate>
  <prd_itm_no>df345f</prd_itm_no>
</CustomColumns>

One thing I did find...if I just run the portion of the query that selects the columns for my XML column, the query runs, but doesn't give the expected results. It concatenates everything into one long string...

<CustomColumns><row><CN_CUST_KEY>1000422</CN_CUST_KEY><ITM_SUF_NO>9</ITM_SUF_NO><Model>11073954200</Model><SaleYear>2003</SaleYear><SaleDate>2003-08-23 00:00:00</SaleDate><prd_itm_no>73954</prd_itm_no></row><row><CN_CUST_KEY>1000812</CN_CUST_KEY><ITM_SUF_NO>13</ITM_SUF_NO><Model>11063932101</Model><Serial>MM2610444</Serial><SaleYear>2002</SaleYear><SaleDate>2002-08-26 00:00:00</SaleDate><prd_itm_no>63932</prd_itm_no></row><row><CN_CUST_KEY>1001610</CN_CUST_KEY><ITM_SUF_NO>14</ITM_SUF_NO><Model>11062952100</Model><SaleYear>2002</SaleYear><SaleDate>2002-02-03 00:00:00</SaleDate><prd_itm_no>62952</prd_itm_no></row><row><CN_CUST_KEY>1004682</CN_CUST_KEY><ITM_SUF_NO>15</ITM_SUF_NO><Model>11072932100</Model><Serial>ML140325</Serial><SaleYear>2001</SaleYear><SaleDate>2001-04-21 00:00:00</SaleDate><prd_itm_no>72932</prd_itm_no></row><row><CN_CUST_KEY>1004867</CN_CUST_KEY><ITM_SUF_NO>12</ITM_SUF_NO><Model>11073952200</Model><SaleYear>2003</SaleYear><SaleDate>2003-04-08 00:00:00</SaleDate><prd_itm_no>73952</prd_itm_no></row><row><CN_CUST_KEY>1005117</CN_CUST_KEY><ITM_SUF_NO>7</ITM_SUF_NO><Model>11072972100</Model><SaleYear>2002</SaleYear><SaleDate>2002-02-18 00:00:00</SaleDate><prd_itm_no>72972</prd_itm_no></row><row><CN_CUST_KEY>1005320</CN_CUST_KEY><ITM_SUF_NO>27</ITM_SUF_NO><Model>11072972100</Model><SaleYear>2001</SaleYear><SaleDate>2001-08-28 -07-01 00:00:00</SaleDate><prd_itm_no>73942</prd_itm_no></row><row><CN_CUST_KEY>5806620</CN_CUST_KEY><ITM_SUF_NO>2</ITM_SUF_NO><Model>11062952100</Model><Serial>ML4334523</Serial><SaleYear>2002</SaleYear><SaleDate>2002-02-17

It also is inserting the <row> tag as opposed to actually creating an entirely new row. And lastly, the <CustomColumns> tag should encapsulate/wrap each record. When I run the select, every single row is contained within <CustomColumns>...

|||

You should have a PK or unique column in the table TblClass_Customers, right? Let's assume it's CustomerID. Try changing to this:

.....
FROM TblClass_Customers T1 WHERE T1.CustomerID = T2.CustomerID
FOR XML PATH ('CustomColumns'), ELEMENTS, TYPE)
FROM TblClass_Customers T2;

|||

Thanks a million!!!

Do you think you could just quickly explain what it is that made your changes do the trick?

I'm learning from this whole XML experience...

Thanks again!!

|||

Phe:

The SELECT alone works fine, but when coupled with the INSERT it blows up:

Code Snippet

INSERT INTO [clmnt]
    (name1, addr1, addr2, city, [state], zip5, zip4, CustomColumns)
SELECT
    LastName,
    StreetAddress1,
    StreetAddress2,
    City,
    [State],
    Postal5,
    Postal4,
    (SELECT CN_CUST_KEY, ITM_SUF_NO, Model, Serial, SaleYear, SaleDate, prd_itm_no
     FROM TblClass_Customers2 T1 WHERE T1.CN_CUST_KEY = T2.CN_CUST_KEY
     FOR XML PATH ('CustomColumns'), ELEMENTS, TYPE)
FROM TblClass_Customers2 T2

The validation error again:

Msg 6965, Level 16, State 1, Line 1

XML Validation: Invalid content. Expected element(s): Serial where element 'SaleYear' was specified. Location: /*:CustomColumns[1]/*:SaleYear[1]

|||

I figured out what the issue is, but I'm not sure how to address it...

If a column that I'm using to generate my XML is NULL, the element is omitted... that's causing the XML to fail validation.

How would you tell the query to emit the column/element even if there is no value or a NULL in the source column?

I know there's the whole ELEMENTS XSINIL directive, but I don't want to get all the extra "stuff" it generates...

|||

In your schema, you already set nillable to true for certain columns, so you can just include XSINIL after the ELEMENTS keyword in the query.

If you don't want to include the element when the value is NULL, change the schema to include minOccurs="0" and remove the nillable attribute.

|||

Phe:

Thanks for the reply....

I've actually tried the XSINIL keyword and saw it did bring in the NULL element. But it also brought along some other stuff - attributes and the header:

Code Snippet

<CustomColumns xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <CN_CUST_KEY>1000422</CN_CUST_KEY>
  <ITM_SUF_NO>9</ITM_SUF_NO>
  <Model>11073954200</Model>
  <Serial xsi:nil="true" />
  <SaleYear>2003</SaleYear>
  <SaleDate>2003-08-23 00:00:00</SaleDate>
  <prd_itm_no>73954</prd_itm_no>
</CustomColumns>

Is there any other way to accomplish this without having to have the attribute and the header there? Or is that the only way? I'm going to assume even if the attribute is there, I would just need to modify my queries to account for that particular attribute...

I've never used XQuery before, and I've been reading through BOL, but again, it's all Greek to me. How would you formulate a basic query against the XML above? For example, I just want to select all rows where the <Serial> value is NULL (NIL)...?

Am I making sense...? I hope so!!

Thanks again...

|||

You can check e.g.

Code Snippet

SELECT @x.query(
    'CustomColumns[Serial/@xsi:nil = true()]'
);

|||

Thanks Martin:

But all that query returned was a NULL - and I'm sure there are at least 1000 rows that have a NULL/NIL element

Last question I swear....!

So what are all the namespace declarations I see in the examples in BOL? Are those necessary, and what purpose do they serve?

For example, I want to return all rows from the CLMNT table where the value of the <serial> element in the CustomColumns XML column is 140325. Here is the query I concocted:

Code Snippet

SELECT clmnt.query('
    declare default namespace "http://schemas.microsoft.com/sqlserver/2004/07/EVEN/CustomColumns";
    /CustomColumns/Serial
') as Result
FROM [CustomColumns].[Serial]
WHERE Serial = '140325'

I'm sure it's completely way off the mark, but if someone could give some insight that would be great.

Also, does anyone know of some really good, very-beginner XQuery documentation/tutorials? I've been through BOL and frankly it didn't help...

Thanks for tolerating an XML noob...
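The thread ends there, but for reference, here is a hedged sketch of one way to filter
rows on an element value with exist(); it assumes the CustomColumns column has no
default namespace (otherwise declare it first with WITH XMLNAMESPACES):
-- Return rows whose <Serial> element equals 140325 (not from the thread itself).
SELECT name1, CustomColumns
FROM   clmnt
WHERE  CustomColumns.exist('/CustomColumns[Serial = "140325"]') = 1;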
