Showing posts with label file. Show all posts

Friday, March 30, 2012

Importing table structures in SQL 2000 into Excel

Hi.
Is there any way to export the table structures (data type, length, NULLABLE, Description) into an Excel file using MS SQL Server?

Or do I need to do it manually?
Thank you in advance.
Sincerely
Agustina
|||Run this in Query Analyzer (common data types; add to the case statement for more):


select name,
       case xtype
            when 56  then 'Int'
            when 127 then 'BigInt'
            when 167 then 'VarChar'
            when 175 then 'Char'
            when 60  then 'Money'
            when 58  then 'SmallDateTime'
            when 104 then 'Bit'
            when 173 then 'TimeStamp'
            when 61  then 'DateTime'
            when 48  then 'TinyInt'
            else 'Other'
       end as data_type,
       length
from syscolumns
where id = (select id
            from sysobjects
            where name = 'TheTableName')
order by colid
|||You could look up the Schema. Run this in Query Analyzer and adjust accordingly:
SELECT * FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_CATALOG = '<DATABASE NAME>' AND TABLE_SCHEMA = '<DB OWNER>' AND TABLE_NAME = '<YOUR TABLES NAME>'
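The INFORMATION_SCHEMA route can also cover the length and nullability the original question asked about; the Description lives in extended properties. A minimal sketch, assuming the descriptions were saved under the conventional 'MS_Description' extended property and that 'dbo' owns the table; 'TheTableName' is a placeholder:

```sql
-- Type, length, and nullability per column; the results grid can be
-- copied from Query Analyzer straight into Excel.
SELECT c.COLUMN_NAME,
       c.DATA_TYPE,
       c.CHARACTER_MAXIMUM_LENGTH,  -- NULL for non-character types
       c.IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS c
WHERE c.TABLE_NAME = 'TheTableName'
ORDER BY c.ORDINAL_POSITION

-- Column descriptions (SQL Server 2000 syntax for extended properties):
SELECT objname AS column_name, value AS description
FROM ::fn_listextendedproperty('MS_Description', 'user', 'dbo',
                               'table', 'TheTableName', 'column', NULL)
```

If the descriptions were entered under a different property name, swap 'MS_Description' accordingly.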

Importing XML Sample file and XML Schema

I recently received a schema file from a vendor to import into my SQL Server
.
I am new to XML and have been reading all I can. I tried doing a Bulk Load
in a DTS Package, but it does not read the schema as the correct file format
(asking me ANSI, Unicode, etc). I have a sample data file to test with but
it does not seem to be working. Can anyone give any insight on the easiest
way to do this? I will be receiving a weekly feed from this vendor and will
need to import on a schedule. Thanks.
|||Do you need to "shred" it into relational fields or keep the whole XML
document in a single field?
Best regards
Michael
"Andi" <Andi@.discussions.microsoft.com> wrote in message
news:51CB2EE7-C1F2-4F3E-84FB-7BF425C154F2@.microsoft.com...
>I recently received a schema file from a vendor to import into my SQL
>Server.
> I am new to XML and have been reading all I can. I tried doing a Bulk
> Load
> in a DTS Package, but it does not read the schema as the correct file
> format
> (asking me ANSI, Unicode, etc). I have a sample data file to test with
> but
> it does not seem to be working. Can anyone give any insight on the
> easiest
> way to do this? I will be receiving a wly feed from this vendor and
> will
> need to import on a schedule. Thanks.|||I need to shred it. There are multiple tables involved.
"Michael Rys [MSFT]" wrote:

> Do you need to "shred" it into relational fields or keep the whole XML
> document in a single field?
> Best regards
> Michael
> "Andi" <Andi@.discussions.microsoft.com> wrote in message
> news:51CB2EE7-C1F2-4F3E-84FB-7BF425C154F2@.microsoft.com...
>
>|||You have a couple of options:
Use the SQLXML 3.0 XML Bulkload component. You will need to add annotations
to your schema (see the documentation).
Use OpenXML. You pass the XML to a stored proc as a TEXT or NTEXT parameter
(you need to make sure that the XML document is in an encoding compatible
with the SQL Server collation codepage).
Write your own ASP/ASP.Net mid-tier code to perform the shredding.
Best regards
Michael
"Andi" <Andi@.discussions.microsoft.com> wrote in message
news:FE4E6BEE-4BB2-452E-8A9B-BC9F0E20C3CE@.microsoft.com...
>I need to shred it. There are multiple tables involved.
> "Michael Rys [MSFT]" wrote:
>|||Hello - found the issue - apparently, the:
targetNamespace="http://tempuri.org/fra-bel-xml-MicrosoftBusinessSolutionsAxapta30_2.xsd"
needs to be removed from the <xs:schema> tag.
Now, can anyone tell me why? I'm not an XML expert but learning as much as I
can...
AB
"Michael Rys [MSFT]" wrote:

> You have a couple of options:
> Use the SQLXML 3.0 XML Bulkload component. You will need to add annotations
> to your schema (see the documentation).
> Use OpenXML. You pass the XML to a stored proc as a TEXT or NTEXT parameter
> (you need to make sure that the XML document is in an encoding compatible
> with the SQL Server collation codepage).
> Write your own ASP/ASP.Net mid-tier code to perform the shredding.
> Best regards
> Michael
> "Andi" <Andi@.discussions.microsoft.com> wrote in message
> news:FE4E6BEE-4BB2-452E-8A9B-BC9F0E20C3CE@.microsoft.com...
>
>|||The targetnamespace indicates that the elements need to belong to the
namespace to be matched by the schema definitions.
Best regards
Michael
"Illustris" <Illustris@.discussions.microsoft.com> wrote in message
news:1EC9D995-DADE-4321-B9C4-AC8A55E07E51@.microsoft.com...
> Hello - found the issue - apparently, the:
> targetNamespace="http://tempuri.org/fra-bel-xml-MicrosoftBusinessSolutionsAxapta30_2.xsd"
> needs to be removed from the <xs:schema> tag.
> Now, can anyone tell me why? I'm not an XML expert but learning as much as I
> can...
> AB
> "Michael Rys [MSFT]" wrote:
>
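The OpenXML option Michael describes in this thread can be sketched roughly as follows. The procedure, table, element, and column names are invented for illustration; flag 2 selects element-centric mapping:

```sql
-- Sketch: shred an XML document passed as NTEXT into a relational table.
CREATE PROCEDURE dbo.ImportVendorFeed @doc NTEXT
AS
BEGIN
    DECLARE @h INT

    -- Parse the document into an in-memory tree and get a handle to it.
    EXEC sp_xml_preparedocument @h OUTPUT, @doc

    INSERT INTO dbo.VendorItems (ItemId, ItemName)
    SELECT ItemId, ItemName
    FROM OPENXML(@h, '/Root/Item', 2)        -- 2 = element-centric mapping
         WITH (ItemId   INT         'Id',
               ItemName VARCHAR(50) 'Name')

    -- Always release the handle, or the parsed tree stays in memory.
    EXEC sp_xml_removedocument @h
END
```

For multiple target tables, repeat the INSERT ... SELECT FROM OPENXML block with a different rowpattern per table before removing the document handle.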


Importing XML Files to SQL Database using SSIS

Hi,
I need to import an XML file into a SQL database using SSIS. Please let me know
how I can achieve this. I am quite new to SSIS, so if you can point me to some
samples on importing XML files using SSIS, nothing would be better. Any additional
references, if any, are most welcome.
Thanks in Advance,
Sachin R. Chavan.
Hello Sachin,

> I need to Import an XML file into SQL Database using SSIS, Plz let me
> know how do I acheive this. I am quite new to SSIS, So if you can
> Point me to some samples on importing XML files using SSIS, nothing
> like that. Any additional references, if any, are most welcome.
a. Get this book and read it: http://www.wrox.com/WileyCDA/WroxTitle/productCd-0764584359.html
b. Use the XML source.
Thanks!
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/

Importing XML File via URL using MSSQL 2005

I have an XML file that is hosted on a mainframe web server. I need to read
the file via a URL and import this into my SQL database.
I'm fairly new to SQL 2005 but this seems like it should be a simple task. I've looked
at using the Import/Export wizard but I don't see a simple way to do this.
I've tried selecting "Flat File" as my data source but this does not seem to
be able to read XML. I've tried the SQLXML data sources but I don't think
this will work if the web server is not IIS.
Does anyone have any ideas?
|||"Bryan" <Bryan@.discussions.microsoft.com> wrote in message
news:37B66D17-FE91-4CFD-B502-DB80BD2B8360@.microsoft.com...
>I have an XML file that is hosted on a mainframe web server. I need to
>read
> the file via a URL and import this into my SQL database.
> I'm fairly new to SQL 2005 but this seems to be a simple task. I've
> looked
> at using the Import Export wizard but I don't see a simpl way to do this.
> I've tried selecting "Flat File" as my data source but this does not seem
> to
> be able to read XML. I've tried the SQLSML data sources but I don't think
> this will work if the web server is not IIS.
> Does anyone have any ideas?
Write a SQLCLR function in C# or VB to retrieve the data from the web
server.

Importing XML File using SSIS (DTS)

I can use MS Access to import an XML file and it builds the columns and loads the data. I am trying to use SQL Server 2005 to do the same thing and there doesn't seen to be a way to easily accomplish the same task that works so well in Access.

Has anyone tried this? If so and you were successful, what were the steps to set up the package?

Thanks.

Have you tried to use XML Source adapter?

- Grab a Data Flow task from the toolbox

- Double-click on the added task

- Double-click on the XML Source in the data flow toolbox

- Double-click added component on the diagram

- Set up your XML source

HTH.

|||

Hi HTH,

After reading your post, I was trying to do the same thing using the below XML:

<country
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="country.xsd">
<name>France</name>
<population>59.7</population>
</country>

and i even generated the XSD using SSIS and got this :

<?xml version="1.0"?>
<xs:schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:element name="country">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="name" type="xs:string" />
<xs:element minOccurs="0" name="population" type="xs:decimal" />
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:schema>

And now, after pointing things correctly in my connection manager, when I go to my columns all I get is one column which says Name.

Why is that? Ideally I should get Name and Population, right? Can you advise?

|||

Try wrapping the xml in a root node:

<root>
<country
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="country.xsd">
<name>France</name>
<population>59.7</population>
</country>
</root>

Thanks
Mark

|||

Mark,

Thanks It worked.

So the impression I get is that all XMLs should be wrapped in a root node. Is that true?


importing xml file to sql server

Hello there
I have an XML file that I need to compare to data on the server, and I don't
have the xlt file (which is the base of the XML, as far as I know).
I need to import the XML file with the entire structure into SQL Server in
order to compare it with the data I have.
How can I do that?
Roy Goldhammer wrote:

> I need to import the xml file with the entire structure to sql server in
> order to compare it with data i have
> how can i do that?
Have a look at the OPENXML clause
<http://msdn2.microsoft.com/en-us/library/ms191268.aspx>
Martin Honnen -- MVP XML
http://JavaScript.FAQTs.com/


importing xml file

Hello there
I'm new on xml
I have an XML file 500MB in size, which I want to import into a table on SQL
Server.
How can I do that?
How exactly do you want to get it into a table? As an XML datatype instance
in a row? Or are you planning on shredding information into one or multiple
tables? Do you have a schema describing the XML or is it schema less?
Best regards
Michael
"Roy Goldhammer" <roy@.hotmail.com> wrote in message
news:uKZfcIWNHHA.1248@.TK2MSFTNGP02.phx.gbl...
> Hello there
> I'm new on xml
> I have xml file in 500MB size, which i want to import it to table on sql
> server
> how can i do that?
>|||to one table
i have one file without schema with data
"Michael Rys [MSFT]" <mrys@.online.microsoft.com> wrote in message
news:eE7romhNHHA.5000@.TK2MSFTNGP03.phx.gbl...
> How exactly do you want to get it into a table? As an XML datatype
> instance in a row? Or are you planning on shredding information into one
> or multiple tables? Do you have a schema describing the XML or is it
> schema less?
> Best regards
> Michael
> "Roy Goldhammer" <roy@.hotmail.com> wrote in message
> news:uKZfcIWNHHA.1248@.TK2MSFTNGP02.phx.gbl...
>|||I would look at creating an annotated schema for the data and use the SQLXML
Bulkload if you have to perform this several times. If you have to do it once,
look into OpenXML (in SQL Server 2000) or the nodes() method (in SQL Server 2005).
More information about any of these three approaches can be found in the
archives or Books Online.
Best regards
Michael
"Roy Goldhammer" <roy@.hotmail.com> wrote in message
news:%235t48EXOHHA.3944@.TK2MSFTNGP06.phx.gbl...
> to one table
> i have one file without schema with data
> "Michael Rys [MSFT]" <mrys@.online.microsoft.com> wrote in message
> news:eE7romhNHHA.5000@.TK2MSFTNGP03.phx.gbl...
>
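Michael's nodes() suggestion for SQL Server 2005 can be sketched like this. The file path, element names, and target table are invented for illustration; note that for a 500 MB file the annotated-schema Bulkload route he mentions is likely to scale better than loading the whole document into a variable:

```sql
-- Sketch: load the whole file as one blob, then shred it with nodes()/value().
DECLARE @x XML

SELECT @x = BulkColumn
FROM OPENROWSET(BULK 'C:\feeds\bigfile.xml', SINGLE_BLOB) AS f

INSERT INTO dbo.Items (ItemId, ItemName)
SELECT n.value('(Id)[1]',   'int'),
       n.value('(Name)[1]', 'varchar(50)')
FROM @x.nodes('/Root/Item') AS t(n)   -- one row per matching Item element
```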


importing XML Data into SQL Server 2000

Hi All,
I need to import data from a large XML file.
I do not want to use XML Bulk upload or use ADO components.
I am looking for a way to use sql server scripts to open
this file through a ODBC driver , OpenRowset function.
Do any ODBC or OLEDB drivers exist that can be used
in the OpenRowset function?
Not that I know of.
You may want to look at using a stored proc and OpenXML (although XML
Bulkload is more efficient).
Sorry.
Michael
"vaga" <anonymous@.discussions.microsoft.com> wrote in message
news:00a101c49b3b$9b4c2150$a301280a@.phx.gbl...
> Hi All,
> I need to import data from a large XML file.
> I do not want to use XML Bulk upload or use ADO components.
> I am looking for a way to use sql server scripts to open
> this file through a ODBC driver , OpenRowset function.
> Are there any ODBC or OLEDB drivers exist that can be used
> in OpenRowset function ?
>

Importing XML data into a SQL Server database

Hello,
I am having a bit of a problem, and maybe someone out there can
help. The Schema file (test_schema.xml) is:
<?xml version="1.0" ?>
<xs:schema attributeFormDefault="unqualified"
elementFormDefault="qualified"
xmlns:xs="http://www.w3.org/2001/XMLSchema"
xmlns:sql="urn:schemas-microsoft-com:mapping-schema">
<xs:element name="Root">
<xs:complexType>
<xs:sequence>
<xs:element name="Truserv" sql:mapped="false">
<xs:complexType>
<xs:sequence>
<xs:element name="Item" sql:relation="Test_Table">
<xs:complexType>
<xs:attribute name="item_nbr" type="xs:unsignedInt"
use="required" sql:field="Test_Col1" />
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:schema>
The XML file is:
<ROOT>
<TRU_SERV>
<Items test_attribute1="This is a test" />
</TRU_SERV>
</ROOT>
and my vb script code is:
Set objBL = CreateObject("SQLXMLBulkLoad.SQLXMLBulkLoad.3.0")
objBL.ConnectionString = "provider=SQLOLEDB.1;data
source=aaa;database=xxx;uid=yyy;pwd=zzz"
objBL.ErrorLogFile = "error.log"
objBL.Execute "test_schema.xml", "test_xml.xml"
Set objBL = Nothing
When I run the vb code, the code will execute, but it will not insert
the data into the database. It just runs and from my end, seems to do
nothing. It does connect to the database, since I am not getting any
errors. If anyone out there has any ideas, please let me know.
-Jay
(patel@.cs.utk.edu)
In the XML file one of the tags is called <TRU_SERV>,
whereas in the schema it is called <xs:element name="Truserv" sql:mapped="false">. These names must match exactly for this to work.
|||Also, you have the same problem with "Item" and "Items".

Importing XML Data to SQL Server

I am importing XML data to SQL SERVER using Bulk load component.
The XML file is like the following:
<?xml version="1.0" encoding="UTF-8" ?>
<legion-event-file>
<header>
<file-id>247</file-id>
<file-generated-date-time>2004-11-21T05:00:09</file-generated-date-time>
</header>
<item>
<item-id>001</item-id>
<legion-operation-id>15091</legion-operation-id>
<legion-customer-id>42828</legion-customer-id>
<legion-user-id>43722</legion-user-id>
<username>yinoue</username>
<performed-by-username>yinoue</performed-by-username>
<performed-by-user-id>43722</performed-by-user-id>
<b-customer-id>1454103</b-customer-id>
<operation-date-time>2004-11-20T10:24:23</operation-date-time>
<preferred-email>yinoue@.bmail.com.au</preferred-email>
<activation>
<indicator>INDICATOR</indicator>
<b-contract-type-code>BNETU</b-contract-type-code>
<provided-activation-code>492189440</provided-activation-code>
</activation>
</item>
</legion-event-file>
And the XSD/XDR is like:
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
xmlns:sql="urn:schemas-microsoft-com:mapping-schema">
<xsd:complexType name="ActivationType">
<xsd:sequence>
<xsd:element name="indicator" type="xsd:string"
sql:field="activation_indicator" />
<xsd:element name="b-contract-type-code" type="xsd:string"
sql:field="b_contract_type_code" />
<xsd:element name="provided-activation-code" type="xsd:string"
sql:field="provided_activation_code" />
</xsd:sequence>
</xsd:complexType>
<xsd:complexType name="DeactivationType">
<xsd:sequence>
<xsd:element name="indicator" type="xsd:string"
sql:field="deactivation_indicator" />
</xsd:sequence>
</xsd:complexType>
<xsd:element name="activation" type="ActivationType"
sql:relation="activation" />
<xsd:element name="deactivation" type="DeactivationType"
sql:relation="deactivation"/>
<xsd:complexType name="ItemType">
<xsd:sequence>
<xsd:element name="item-id" type="xsd:string" sql:field="item_id" />
<xsd:element name="legion-operation-id" type="xsd:string"
sql:field="legion_operation_id" />
<xsd:element name="legion-customer-id" type="xsd:string"
sql:field="legion_customer_id" />
<xsd:element name="legion-user-id" type="xsd:string"
sql:field="legion_user_id" />
<xsd:element name="username" type="xsd:string" sql:field="username" />
<xsd:element name="performed-by-username" type="xsd:string"
sql:field="performed_by_username" />
<xsd:element name="performed-by-user-id" type="xsd:string"
sql:field="performed_by_user_id" />
<xsd:element name="b-customer-id" type="xsd:string"
sql:field="b_customer_id" />
<xsd:element name="operation-date-time" type="xsd:string"
sql:field="operation_date_time" />
<xsd:element name="preferred-email" type="xsd:string"
sql:field="preferred_email" />
<xsd:element ref="activation">
<xsd:annotation>
<xsd:appinfo>
<sql:relationship parent="item" parent-key="item_id"
child="activation" child-key="item_id" inverse="true" />
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
<xsd:element ref="deactivation">
<xsd:annotation>
<xsd:appinfo>
<sql:relationship parent="item" parent-key="item_id"
child="deactivation" child-key="item_id" inverse="true" />
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
<xsd:element name="item" type="ItemType" sql:key-fields="item_id"
sql:relation="item" />
<xsd:complexType name="HeaderType">
<xsd:sequence>
<xsd:element name="file-id" type="xsd:string" sql:field="legion_file_id" />
<xsd:element name="file-generated-date-time" type="xsd:string"
sql:field="file_generated_date_time" sql:datatype="varchar" />
</xsd:sequence>
</xsd:complexType>
<xsd:element name="header" type="HeaderType"
sql:key-fields="legion_file_id" sql:relation="legion_file"/>
<xsd:element name="legion-event-file" sql:is-constant="1">
<xsd:complexType>
<xsd:sequence>
<xsd:element ref="header"/>
<xsd:element ref="item">
<xsd:annotation>
<xsd:appinfo>
<sql:relationship parent="legion_file" parent-key="legion_file_id"
child="item" child-key="legion_file_id" inverse="true" />
</xsd:appinfo>
</xsd:annotation>
</xsd:element>
</xsd:sequence>
</xsd:complexType>
</xsd:element>
</xsd:schema>
It gets an error like:
[CDATA[No data was provided for column 'legion_file_id' on table 'item', and
this column cannot contain NULL values.]]
Can anyone tell me whether I should get rid of the non-relation elements like
'header'?
Thanks for any feedback
Your problem seems to be some data is missing that is required. Why would
you get rid of "header"? That might solve the problem by eliminating the
field that is missing a value.
Irwin Dolobowsky
Program Manager, SqlXml
http://msdn.microsoft.com/xml
This posting is provided "AS IS" with no warranties, and confers no rights.
"Anson Luo" <Anson Luo@.discussions.microsoft.com> wrote in message
news:E95B527F-4D1F-4FFC-A488-D84FEA1369C4@.microsoft.com...
>I am importing XML data to SQL SERVER using Bulk load component.
> The XML file is like the following:
> <?xml version="1.0" encoding="UTF-8" ?>
> <legion-event-file>
> <header>
> <file-id>247</file-id>
> <file-generated-date-time>2004-11-21T05:00:09</file-generated-date-time>
> </header>
> <item>
> <item-id>001</item-id>
> <legion-operation-id>15091</legion-operation-id>
> <legion-customer-id>42828</legion-customer-id>
> <legion-user-id>43722</legion-user-id>
> <username>yinoue</username>
> <performed-by-username>yinoue</performed-by-username>
> <performed-by-user-id>43722</performed-by-user-id>
> <b-customer-id>1454103</b-customer-id>
> <operation-date-time>2004-11-20T10:24:23</operation-date-time>
> <preferred-email>yinoue@.bmail.com.au</preferred-email>
> <activation>
> <indicator>INDICATOR</indicator>
> <b-contract-type-code>BNETU</b-contract-type-code>
> <provided-activation-code>492189440</provided-activation-code>
> </activation>
> </item>
> </legion-event-file>
> And the XSD/XDR is like:
> <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
> xmlns:sql="urn:schemas-microsoft-com:mapping-schema">
> <xsd:complexType name="ActivationType">
> <xsd:sequence>
> <xsd:element name="indicator" type="xsd:string"
> sql:field="activation_indicator" />
> <xsd:element name="b-contract-type-code" type="xsd:string"
> sql:field="b_contract_type_code" />
> <xsd:element name="provided-activation-code" type="xsd:string"
> sql:field="provided_activation_code" />
> </xsd:sequence>
> </xsd:complexType>
> <xsd:complexType name="DeactivationType">
> <xsd:sequence>
> <xsd:element name="indicator" type="xsd:string"
> sql:field="deactivation_indicator" />
> </xsd:sequence>
> </xsd:complexType>
> <xsd:element name="activation" type="ActivationType"
> sql:relation="activation" />
> <xsd:element name="deactivation" type="DeactivationType"
> sql:relation="deactivation"/>
> <xsd:complexType name="ItemType">
> <xsd:sequence>
> <xsd:element name="item-id" type="xsd:string" sql:field="item_id" />
> <xsd:element name="legion-operation-id" type="xsd:string"
> sql:field="legion_operation_id" />
> <xsd:element name="legion-customer-id" type="xsd:string"
> sql:field="legion_customer_id" />
> <xsd:element name="legion-user-id" type="xsd:string"
> sql:field="legion_user_id" />
> <xsd:element name="username" type="xsd:string" sql:field="username" />
> <xsd:element name="performed-by-username" type="xsd:string"
> sql:field="performed_by_username" />
> <xsd:element name="performed-by-user-id" type="xsd:string"
> sql:field="performed_by_user_id" />
> <xsd:element name="b-customer-id" type="xsd:string"
> sql:field="b_customer_id" />
> <xsd:element name="operation-date-time" type="xsd:string"
> sql:field="operation_date_time" />
> <xsd:element name="preferred-email" type="xsd:string"
> sql:field="preferred_email" />
> <xsd:element ref="activation">
> <xsd:annotation>
> <xsd:appinfo>
> <sql:relationship parent="item" parent-key="item_id"
> child="activation" child-key="item_id" inverse="true" />
> </xsd:appinfo>
> </xsd:annotation>
> </xsd:element>
> <xsd:element ref="deactivation">
> <xsd:annotation>
> <xsd:appinfo>
> <sql:relationship parent="item" parent-key="item_id"
> child="deactivation" child-key="item_id" inverse="true" />
> </xsd:appinfo>
> </xsd:annotation>
> </xsd:element>
> </xsd:sequence>
> </xsd:complexType>
> <xsd:element name="item" type="ItemType" sql:key-fields="item_id"
> sql:relation="item" />
> <xsd:complexType name="HeaderType">
> <xsd:sequence>
> <xsd:element name="file-id" type="xsd:string" sql:field="legion_file_id"
> />
> <xsd:element name="file-generated-date-time" type="xsd:string"
> sql:field="file_generated_date_time" sql:datatype="varchar" />
> </xsd:sequence>
> </xsd:complexType>
> <xsd:element name="header" type="HeaderType"
> sql:key-fields="legion_file_id" sql:relation="legion_file"/>
> <xsd:element name="legion-event-file" sql:is-constant="1">
> <xsd:complexType>
> <xsd:sequence>
> <xsd:element ref="header"/>
> <xsd:element ref="item">
> <xsd:annotation>
> <xsd:appinfo>
> <sql:relationship parent="legion_file" parent-key="legion_file_id"
> child="item" child-key="legion_file_id" inverse="true" />
> </xsd:appinfo>
> </xsd:annotation>
> </xsd:element>
> </xsd:sequence>
> </xsd:complexType>
> </xsd:element>
> </xsd:schema>
> It gets an error like:
> [CDATA[No data was provided for column 'legion_file_id' on table 'item',
> and
> this column cannot contain NULL values.]]
> Can anyone tell me whether I should get rid of the non-relation element
> like
> 'header'?
> Thanks for any feedback
>

Wednesday, March 28, 2012

Importing txt files with sql server 2000

HI!

I am importing .txt files. How can I check for errors? I have created a
log file, but the problem is that I lose some characters.
I import, for example:

Code
ABC
FZH
JHN

from a text file, but sometimes Code can be 4 characters long.
I imported it as 3 characters. When I add a text file with the same
structure but some rows of length 4, it skips the last character, yet I
get nothing in the log file.

please help
xgirl|||So you already have a table after importing items with 3 characters,
and then later you import items with 4 characters?

Perhaps it set the field width to 3 based on what it found in the first
import.

Sounds like you may need to manually widen the field. Open the table
in design view and make sure the field width is large enough for 4
characters.|||If you are using a DTS for the import, you may want to check the
Transform Data Task. If you are using fixed width delimiting then that
may be the problem.|||I changed from varchar ->nvarchar if you mean that but still i get no
errors of lost characters.

The problem is I have a lot of .txt files and I will keep getting the
same files with different data in the future. If I automatically import every
file, how can I be sure the data is not longer and I didn't lose some
characters?

thank you
xgirl
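One way to be sure nothing was silently truncated is to scan each file before the import and compare field lengths against the destination column widths. A minimal sketch in Python; the schema dictionary here is a hypothetical stand-in for the real table definition:

```python
# Pre-import check: scan a delimited text file and report every field
# that is longer than the destination column allows, so truncation is
# caught even when the DTS log stays empty.
import csv
import io

# Hypothetical destination schema: column name -> declared width.
SCHEMA = {"Code": 3, "Name": 10}

def find_oversized(fileobj, schema, delimiter=","):
    """Yield (line_number, column, value) for each field wider than the schema allows."""
    reader = csv.reader(fileobj, delimiter=delimiter)
    header = next(reader)
    for lineno, row in enumerate(reader, start=2):
        for col, value in zip(header, row):
            if col in schema and len(value) > schema[col]:
                yield lineno, col, value

sample = "Code,Name\nABC,Smith\nFZHX,Jones\n"
problems = list(find_oversized(io.StringIO(sample), SCHEMA))
print(problems)  # the 4-character code on line 3 is flagged
```

Running a check like this as a pre-import step surfaces over-long values up front, instead of hoping the import log catches them.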

Importing TXT files problems

Hello,
I'm fairly new to SQL and hopefully this is a simple question to answer.
I've been trying to import a TXT file into an SQL database and I'm having trouble making it work correctly. It is an ASCII text file where the fields vary in the number of characters, from 2 up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.)
I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as Characters with column headers of Col001, Col002, etc. I don't want everything as characters or Col001, etc. Some columns should be
INT, NUMERIC(x,x), etc., but every time I change these values in DTS it won't import the data correctly.
Also, I have an SQL script that I wrote for a table where I can create the field lengths, data type etc., FWIW. This seems to be going nowhere fast.
What am I doing wrong? Should I be using something else than DTS?
Any suggestions are greatly appreciated.
Thanks,
JB
We have had really good luck with BULK INSERT using TAB DELIMITED data.
Maybe you can get it to be TAB DELIMITED using EXCEL - IMPORT the text and
save as .TDF.
We also are a VB shop, so on top of that we have developed several different
versions of a TEXTIMPORT program - some that take the lengths of each SQL
column and cut the input data based on that, others that look for specific
items while cutting up the data. This way we can also deal with odd data
issues for various columns (the TEXTIMPORT uses a little .INI file that
controls specifics like that).
I believe that BULK INSERT and BCP have control files to allow greater
flexibility - maybe check BOL (books online)...
"JonBlack" <anonymous@.discussions.microsoft.com> wrote in message
news:AE462262-F98D-40A1-BD57-6E2066BE2A6D@.microsoft.com...
> Hello,
> I'm fairly new to SQL and hopefully this is a simple question to answer.
> I've been trying to import a TXT file into an SQL database and I'm having
trouble making it work correctly. It is a ASCII text file where the fields
vary by the number of characters. This can be 2 characters up to 40 (STATE
would be 2 characters, CITY is 32 characters, etc.)
> I can import the file with DTS. I go in and select exactly where I want
the field breaks to be. Then it imports everything as Characters with column
headers of Col001, Col002 etc. I don't want everything as characters or
Col001 etc. Some columns would be INT, NUMERIC(x,x), etc. but everytime I
change these values in DTS it won't import the data correctly.
> Also, I have an SQL script that I wrote for a table where I can create the
field lengths, data type etc., FWIW. This seems to be going nowhere fast.
> What am I doing wrong? Should I be using something else than DTS?
> Any suggestions are greatly appreciated.
> Thanks,
> JB
|||Hi Steve,
Thanks for your reply. I wouldn't mind importing it into EXCEL and saving it in a better format. Unfortunately I have over 100,000 records, which rules EXCEL out. I think BULK IMPORT and BCP can't import files based on character widths but could use tabs
as you suggested. Using VB to write an import script could be an option though.
Thanks again,
JB
-- Steve Z wrote: --
We have had really good luck with BULK INSERT using TAB DELIMITED data.
Maybe you can get it to be TAB DELIMITED using EXCEL - IMPORT the text and
save as .TDF.
We also are a VB shop, so on top of that we have developed several different
versions of a TEXTIMPORT program - some that take the lengths of each SQL
column and cut the input data based on that, others that look for specific
items while cutting up the data. This way we can also deal with odd data
issues for various columns (the TEXTIMPORT uses a little .INI file that
controls specifics like that).
I believe that BULK INSERT and BCP have control files to allow greater
flexibility - maybe check BOL (books online)...
|||Is the column fixed length or delimited by commas?
JonBlack wrote:
> Hello,
> I'm fairly new to SQL and hopefully this is a simple question to answer.
> I've been trying to import a TXT file into an SQL database and I'm having trouble making it work correctly. It is a ASCII text file where the fields vary by the number of characters. This can be 2 characters up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.)
> I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as Characters with column headers of Col001, Col002 etc. I don't want everything as characters or Col001 etc. Some columns would be INT, NUMERIC(x,x), etc. but everytime I change these values in DTS it won't import the data correctly.
> Also, I have an SQL script that I wrote for a table where I can create the field lengths, data type etc., FWIW. This seems to be going nowhere fast.
> What am I doing wrong? Should I be using something else than DTS?
> Any suggestions are greatly appreciated.
> Thanks,
> JB
|||The columns are fixed length. For example Column1 is 5 characters long, Column2 is 2 characters long, ColumnC is 30 characters long, ColumnD is 10 characters long, etc. Also, some of the columns are numeric and integer as well but are represented by characters until imported.
Thanks,
JB
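Steve's convert-to-tab-delimited approach need not go through Excel; with 100,000+ rows a small script can do the slicing. A sketch with hypothetical column widths — substitute the real fixed-width layout of the file:

```python
# Slice fixed-width records into tab-delimited lines so BULK INSERT
# (with tab as the field terminator) can load them.  The widths below
# are hypothetical; replace them with the real layout.
WIDTHS = [5, 2, 30, 10]  # e.g. an ID, STATE, CITY, ZIP-style layout

def fixed_to_tabs(line, widths=WIDTHS):
    """Cut one fixed-width record at the given offsets and join the stripped fields with tabs."""
    fields, pos = [], 0
    for w in widths:
        fields.append(line[pos:pos + w].strip())
        pos += w
    return "\t".join(fields)

record = "00042" + "AB" + "CALGARY".ljust(30) + "T2P 1J9".ljust(10)
print(fixed_to_tabs(record))  # 00042, AB, CALGARY, T2P 1J9 separated by tabs
```

The converted file can then be loaded into a table whose columns already have the right types, so DTS never has to guess.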

importing txt file to multiple table in sql 2000

Hi There,

I am looking for information on how to import a txt or csv file into
multiple tables in sql 2000. If you have any kind of info, please
let me know whether we can do this and how.

below is the detail information.

I receive a txt file every day which contains the information for 3
different related tables in my sql 2000 database. Right now we are
keying in the information from the web site (which is linked to the txt
file) into our database, but I am wondering if we can import those
records into the tables.

the header of the file goes to table1, and when we insert the record in
table1, it should generate the auto-identity value (PK); that PK is
linked to table2 and table3, where the rest of the information from the
txt file goes. For table2 and table3 there are multiple records per txt
file.

in our txt file each row starts with a row header, like HTC100
WITH ROW NO. 1,2,3.., which indicates which table the information goes
to, and 1,2... are the different row numbers.

Please let me know whether we can achieve this task or not.

Thanks for all you help in advance.

Indra.

I have pasted my txt file below:
========

"FHS000",20041029,0900,,"10",1,"TRAILB10_20041029_1B",3,"2.20","Pason-DataHub",,"#
Well 1098831406 Tour 2004/10/29 Trailblazer 10 148",1,"EDR_3-0-10_HF2
ETS 2.2"
"CON000",1,0000,0759
"CON000",2,0800,1559
"CON000",3,1600,2359
"HWI010","0312857","COMPTON BRANT 15-7-18-24","COMPTON PETROLEUM
CORP.","TRAILBLAZER DRILLING
CORP.","15-07-018-24W4","100/15-07-018-24W4/00","HANK
PARANYCH","CURTIS FIESEL",20041029,,,"10",20041027,0600,,,"148","DD04485","VERT.","NO",,
"HCO030",1,"Daily Walk Around Inspection","HP","CF"
"HCO030",2,"Detailed Inspection - Weekly (using checklist)","HP","CF"
"HCO030",3,"H2S Signs Posted (if required)",,
"HCO030",4,"Well License & Stick Diagram Posted","HP","CF"
"HCO030",5,"Flare Lines Staked","HP","CF"
"HCO030",6,"BOP Drills Performed","HP","CF"
"HCO030",7,"Visually Inspect BOP's - Flarelines and Degasser
Lines","HP","CF"
"HDC040",1,"Rig Site Health and Safety Meeting (one/crew/month)","CF"
"HDC040",2,"C.A.O.D.C. Rig Safety Inspection Checklist
(one/rig/month)","CF"
"HDC040",3,"Mast Inspection Before Raising or Lowering","CF"
"HDC040",4,"Crown Saver Checked","CF"
"HDC040",5,"Motor Kills Checked","CF"
"HFU050",2300,2100,,
"HWE060",-5,"Deg C","COOL","WEST","SLIPPERY",,
"HCS070",1,177.8,,"mm",25.3,"STELCO","J-55",8,108.44,3.84,108.44,
"HCS070",2,114.3,,"mm",14.14,"STELCO","J-55",72,979.50,3.84,979.0,
"HDP080",1,127,79.4,"kg/m","E",57,127,"mm","3 1/2 IF",10,"DC",
"HDP080",2,89,19.7,"kg/m","E",68,120,"mm","3 1/2 IF",15,"DP",
"HPU090",1,"F-800","EMSCO",254,"mm",,,,
"HPU090",2,"F-800","EMSCO",254,"mm",,,,
"HTC100",1,"Rig up and tear down"
"HTC100",2,"Drill Actual"
"HTC100",3,"Reaming"
"HTC100",4,"Coring"
"HTC100",5,"Condition Mud & Circulate"
"HTC100",6,"Trips"
"HTC100",7,"Rig Service"
"HTC100",8,"Repair Rig"
"HTC100",9,"Cut off drilling line"
"HTC100",10,"Deviation Survey"
"HTC100",11,"Wire Line Logs"
"HTC100",12,"Run Case & Cement"
"HTC100",13,"Wait on Cement"
"HTC100",14,"Nipple up B.O.P."
"HTC100",15,"Test B.O.P."
"HTC100",16,"Drill Stem Test"
"HTC100",17,"Plug Back"
"HTC100",18,"Squeeze Cement"
"HTC100",19,"Fishing"
"HTC100",20,"Directional Work"
"HTC100",21,"Safety Meeting"
"HTC100",24,"WOD"
"HSS110",1,1,"SWACO","N","110",,"84",,
"HPA130","COMPTON BRANT 15-7-18-24",20041029,"COMPTON PETROLEUM
CORP.","TRAILBLAZER DRILLING CORP.","CURTIS
FIESEL","10","ALBERTA","N",253
"TCP130",1,,,,"kPa",140,,,,"mm",,
"TCP130",2,,,,"kPa",140,,,,"mm",,
"TCP130",3,,,,"kPa",140,,,,"mm",,
"TTL160",1,1,0.00,0.25,0.25,21,"SAFETY MEETING WITH TONG HAND"
"TTL160",1,2,0.25,1.75,1.50,12,"RIG TO AND RUN CASING"
"TTL160",1,3,1.75,2.00,0.25,7,"RIG SERVICE"
"TTL160",1,4,2.00,2.50,0.50,5,"CONDITION MUD & CIRC."
"TTL160",1,5,2.50,2.75,0.25,21,"SAFETY MEETING WITH CEMENTERS"
"TTL160",1,6,2.75,3.50,0.75,12,"RIG TO AND CEMENT CASING"
"TTL160",1,7,3.50,6.00,2.50,1,"SET SLIPS, TEAR OUT RIG, CLEAN TANKS"
"TTL160",1,8,6.00,8.00,2.00,24,"WAIT ON DAYLIGHT/TRUCKS"
"TTL160",1,9,,,,,"CEMENT WITH BJ USING 13 TONNES OF BVF-1500 NP + .7%
FL-5,GIVING 15.5 m3 OF GOOD"
"TTL160",1,10,,,,,"SLURRY @. 1718 kg/m3,PLUG BUMPED & HELD @. 03:30 HRS
OCT 29/04."
"TTL160",1,11,,,,,"RIG RELEASED @. 08:00 HRS OCT 29/04"
"TTL160",1,12,,,,,"MOVE TO 12-3-18-25W4"
"TDI170",1,"JEFF CASE",8,10,475,"Deg C",,,"RUNNING CASING",,,,,
"TLN175",1,"VISUALLY INSPECT PINS, RAMS AND STOOLS PRIOR TO LAYING
OVER DERRICK"
"TPA180",1,1,"DRILLER",647172865,"JEFF CASE",8,,,"JC"
"TPA180",1,2,"DERRICK HAND",648519056,"BRYAN VANHAM",8,,,"BV"
"TPA180",1,3,"MOTOR HAND",651056533,"NEIL WILLIAMS",8,,,"NW"
"TPA180",1,4,"FLOOR HAND",640352662,"TARAS WOITAS",8,,,"TW"
"TPI190",1,"REG",25,,,,,,
"TPI190",2,"REG",25,,,,,,
"TPI190",3,"REG",25,,,,,,
=====|||[posted and mailed, please reply in news]

Indra (itimilsina@.savannaenergy.com) writes:
> I am looking for information on how to import the txt or csv file to
> the multiple table in sql 2000. If you have any kind of inf. please
> let me know wheather we can do this and how.
> below is the detail information.
> I received txt file every day which contain the information from 3
> different related table in my sql 2000 database. Right now we are
> keyin the information from the web site (which is link to the txt
> file) to our database, but i am wondering if we can import those
> record in the tables.
> the header of the file goes to table1 and when we insert the record in
> table1, it should generate the autoidentityrecord (PK), and that PK is
> link to other table2 and table3 where rest of the information from txt
> file goes. For table2 and table3 there are multiple record per txt
> files.
> in our txt file each row is separated with row header, like HTC100
> WITH ROW NO. 1,2,3.., which indecate this information goes to table
> and 1,2...are the different row number.
> Please let me know whether we can achive this task or not.

Of course, it is possible. However, as far as I can see only by means of
writing a program that reads and parses the file. The standard tools for
loading files into SQL Server are BCP and DTS. BCP cannot cope with your
file, because there is a mix of record formats. BCP can only import files
with uniform records.

DTS is more versatile than BCP, but as far as I can see, you will still
have to write code to have DTS import the file. I need to add the
disclaimer that I have zero experience of DTS.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||I agree with Erland. Could it be done with BCP and TSQL? Yes. Would it be
efficient, maintainable, error-resilient code? No.

It would be far better to write a piece of code or better get the sender to
supply X number of files, one for each table. The original creator of the
file obviously has control over the file format.

Danny

"Erland Sommarskog" <esquel@.sommarskog.se> wrote in message
news:Xns95998D7498D77Yazorman@.127.0.0.1...
> [posted and mailed, please reply in news]
> Indra (itimilsina@.savannaenergy.com) writes:
>> I am looking for information on how to import the txt or csv file to
>> the multiple table in sql 2000. If you have any kind of inf. please
>> let me know wheather we can do this and how.
>>
>> below is the detail information.
>>
>> I received txt file every day which contain the information from 3
>> different related table in my sql 2000 database. Right now we are
>> keyin the information from the web site (which is link to the txt
>> file) to our database, but i am wondering if we can import those
>> record in the tables.
>>
>> the header of the file goes to table1 and when we insert the record in
>> table1, it should generate the autoidentityrecord (PK), and that PK is
>> link to other table2 and table3 where rest of the information from txt
>> file goes. For table2 and table3 there are multiple record per txt
>> files.
>>
>> in our txt file each row is separated with row header, like HTC100
>> WITH ROW NO. 1,2,3.., which indecate this information goes to table
>> and 1,2...are the different row number.
>>
>> Please let me know whether we can achive this task or not.
> Of course, it is possible. However, as far as I can see only by means of
> writing a program that reads and parses the file. The standard tools for
> loading files into SQL Server are BCP and DTS. BCP cannot cope with your
> file, because there is a mix of record formats. BCP can only import files
> with uniform records.
> DTS is more versatile than BCP, but I as far as can see, you will still
> have to write code to have DTS to import the file. I need to add the
> disclaimer that I have zero experience of DTS.
> --
> Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
> Books Online for SQL Server SP3 at
> http://www.microsoft.com/sql/techin.../2000/books.asp|||Danny (istdrs@.flash.net) writes:
> I agree with Erland. Could it be done with BCP and TSQL? Yes.

No. :-) Since there is a variable number of fields on each row, BCP would
be lost.

> It would be far better to write a piece of code or better get the sender
> to supply X number of files, one for each table. The original creator
> of the file obviously has control over the file format.

Yes, fixing the file format may very well be a good idea.

Interestingly enough, it could still be one file, if the shorter rows were
padded with extra fields. Note that just adding ,,,,, would not be enough,
since the " too are delimiters as far as BCP is concerned, so the usage of
" must be consistent from record to record. (Which does not seem to be the
case in the current file.)
--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se

Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp|||Hi Erland,

I am still trying to solve this problem of importing the txt file and
manipulating the data to insert into different tables in sql 2000. I have
a problem inserting an nvarchar value coming from the txt file into a
datetime field in a sql table.

If you look through my txt file there are rows starting with TTL160, and
col4 and col5 hold data like 1.00 and 12.23 etc. (which is in
nvarchar). I need to insert this value as 1:00 or 12:23 into another
table in a datetime field. I could use REPLACE(column, '.', ':') to
turn 12.23 into 12:23, but when I insert this into the sql datetime field
it gives me the error "Arithmetic overflow error converting expression
to data type datetime". I tried the Cast function, still the same error;
could Microsoft have a look please.

Thanks for help.

Indra.
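One possible cause, judging from the sample file: the TTL160 time columns look like decimal hours (0.25 = a quarter hour), so REPLACE turns 1.75 into '1:75', which is not a valid time and would trigger exactly this overflow. If that reading is right, the value needs an hours-and-minutes conversion rather than a character swap; a sketch in Python (the same arithmetic could be done in T-SQL before the CAST):

```python
# If the TTL160 columns hold decimal hours (0.25 = 15 minutes), then
# REPLACE('1.75', '.', ':') produces '1:75' -- not a valid time, and a
# plausible source of the arithmetic-overflow error on CAST to datetime.
# Convert the decimal value into hours and minutes instead.
def decimal_hours_to_hhmm(value):
    """'1.75' -> '01:45'; '12.5' -> '12:30'."""
    hours = float(value)
    h = int(hours)
    m = round((hours - h) * 60)
    if m == 60:  # guard against float rounding pushing minutes to 60
        h, m = h + 1, 0
    return f"{h:02d}:{m:02d}"

print(decimal_hours_to_hhmm("1.75"))  # 01:45
print(decimal_hours_to_hhmm("0.25"))  # 00:15
```

If the columns really are hh.mm rather than decimal hours, then REPLACE plus a minutes-range check (minutes < 60) would be the fix instead.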

*** Sent via Developersdex http://www.developersdex.com ***
Don't just participate in USENET...get rewarded for it!
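Erland's "write a program that reads and parses the file" can be prototyped in a few lines. This Python sketch groups rows by their leading record code; which code feeds which of the three tables is an assumption to be filled in from the real schema:

```python
# Sketch of a parser for the mixed-record-format file: group every row
# by its leading record code, using csv so quoted fields with embedded
# commas survive.  The mapping from record code to destination table is
# hypothetical and must come from the real schema.
import csv
import io
from collections import defaultdict

def parse_event_file(fileobj):
    """Return {record_code: [list of field lists]} for the whole file."""
    groups = defaultdict(list)
    for row in csv.reader(fileobj):
        if not row:
            continue  # skip blank lines
        groups[row[0]].append(row[1:])
    return groups

sample = '"HTC100",1,"Rig up and tear down"\n"HTC100",2,"Drill Actual"\n'
groups = parse_event_file(io.StringIO(sample))
print(groups["HTC100"])  # [['1', 'Rig up and tear down'], ['2', 'Drill Actual']]
```

Once the rows are grouped, the header record can be inserted first so the generated identity value is available as the foreign key when the table2 and table3 rows are inserted.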

importing to sql server from a flat file PK problem

I am trying to import into sql server from a .csv flat file I made in Excel. The problem is the Primary Key. It is set to auto increment.
When I leave an empty column I get:
"Directcopyxform conversion error:destination does not allow null"

When I just omit the column entirely, I get a wrong-datatype error because it basically shifts all the columns one position to the left.

How am I supposed to represent an auto PK in a .csv file? Thanks|||Use a staging table, where you can load the information, perform any processing, and then insert into your table, finally cleaning out the staging table at the end.
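Alongside the staging-table approach, the shifted-columns symptom can also be avoided by stripping the identity column out of the file and mapping the remaining columns explicitly in DTS, letting SQL Server generate the PK. A sketch; the column position is an assumption:

```python
# Remove the (empty) identity column from a CSV so the remaining columns
# can be mapped one-for-one in DTS while SQL Server generates the PK.
import csv
import io

def drop_column(fileobj, index=0):
    """Return the CSV text with the column at `index` removed from every row."""
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in csv.reader(fileobj):
        writer.writerow(row[:index] + row[index + 1:])
    return out.getvalue()

src = "id,name,qty\n,apples,3\n,pears,5\n"
print(drop_column(io.StringIO(src)))  # id column gone; name,qty rows remain
```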