Friday, March 30, 2012
Importing XML to SQL Server: National "Do Not Call" database
I need to import the "National Do Not Call" database into SQL Server. It is
in XML format (*.XML) and I understand it consists of 2 fields, area code and
phone number, and about 50 million records.
What is the best way to get this file into a SQL Server table?|||One option would be to look at using SQLXML and XML Bulk
Load. You can find more information, links and a download
link at:
http://msdn.microsoft.com/nhp/Default.asp?contentid=28001300
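For a sense of what the shredding step looks like, here is a minimal T-SQL sketch using OPENXML; the element names, target table and the tiny inline document are assumptions, since the actual file layout isn't shown (and for 50 million records you would use XML Bulk Load rather than loading the document into a variable):

-- assumed layout: <DoNotCall><Record><AreaCode/><PhoneNumber/></Record>...</DoNotCall>
DECLARE @hdoc int
DECLARE @xml varchar(8000)
SET @xml = '<DoNotCall>
  <Record><AreaCode>212</AreaCode><PhoneNumber>5550100</PhoneNumber></Record>
</DoNotCall>'

-- parse the document and get a handle to it
EXEC sp_xml_preparedocument @hdoc OUTPUT, @xml

-- shred the elements into relational rows
INSERT INTO dbo.DoNotCall (AreaCode, PhoneNumber)
SELECT AreaCode, PhoneNumber
FROM OPENXML(@hdoc, '/DoNotCall/Record', 2)
     WITH (AreaCode char(3), PhoneNumber varchar(10))

-- release the handle
EXEC sp_xml_removedocument @hdoc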
-Sue
On Thu, 18 Sep 2003 16:56:48 -0700, "DaveF"
<davef@.comcast.net> wrote:
>I need to import the "National Do Not Call" database into SQL server.
>It is in XML format (*.XML) and I understand it consists of 2 fields: area
>code and phone number and about 50 million records.
>What is the best way to get this file into a SQL Server table?
>
Importing XML for Newbies
I need to store log files that are created in XML format so I can query some
of the fields. I have only minor knowledge of XML, but have been a SQL DBA
for a while. I just have not had a need to use XML in the SQL environment. We
are using SQL2000. What is the best way to do the following:
1. Create a DB for storing the log files? Some variation on the usual way?
2. Import the individual files into the DB. Eventually I can write a DTS or
something to import?
3. Query the DB. Create some reports on usage.
If someone could post a link to get me started it would be very helpful.|||DaveK wrote:
> I need to store log files that are created in XML format so I can query
> some of the fields.
Why not just use an XQuery implementation to query them direct? I see no
requirement here to involve a database at all.
///Peter
> I have only minor knowledge of XML, but have been SQL DBA for
> a while. I just have not had a need to use XML in the SQL environment. We
> are using SQL2000. What is the best way to do the following:
> 1. Create a DB for storing the log files? Some variation on the usual way?
> 2. Import the individual files into the DB. Eventually I can write a DTS or
> something to import?
> 3. Query the DB. Create some reports on usage.|||After a five-minute web search and SQL help search I don't see how I can
avoid using SQL to build some sort of database just so I can address backup,
security, etc. It looks like XPath and XQuery can do some searching, but the
other items that I need to cover are not really addressed. This is an
example of one event log item I want to store.
<CreateDate>7/31/2007</CreateDate>
<CreateTime>10:19:25</CreateTime>
<Logger>xxxxd7yrrt11</Logger>
<Events>
<Event>
<EventID>8001</EventID>
<Description>Login</Description>
<Category>Audit</Category>
<Source>LAN Client</Source>
<SubSource>xxxx_8</SubSource>
<UserName>test.name</UserName>
<UserID>347</UserID>
<Computer>xxx7YRRT11</Computer>
<Date>07/31/2007</Date>
<Time>10:19:49</Time>
<ObjectType>User</ObjectType>
<Details></Details>
</Event>
I have to admit I am not a convert to the XML world, but as I said, I am a
newbie to it as well. In this case it just seems like a fancy way to
eliminate delimited importing. The format is not likely to change.|||DaveK wrote:
> After a five-minute web search and SQL help search I don't see how I can
> avoid using SQL to build some sort of database just so I can address backup,
> security, etc. It looks like XPath and XQuery can do some searching, but the
> other items that I need to cover are not really addressed. This is an
> example of one event log item I want to store.
> <CreateDate>7/31/2007</CreateDate>
> <CreateTime>10:19:25</CreateTime>
> <Logger>xxxxd7yrrt11</Logger>
> <Events>
> <Event>
> <EventID>8001</EventID>
> <Description>Login</Description>
> <Category>Audit</Category>
> <Source>LAN Client</Source>
> <SubSource>xxxx_8</SubSource>
> <UserName>test.name</UserName>
> <UserID>347</UserID>
> <Computer>xxx7YRRT11</Computer>
> <Date>07/31/2007</Date>
> <Time>10:19:49</Time>
> <ObjectType>User</ObjectType>
> <Details></Details>
> </Event>
> I have to admit I am not a convert to the XML world, but as I said, I am a
> newbie to it as well. In this case it just seems like a fancy way to
> eliminate delimited importing. The format is not likely to change.
AFAIK all database systems now offer some kind of "Import XML" plugin.
If yours doesn't, you'll need to turn the XML into CSV or whatever your
system consumes. The easiest way to do this is to write an XSLT script,
and I think there are several quoted or linked in Dave Pawson's XSL FAQ
at http://www.dpawson.co.uk/xsl/
XML is just a fancy way of identifying information: you can see from the
above example that it is much clearer in naming items and positioning
them in the hierarchy than (for example) CSV. If the format is stable,
then a little routine to run XSLT over the data and spit out CSV should
do you just fine.
The following appears to work for the sample above (with the addition of
the missing </Events> end-tag and the enclosing root element
<data>...</data> ):
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                version="1.0">

  <xsl:output method="text"/>
  <xsl:strip-space elements="*"/>

  <xsl:template match="/">
    <xsl:apply-templates select="data/CreateDate"/>
    <xsl:apply-templates select="data/CreateTime"/>
    <xsl:apply-templates select="data/Logger"/>
    <xsl:apply-templates select="data/Events/Event/*"/>
  </xsl:template>

  <!-- fields that start a record -->
  <xsl:template match="CreateDate|EventID">
    <xsl:text>"</xsl:text>
    <xsl:value-of select="."/>
    <xsl:text>"</xsl:text>
  </xsl:template>

  <!-- fields that occur in mid-record -->
  <xsl:template match="*">
    <xsl:text>,"</xsl:text>
    <xsl:value-of select="."/>
    <xsl:text>"</xsl:text>
  </xsl:template>

  <!-- fields that end a record: close the field and end the line -->
  <xsl:template match="Logger|Details">
    <xsl:text>,"</xsl:text>
    <xsl:value-of select="."/>
    <xsl:text>"&#10;</xsl:text>
  </xsl:template>

</xsl:stylesheet>
$ java -jar /usr/local/saxon/b8.5/saxon8.jar -o test.csv test.xml test.xsl
$ cat test.csv
"7/31/2007,"10:19:25","xxxxd7yrrt11"
"8001,"Login","Audit","LAN
Client","xxxx_8","test.name","347","xxx7YRRT11","07/31/2007","10:19:49","Use
r",""
$
This makes the assumption that your import routine can do something
different with record #1...
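On the SQL Server side, a minimal sketch of loading the resulting CSV; the table and file names are assumptions, and note that BULK INSERT does not strip the quotation marks, so you would either generate unquoted output or clean the values up afterwards:

BULK INSERT dbo.EventLog   -- hypothetical staging table
FROM 'C:\logs\test.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2           -- one way to treat record #1 differently: skip it
)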
///Peter
--
XML FAQ: http://xml.silmaril.ie/|||Thank you. This is very helpful. I did not paste in the whole log so missed
the ending tags.
Wednesday, March 28, 2012
Importing textfiles in to text fields how ?
Hello,
I have text files.
(Less than a hundred files, sizes between 3 K and 150 K)
I would like to import these files into the database, first
in a table called : Long_text_table.
Each text file goes into only one field.
The id and label fields will be assigned by hand with the
correct values, so that the long text can be moved to
the correct field in the destination table with an sql statement.
How can I get the texts in to this Long_text_table ?
(Using the standard SQL-server tools).
ben brugman
The import table will be something like :
CREATE TABLE [dbo].[Long_text_table] (
[id1] [int] IDENTITY (1, 1) NOT NULL ,
[id2] [int] NULL ,
[id3] [int] NULL ,
[label1] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[label2] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[longtext] [text] COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
You can do it with DTS as outlined in
http://www.sqldts.com/?246
Set the field and row delimiter of the Import text file to a combination of
characters that you are sure do not appear in the text file and the whole
file will be treated as one field.
You can just type over the list with delimiters in the DTS designer, you are
not limited to {CR}{LF} etc.
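A rough T-SQL analogue of the same trick, as a sketch: assume a hypothetical one-column staging table and terminators that never occur in the files.

-- hypothetical staging table: one text column per file
CREATE TABLE dbo.Long_text_staging (longtext text NOT NULL)

-- '\0' (the null character) is assumed not to appear in the text files,
-- so each whole file arrives as a single field in a single row
BULK INSERT dbo.Long_text_staging
FROM 'C:\texts\file01.txt'
WITH (FIELDTERMINATOR = '\0', ROWTERMINATOR = '\0')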
Jacco Schalkwijk
SQL Server MVP
"ben brugman" <ben@.niethier.nl> wrote in message
news:eCvSwC8pEHA.3252@.TK2MSFTNGP14.phx.gbl...
> Hello,
> I have text files.
> (Less than a hundred files, sizes between 3 K and 150 K)
> I would like to import these files into the database, first
> in a table called : Long_text_table.
> Each text file goes into only one field.
> The id and label fields will be assigned by hand with the
> correct values, so that the long text can be moved to
> the correct field in the destination table with an sql statement.
> How can I get the texts in to this Long_text_table ?
> (Using the standard SQL-server tools).
> ben brugman
>
> The import table will be something like :
> CREATE TABLE [dbo].[Long_text_table] (
> [id1] [int] IDENTITY (1, 1) NOT NULL ,
> [id2] [int] NULL ,
> [id3] [int] NULL ,
> [label1] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [label2] [varchar] (50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
> [longtext] [text] COLLATE SQL_Latin1_General_CP1_CI_AS NULL
> ) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
> GO
>
|||Thanks for your attention,
I'll have a look into this. (On the first glance this is not familiar to
me).
For importing files in a single field I have found
the DTS readfile transformation. I tried this with one file
and it works.
Next week I'll try to integrate both methods.
(If I understand the example then).
Otherwise I have to type the filenames by hand one at a
time. (Not a huge problem).
thanks again
ben
"Jacco Schalkwijk" <jacco.please.reply@.to.newsgroups.mvps.org.invalid > wrote
in message news:ubR5bS8pEHA.2456@.TK2MSFTNGP10.phx.gbl...
> You can do it with DTS as outlined in
> http://www.sqldts.com/?246
> Set the field and row delimiter of the Import text file to a combination of
> characters that you are sure do not appear in the text file and the whole
> file will be treated as one field.
> You can just type over the list with delimiters in the DTS designer, you are
> not limited to {CR}{LF} etc.
> --
> Jacco Schalkwijk
> SQL Server MVP
>
> "ben brugman" <ben@.niethier.nl> wrote in message
> news:eCvSwC8pEHA.3252@.TK2MSFTNGP14.phx.gbl...
>
importing text in SQL server
Some of the text fields contain special characters (accented letters, for
example); when I import a text like this into SQL Server I get:
??le-de-France
What can I do to prevent this from happening? (other fieldtype?)|||Using Unicode is a good choice... That maps to the NTEXT/NVARCHAR data types in SQL Server|||Could you please describe the steps I need to take to make this work?
I can't get it done.. :$
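In outline, a minimal sketch (the table and the sample value here are just illustrations): the target column must be a Unicode type, and string literals need the N prefix, otherwise the value is converted through the current code page and unsupported characters become ?.

CREATE TABLE dbo.Regions (name nvarchar(100))

-- without the N prefix this literal would be a non-Unicode string and the
-- accented character would be lost before it ever reached the column
INSERT INTO dbo.Regions (name) VALUES (N'Île-de-France')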
Friday, March 23, 2012
Importing Null Date Fields
I'm using SQL Server Express and am trying to import a CSV file. The CSV file contains a string field (named DAS) that represents a Date. This field can be null.
I've tried using the DTS Wizard to import this CSV file and convert the DAS field to a Date, which works great until it hits a record with a NULL DAS field. It then throws a conversion error.
Still using the DTS Wizard, I've changed the DataType of the DAS field in the Source file to [DT_DATE], it works fine but all the null dates are converted to 12/30/1899.
Is there a way (DTS Wizard or something else) that will allow me to import these CSV files with null Date fields and keep them as null in the SQL Server table?
Thanks for any help,
Jon
Hi,
I read your post and I can advise you as follows:
1. Once you have the date 12/30/1899 instead of NULL values in SQL Server, you can easily sort those records, mark them, and replace the values with NULL using copy and paste. It is easy and quick.
2. You may write a program instead of using the DTS Wizard (I don't know that wizard). The program will read from your CSV file and write to SQL Server, and from it you will have full control over all fields. I personally prefer to write a program when I have an unusual case.
I hope this advice solves the problem. But if you still can't get NULL values, let me know.
Regards,
Hristo Markov
Off the top of my head, I'm thinking there is an option to "Keep Nulls" inside the wizard. I haven't run it and I'm not on my machine w/ SSIS installed so I can't verify it. If you do see such an option, that is how you tell SSIS to preserve the NULLs. If this isn't an option, then you'll have to go one step deeper and either do custom transformations or write an Execute SQL Task that will go through and update any record of 12/30/1899 to be NULL.
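For the Execute SQL Task route, the statement is a one-liner; the table name is an assumption, DAS is the column from the original post:

-- run after the load to turn the placeholder date back into NULL
UPDATE dbo.ImportedCalls
SET DAS = NULL
WHERE DAS = '18991230'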
Wednesday, March 21, 2012
importing LongText fields
Hi all,
I've got a large db I downloaded in a .SQL format that came originally from
MySQL. I'm trying to alter the code to allow me to run it under MS-SQL.
I've got it mostly translated over, but I'm running into a problem with
MySQL's LONGTEXT data type. I created a unique data type (using SQL 2K5)
called LONGTEXT that I assigned to a VARCHAR(8000), and I'm still running
into problems with fields being too long to enter.
I guess the crux of my question is this: is there any way to go over the
8000-character limitation?
Regards,
Scott McNair
"Scott McNair" <scott.mcnair@.sfmco.takethispartout.com> wrote in message
news:Xns95DAA939F3E2Dsfmco@.207.46.248.16...
> Hi all,
> I've got a large db I downloaded in a .SQL format that came originally
> from
> MySQL. I'm trying to alter the code to allow me to run it under MS-SQL.
> I've got it mostly translated over, but I'm running into a problem with
> MySQL's LONGTEXT data type. I created a unique data type (using SQL 2K5)
> called LONGTEXT that I assigned to a VARCHAR(8000), and I'm still running
> into problems with fields being too long to enter.
> I guess the crux of my question is this: is there any way to go over the
> 8000-character limitation?
> Regards,
> Scott McNair
Take a look at text and ntext. You have a 2GB/1GB limit respectively.
HTH
Rick Sawtell
MCT, MCSD, MCDBA
|||"Rick Sawtell" <quickening@.msn.com> wrote in news:ODEu8w#9EHA.2568
@.TK2MSFTNGP10.phx.gbl:
> Take a look at text and ntext. You have a 2GB/1GB limit respectively.
Perfect. I created a UDDT associating "longtext" with text, and it imports
more-or-less perfectly.
|||My apology for butting in this reply but I have the same issue where I need to
store +- up to 16000 characters in one field and I tried text or ntext but
the max size value is 16, can you tell me how to set up that field to accept
the 16000 i need or more?
Thanks
Xaviervp
"Scott McNair" wrote:
> "Rick Sawtell" <quickening@.msn.com> wrote in news:ODEu8w#9EHA.2568
> @.TK2MSFTNGP10.phx.gbl:
>
> Perfect. I created a UDDT associating "longtext" with text, and it imports
> more-or-less perfectly.
>
|||As Rick stated, the max text/ntext/image size is 2GB. You don't need to do
anything special.
The reported 16-byte length is the default text-in-row length. The default
length of 16 will hold the pointer to the separately stored value.
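As an aside, the text-in-row behavior can be adjusted with sp_tableoption; the table name here is hypothetical:

-- store text values up to 2000 bytes in the data row itself,
-- instead of only the 16-byte pointer
EXEC sp_tableoption 'dbo.MyTable', 'text in row', '2000'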
Hope this helps.
Dan Guzman
SQL Server MVP
"xaviervp" <xaviervp@.discussions.microsoft.com> wrote in message
news:66C5708A-B2F2-44ED-BC46-727AFD090EEB@.microsoft.com...
> My apology for butting in this reply but I have the same issue where I need
> to
> store +- up to 16000 characters in one field and I tried text or ntext but
> the max size value is 16, can you tell me how to set up that field to
> accept
> the 16000 i need or more?
> Thanks
> Xaviervp
> "Scott McNair" wrote:
Importing from Access question
Hello,
I have a SQL Server database that was created by importing an MS Access
database.
In one of my SQL tables I have added several fields, and have populated some
of these fields with data.
Is there any way to create a DTS package, or do a straight import, where I
pull all records from the original table in Access, but without losing the
data in the new fields in SQL?
Basically I want to rewrite all my SQL data, but retain the new values...
For ex. if I have a record with ID 1000 in both database tables, but in my
SQL I have an extra field, with data...can I import all records from Access,
but for record with ID 1000, retain my new data?
Thanks in advance!
Amber|||Yes. In Access, create a Linked Table (Go File, Get External Data, Link
Tables, and create a linked table that points to the SQL Server Table...
Then in Access, create an Update Query, that updates the Linked Table values
for the columns that are in Access which is based on a join between the
Access table and the SQL Table, on whatever the PK is on both sides..
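A sketch of such an update query in Access SQL; the table and column names are assumptions (Access typically names a linked table dbo_Product):

UPDATE dbo_Product INNER JOIN Product_local
    ON dbo_Product.ID = Product_local.ID
SET dbo_Product.SomeField = Product_local.SomeField;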
"amber" wrote:
> Hello,
> I have a SQL Server database that was created by importing an MS Access
> database.
> In one of my SQL tables I have added several fields, and have populated some
> of these fields with data.
> of these fields with data.
> Is there any way to create a DTS package, or do a straight import, where I
> pull all records from the original table in Access, but without losing the
> data in the new fields in SQL?
> Basically I want to rewrite all my SQL data, but retain the new values...
> For ex. if I have a record with ID 1000 in both database tables, but in my
> SQL I have an extra field, with data...can I import all records from Access,
> but for record with ID 1000, retain my new data?
> Thanks in advance!
> Amber
Importing from access
I then imported the same table into MS Query and the date field pulled the data into Excel.
So there is data when I pull it into Excel, but none when I pull it into Crystal.
Any suggestions?|||Check the settings for --> File / Options / Fields / DateTime
Importing from a Flat File
I have a flat file which has five fields, and need to import it into a table that has six fields, the first of which will be data from a variable.
I have used a Data Flow task with a Flat File Source. I need help to get the data from that into the table.
Table: field1     field2 field3 field4 field5 field6
       @Variable  Col0   Col1   Col2   Col3   Col4   (file columns)
Can anyone explain how I can achieve this?
I would suggest using a derived column task to add a column to the pipeline that can be set to your variable.
Then I would use an OLE DB destination. You can set the target table in the OLE DB Destination.
Place the derived column task right after the flat file source then pipe the data into the destination.
HTH
Mark
http://spaces.msn.com/mgarnerbi
Monday, March 19, 2012
Importing fixed field length files
Hi,
I'm importing data from a file, and saving it into a table, nothing complex.
The file contains fixed length fields, and the table is integers ( default
NULL ). When a value in the file is NULL, it is simply a series of spaces
indicating no value.
When EM parses the spaces, instead of inserting NULL into the table (because
it didn't find a digit), it complains about incompatable types?
Is the parsing engine really so stupid that it cannot convert spaces to a
single NULL when it knows it should be looking for numbers?!
How can I make this work?
Joe
Does no-one know how to do the conversion?
"JoeB" <joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote in message
news:e$f0MfchEHA.3944@.tk2msftngp13.phx.gbl...
> Hi,
> Im importing data from a file, and saving it into a table, nothing
> complex.
> The file contains fixed length fields, and the table is integers ( default
> NULL ). When a value in the file is NULL, it is simply a series of spaces
> indicating no value.
> When EM parses the spaces, instead of inserting NULL into the table
> (because it didn't find a digit), it complains about incompatable types?
> Is the parsing engine really so stupid that it cannot convert spaces to a
> single NULL when it knows it should be looking for numbers?!
>
> How can i make this work?
>
> Jeo
>
>
|||Hiya,
There are two ways I can think of doing this, the quick dirty way and
the complicated clever way,
1. Import the integer into a text field and then use an update
query to populate the integer field. You can then use a combination
of trim and case statements to get the values you want (see the sketch
below). It'll work, but not ideal.
2. I assume you're using DTS to import the data. In which case
you can change the transformation for this column from a simple copy
column to an ActiveX script. You can then create some VB script to
change the value to null if it finds all spaces.
If it was me then it would depend on whether this is a one off or a
regular job. For a one off I would do option 1, if I'm going to have
to do this regularly then I would take the time and do option 2.
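A sketch of the option 1 update, assuming the staging table holds the raw text column alongside the integer column (names are hypothetical):

-- blank (all-space) values become NULL, everything else is converted
UPDATE dbo.Import_staging
SET int_value = CASE WHEN LTRIM(RTRIM(raw_value)) = '' THEN NULL
                     ELSE CAST(raw_value AS int)
                END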
Darious
On Tue, 24 Aug 2004 11:12:19 +0100, "JoeB"
<joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote:
>Does no-one know how to do the conversion?
>
>"JoeB" <joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote in message
>news:e$f0MfchEHA.3944@.tk2msftngp13.phx.gbl...
>
Cheers, I'm going to try option one as I cannot write VB code.
Thanks.
Joe
"Darious" <news@.areyoukidding.com> wrote in message
news:q2dni0haq1pemlcbvmnpbmg37js7q5a5em@.4ax.com...
> Hiya,
> There are two ways I can think of doing this, the quick dirty way and
> the complicated clever way,
> 1. Import the integer into a text field and then use an update
> query to populate the integer field. You can then use a combination
> of trim and case statements to get the values you want. It'll work,
> but not ideal.
> 2. I assume you're using DTS to import the data. In which case
> you can change the transformation for this column from a simple copy
> column to an ActiveX script. You can then create some VB script to
> change the value to null if it finds all spaces.
> If it was me then it would depend on whether this is a one off or a
> regular job. For a one off I would do option 1, if I'm going to have
> to do this regularly then I would take the time and do option 2.
> Darious
> On Tue, 24 Aug 2004 11:12:19 +0100, "JoeB"
> <joe@.kybert__***NO_SPAM*sdfsdfsd****___.com> wrote:
>
Importing EBCDIC file with COMP3 fields
Hi All,
I have a file with several columns in Comp-3.
I have downloaded the UnPack Decimal component; as it needs a byte stream (DT_BYTES) as input, I set up the Flat File Source columns appropriately.
But I get this error:
[Flat File Source [2201]] Error: Data conversion failed. The data conversion for column "DTCDC" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
[Flat File Source [2201]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "output column "DTCDC" (2250)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "DTCDC" (2250)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
DTCDC is the first of the packed columns. These are mostly date columns packed into 5 bytes that should be unpacked to a normal SQL date.
I've tried different locales, as suggested in other threads, but it didn't help.
Can anybody help me with this issue? How can I do it (a VB.NET script, importing as String?).
Thanks in advance
Michal
Did you remember to go into the advanced editor and mark the fields as binary with the correct column width?
|||Hi EWisdahl,
Indeed it helped
UnPackDecimal does a pretty good job too.
Thanks,
Michal
|||Just as a reminder, if your question is answered, mark the reply as the answer. It helps out those who are looking for answers to the same type of questions later, as well as those who try to look through all of the unanswered posts...|||If your file gets too complicated there is a custom source component at www.aminosoftware.com that is supposed to do conversion to ASCII on the fly including COMP-3 with redefines, occurs, and other nasty EBCDIC nuances.
Importing Decimal Data Types into SQL Server 2005
I have a simple Integration Services project and the problem is that decimal fields are importing as real (I'm losing the digits behind the decimal point).
The project contains a data flow task importing a flat file (.csv) to an SQL Server destination. My .csv file has two decimal type fields. Example:
Field 1: 12345.67
Field 2: .123456
My database table that I'm importing to has two fields. The only way that I can get this data to import is to define the fields as "float" in both the text file and database table. I want the database table fields to be defined as decimal or numeric, not float:
Field 1: decimal(7,2)
Field 2: decimal(6,6)
When all fields are defined as decimal (in both the flat file and database file), I get the following results:
Field 1: 12345.00
Field 2: .000000
How does one import decimal data from a flat file (.csv)?
Thank you in advance!
I answered my own question. My eyes were just not seeing the DataScale property of the Advanced connection manager editor window. Once I plugged in the scale value my decimals imported perfectly.
jn
Wednesday, March 7, 2012
Importing Data from Excel to update existing fields
I think I've figured out how to use DTS and update the two fields, but I'm afraid that when everything runs, new entries will be created with component information. Is it possible to specify that only rows where Pf_ID matches some row in column A will have that same row's column B used to update the data in In_Stock? I may have made things more confusing than they need to be, but I don't have much experience with EM or Excel.
I'm also considering trying to write a macro that will match Pf_IDs in an exported Excel file of the products table and take rows with current quantity information out of the Excel file, putting them in a new Excel file to import into the website's database.
Please help, this is getting really confusing.|||I would bcp the data into a staging table, then use SQL to do whatever you have to do.
But that's just me
MOO|||I'm not sure what bcp stands for... I've only seen DTS. I think your idea of using a staging table to query data from sounds good, but I kind of want this to be as simple a process as possible. Something that relatively anyone with instructions could click buttons and have done routinely.|||use dts to bring the excel data into a staging table
(a table created just to store data in the middle of a coomplex import)
then use an update statement against the staged data to update the destination data
(use the transact sql task)
For a more straightforward (but more advanced) method you might consider using the data driven query task and you could eliminate the staging table.
once you get this running you can set it up as a batch file or script or even call it from a sp. so anyone can run it.|||I've called tech support and checked permissions on the product table, but niether of us can seem to get this query to work when importing data from an excel workbook, Peach$. Is there something wrong with my query?
Here's what the support said when he tried:
It doesn't appear to be working running as SA either. I get an error at Line 3 near ')' and then the db is read-only.
I cleared active connections to the DB and even stopped access, but it still appears to be locked by something, or perhaps the table is protected somehow?
--Raj
Here is the actual query:
update ASI.dbo.Product
set ASI.dbo.Product.In_Stock = (select `Peach$`.`Quantity On Hand`
from `Peach$`
where ASI.dbo.Product.Pf_ID = `Peach$`.`Item ID`)
where exists
(select `Peach$`.`Quantity On Hand`
from `Peach$`
where ASI.dbo.Product.Pf_ID = `Peach$`.`Item ID`);
Sorry for the lag time with a reply. I hope this gets read now.
Importing data from Excel to SQL
The field is a DateTime field, and when imported it gets a date of 1900-01-01 00:00:00.000. I need that field to be NULL. Is there a way to represent NULL in Excel so that when it is imported it is not transformed to a date?
Thanks|||The problem is not how your data is represented in Excel, but how it is being imported, or possibly the defaults on your SQL table. How are you importing the Excel data?
blindman
Importing data & inserting value
1. Drop & recreate a temp table with an additional column as a default value based on the corresponding file extension (manually entered into the DTS routine).
2. Import a single file into a temp table.
3. Transfer the temp table data to the permanent table with Keep Null Values ticked.
I've only got as far as importing 2 of the files (which works!) but it is getting messy already with 10 steps so far!
There must be an easy way, as a file is being imported, to add a value to each row rather than going through all these processes!
Can anyone help?
Thanks in advance!|||You can use the bcp import utility with a format file. See BOL for more info on format files.
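A command-line sketch of that; the database, table, data file and format file names are all assumptions (-f names the format file, -S the server, -T uses a trusted connection):

bcp MyDb.dbo.Long_text_import in C:\data\textfiles.dat -f C:\data\textfiles.fmt -S myserver -T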
Friday, February 24, 2012
importing chinese text
I have some multibyte characters and I want to put them into table fields.
How can I do this? Just storing them results in a ? when reading /
showing them.
thanks & regards
Mark|||I'm doing a similar thing without any problems.
My Chinese characters are in a text file, I used DTS to import them
into my SQL DB, they are shown on web pages perfectly.
How did you read/show them?
SL at http://www.source4book.com