Wednesday, March 28, 2012
Importing views from Oracle
I've been able to import tables successfully from Oracle
to a SQL Server database. However, when I try to import
views from Oracle it wants to import them as tables in
SQL Server. Any ideas how I can import views as views
from Oracle to SQL Server?
A view is a virtual table, so importing just the view would only bring over
the view definition, not any data. Import the data for the underlying tables
and then create the view yourself in SQL Server.
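If it helps, a rough sketch of that approach (the view and table names below
are just placeholders, and the Oracle query needs 9i or later):

-- On the Oracle side, script out the view definition:
SELECT DBMS_METADATA.GET_DDL('VIEW', 'CUSTOMER_SUMMARY') FROM DUAL;

-- On the SQL Server side, once the underlying tables have been imported,
-- recreate the view by hand, translating any Oracle-specific syntax:
CREATE VIEW dbo.CUSTOMER_SUMMARY
AS
SELECT CustomerID, CustomerName, Region
FROM dbo.CUSTOMERS;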
-Sue
On Mon, 23 Aug 2004 13:49:01 -0700, "Scott"
<anonymous@.discussions.microsoft.com> wrote:
>I've been able to import tables successfully from Oracle
>to a SQL Server database. However, when I try to import
>views from Oracle it wants to import them as tables in
>SQL Server. Any ideas how I can import views as views
>from Oracle to SQL Server?
Importing TXT files problems
Hello,
I'm fairly new to SQL and hopefully this is a simple question to answer.
I've been trying to import a TXT file into an SQL database and I'm having trouble making it work correctly. It is an ASCII text file where the fields are fixed width, and the widths vary from 2 characters up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.).
I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as characters with column headers of Col001, Col002, etc. I don't want everything as characters or Col001, etc. Some columns should be INT, NUMERIC(x,x), etc., but every time I change these values in DTS it won't import the data correctly.
Also, I have an SQL script that I wrote for the table where I can set the field lengths, data types, etc., FWIW. This seems to be going nowhere fast.
What am I doing wrong? Should I be using something other than DTS?
Any suggestions are greatly appreciated.
Thanks,
JB
We have had really good luck with BULK INSERT using tab-delimited data.
Maybe you can get the file tab delimited by importing the text into Excel and
saving it back out as a tab-delimited text file.
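Something along these lines once it's tab delimited (just a sketch - the table
name, file path, and terminators below are made up, so adjust them to yours):

BULK INSERT dbo.CityList
FROM 'C:\data\citylist.txt'
WITH (
    FIELDTERMINATOR = '\t',   -- tab between columns
    ROWTERMINATOR   = '\n'    -- newline between rows
)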
We also are a VB shop, so on top of that we have developed several different
versions of a TEXTIMPORT program - some that take the lengths of each SQL
column and cut the input data based on that, others that look for specific
items while cutting up the data. This way we can also deal with odd data
issues for various columns (the TEXTIMPORT uses a little .INI file that
controls specifics like that).
I believe that BULK INSERT and BCP also support format files ("control
files") that allow greater flexibility - maybe check BOL (Books Online)...
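For what it's worth, a format file can also describe fixed-width columns
directly, so you may not need the tab-delimited step at all. Roughly like this
(untested; the column names and widths here are only illustrative):

-- cities.fmt - a bcp/BULK INSERT format file; prefix length 0 plus an empty
-- terminator makes each field a fixed number of characters, and the last
-- field ends at the line break:
8.0
3
1   SQLCHAR   0   2    ""       1   STATE   SQL_Latin1_General_CP1_CI_AS
2   SQLCHAR   0   32   ""       2   CITY    SQL_Latin1_General_CP1_CI_AS
3   SQLCHAR   0   6    "\r\n"   3   ZIP     SQL_Latin1_General_CP1_CI_AS

-- then load with:
BULK INSERT dbo.CityList
FROM 'C:\data\citylist.txt'
WITH (FORMATFILE = 'C:\data\cities.fmt')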
"JonBlack" <anonymous@.discussions.microsoft.com> wrote in message
news:AE462262-F98D-40A1-BD57-6E2066BE2A6D@.microsoft.com...
> Hello,
> I'm fairly new to SQL and hopefully this is a simple question to answer.
> I've been trying to import a TXT file into an SQL database and I'm having
trouble making it work correctly. It is a ASCII text file where the fields
vary by the number of characters. This can be 2 characters up to 40 (STATE
would be 2 characters, CITY is 32 characters, etc.)
> I can import the file with DTS. I go in and select exactly where I want
the field breaks to be. Then it imports everything as Characters with column
headers of Col001, Col002 etc. I don't want everything as characters or
Col001 etc. Some columns would be INT, NUMERIC(x,x), etc. but everytime I
change these values in DTS it won't import the data correctly.
> Also, I have an SQL script that I wrote for a table where I can create the
field lengths, data type etc., FWIW. This seems to be going nowhere fast.
> What am I doing wrong? Should I be using something else than DTS?
> Any suggestions are greatly appreciated.
> Thanks,
> JB
Hi Steve,
Thanks for your reply. I wouldn't mind importing it into Excel and saving it in a better format, but unfortunately I have over 100,000 records, which rules Excel out (it tops out at 65,536 rows). I think BULK INSERT and BCP can't import files based on character widths, but they could use tabs as you suggested. Using VB to write an import script could be an option, though.
Thanks again,
JB
-- Steve Z wrote: --
We have had really good luck with BULK INSERT using TAB DELIMITED data.
Maybe you can get it to be TAB DELIMITED using EXCEL - IMPORT the text and
save as .TDF.
We also are a VB shop, so on top of that we have developed several different
versions of a TEXTIMPORT program - some that take the lengths of each SQL
column and cut the input data based on that, others that look for specific
items while cutting up the data. This way we can also deal with odd data
issues for various columns (the TEXTIMPORT uses a little .INI file that
controls specifics like that).
I believe that BULK INSERT and BCP have control files to allow greater
flexibility - maybe check BOL (books online)...
Are the columns fixed length or delimited by commas?
JonBlack wrote:
> Hello,
> I'm fairly new to SQL and hopefully this is a simple question to answer.
> I've been trying to import a TXT file into an SQL database and I'm having trouble making it work correctly. It is a ASCII text file where the fields vary by the number of characters. This can be 2 characters up to 40 (STATE would be 2 characters, CITY is 32 characters, etc.)
> I can import the file with DTS. I go in and select exactly where I want the field breaks to be. Then it imports everything as Characters with column headers of Col001, Col002 etc. I don't want everything as characters or Col001 etc. Some columns would be INT, NUMERIC(x,x), etc. but everytime I change these values in DTS it won't import the data correctly.
> Also, I have an SQL script that I wrote for a table where I can create the field lengths, data type etc., FWIW. This seems to be going nowhere fast.
> What am I doing wrong? Should I be using something else than DTS?
> Any suggestions are greatly appreciated.
> Thanks,
> JB
The columns are fixed length. For example, Column1 is 5 characters long, Column2 is 2 characters long, ColumnC is 30 characters long, ColumnD is 10 characters long, etc. Also, some of the columns are numeric and integer but are represented by characters until imported.
Thanks,
JB
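One way to get from the all-character load to properly typed columns is to
stage the raw data first and then convert it with CAST. A rough sketch (every
table and column name here is invented):

-- 1. Load the raw file into an all-character staging table
CREATE TABLE dbo.CityStage (
    STATE  VARCHAR(2),
    CITY   VARCHAR(32),
    POPUL  VARCHAR(10)
)

BULK INSERT dbo.CityStage
FROM 'C:\data\citylist.txt'
WITH (FORMATFILE = 'C:\data\cities.fmt')

-- 2. Convert into the real table with the proper types
INSERT INTO dbo.City (State, City, Population)
SELECT STATE,
       CITY,
       CAST(POPUL AS INT)   -- fails loudly if a row isn't numeric
FROM dbo.CityStage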
Wednesday, March 21, 2012
Importing LongText fields
Hi all,
I've got a large db I downloaded in a .SQL format that came originally from
MySQL. I'm trying to alter the code to allow me to run it under MS-SQL.
I've got it mostly translated over, but I'm running into a problem with
MySQL's LONGTEXT data type. I created a user-defined data type (using SQL 2K5)
called LONGTEXT that I mapped to VARCHAR(8000), and I'm still running
into problems with fields being too long to enter.
I guess the crux of my question is this: is there any way to go over the
8000-character limitation?
Regards,
Scott McNair
"Scott McNair" <scott.mcnair@.sfmco.takethispartout.com> wrote in message
news:Xns95DAA939F3E2Dsfmco@.207.46.248.16...
> Hi all,
> I've got a large db I downloaded in a .SQL format that came originally
> from
> MySQL. I'm trying to alter the code to allow me to run it under MS-SQL.
> I've got it mostly translated over, but I'm running into a problem with
> MySQL's LONGTEXT data type. I created a unique data type (using SQL 2K5)
> called LONGTEXT that I assigned to a VARCHAR(8000), and I'm still running
> into problems with fields being too long to enter.
> I guess the crux of my question is this: is there any way to go over the
> 8000-character limitation?
> Regards,
> Scott McNair
Take a look at text and ntext. You have a 2GB/1GB limit respectively.
HTH
Rick Sawtell
MCT, MCSD, MCDBA
|||"Rick Sawtell" <quickening@.msn.com> wrote in news:ODEu8w#9EHA.2568
@.TK2MSFTNGP10.phx.gbl:
> Take a look at text and ntext. You have a 2GB/1GB limit respectively.
Perfect. I created a UDDT associating "longtext" with text, and it imports
more-or-less perfectly.
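For reference, a minimal sketch of that mapping (the alias name is whatever
the MySQL dump's DDL expects):

-- map a "longtext" alias type onto text so the dump's column
-- definitions can run unmodified:
EXEC sp_addtype @typename = 'longtext', @phystype = 'text'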
My apologies for butting in on this reply, but I have the same issue: I need to
store roughly up to 16,000 characters in one field, and I tried text or ntext but
the max size value shown is 16. Can you tell me how to set up that field to accept
the 16,000 I need, or more?
Thanks
Xaviervp
"Scott McNair" wrote:
> "Rick Sawtell" <quickening@.msn.com> wrote in news:ODEu8w#9EHA.2568
> @.TK2MSFTNGP10.phx.gbl:
>
> Perfect. I created a UDDT associating "longtext" with text, and it imports
> more-or-less perfectly.
>
As Rick stated, the max text/ntext/image size is 2GB. You don't need to do
anything special.
The 16 you're seeing isn't a limit on the data: by default the row itself only
stores a 16-byte pointer to the text value, which is kept on separate pages.
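A quick illustration (the table name here is made up): nothing extra is needed
for the column itself, and keeping small values in the row is only an optional
per-table setting.

-- a plain TEXT column already holds up to 2 GB per value:
CREATE TABLE dbo.Notes (NoteID INT, Body TEXT)

-- optionally store values up to 7000 bytes in the data row itself instead of
-- off-row (this is what the "text in row" setting controls):
EXEC sp_tableoption 'dbo.Notes', 'text in row', '7000'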
Hope this helps.
Dan Guzman
SQL Server MVP
"xaviervp" <xaviervp@.discussions.microsoft.com> wrote in message
news:66C5708A-B2F2-44ED-BC46-727AFD090EEB@.microsoft.com...
> My apology for btting in this reply but I have the same issue where I need
> to
> store +- up to 16000 characters in one field and I tried text or ntext but
> the max size value is 16, can you tell me how to set up that field to
> accept
> the 16000 i need or more?
> Thanks
> Xaviervp
> "Scott McNair" wrote:
Wednesday, March 7, 2012
Importing data files into new instance of SQL Server
Hi Everyone,
I've got an .MDF file (database) and an .LDF file (transaction log) that
were created using SQL Server 7.0. We've built a new Windows server with
SQL Server 2000 installed and would like to import the files onto that
server. The problem is that we just have the files, the SQL Server 7.0
instance is gone. What is the simplest way to create a new database on the
SQL Server 2000 box with those files?
Thanks in Advance,
Jon
If the files were previously detached from the db using sp_detach_db,
you can just attach them using sp_attach_db. SQL Server should
do the upgrading for you. Of course, you'll need to drop and re-add
any users and roles.
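Something like this, assuming the files were cleanly detached (the database
name and file paths are placeholders):

EXEC sp_attach_db
    @dbname    = N'MyDatabase',
    @filename1 = N'D:\Data\MyDatabase.mdf',
    @filename2 = N'D:\Data\MyDatabase_log.ldf'

-- after the attach/upgrade, re-link any orphaned users to their logins, e.g.:
EXEC sp_change_users_login 'Auto_Fix', 'someuser'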
"Jon Westmore" <westmoj@.reidhosp.com> wrote in message
news:%23c8rSoq7EHA.936@.TK2MSFTNGP12.phx.gbl...
> Hi Everyone,
> I've got an .MDF file (database) and an .LDF file (transaction log) that
> were created using SQL Server 7.0. We've built a new windows server with
> SQL Server 2000 installed and would like to import the files onto that
> server. The problem is that we just have the files, the SQL Server 7.0
> instance is gone. What is the simplest way to create a new database on
the
> SQL Server 2000 box with those files?
> Thanks in Advance,
> Jon
>