Friday, March 30, 2012
Importing XML in SQL Server 2005 in .Net 2.0
The XML Schema is structured in such a way that it does not have annotations
suitable for SQLXML.
You mentioned loading XML into tables, but how do I update the table
into the database (I assume you mean the Table data structure in ADO.NET)?
Thanks
Neo
Hello Neo,
Okay, so let me ask this: are you simply interested in loading the XML into
the database in some batch operation, or do you both load the original data
into SQL Server and modify stored values from XML?
If this is a simple ETL (former) case, SSIS would be my tool of choice.
If you're looking to both load and update from XML, SSIS is still usable,
but the solution might look a little different.
If you're simply putting the XML into an XML-typed column, consider using openrowset(bulk)
instead.
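For example, a rough sketch of that route from .NET 2.0 (the table, column, and file path below are only placeholders, and the file has to be readable by the SQL Server service account):

// Minimal sketch: load an XML file into an xml-typed column via OPENROWSET(BULK).
// Assumes a table like: CREATE TABLE dbo.XmlStaging (Id int IDENTITY, Doc xml).
using System;
using System.Data.SqlClient;

class LoadXmlWithOpenRowset
{
    static void Main()
    {
        const string connectionString =
            "Data Source=.;Initial Catalog=Test;Integrated Security=True";

        const string sql = @"
            INSERT INTO dbo.XmlStaging (Doc)
            SELECT CAST(BulkColumn AS xml)
            FROM OPENROWSET(BULK 'C:\data\orders.xml', SINGLE_BLOB) AS src;";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            connection.Open();
            int rows = command.ExecuteNonQuery();
            Console.WriteLine("Rows inserted: {0}", rows);
        }
    }
}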
More details helpful. :)
Thanks!
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/
Importing XML in SQL Server 2000 in .Net 2.0
I realized I have SQL Server 2000 for this problem, and not 2005
(too many projects, I guess). So SSIS won't be possible. Same goes for
the XML-typed column.
Is there an option for bulk upload in SQL Server 2000?
Kent Tegels wrote:
> Hello Neo,
> Okay, so let me ask this: are you simply interested in loading the XML into
> the database in some batch operation, or do you both load the original data
> into SQL Server and modify stored values from XML?
> If this is a simple ETL (former) case, SSIS would be my tool of choice.
> If you're looking to both load and update from XML, SSIS is still usable,
> but the solution might look a little different.
> If you're simply putting the XML into an XML-typed column, consider using openrowset(bulk)
> instead.
> More details helpful. :)
> Thanks!
> Kent Tegels
> DevelopMentor
> http://staff.develop.com/ktegels/
Hello Neo,
N> I realized I have SQL Server 2000 for this problem, and not
N> 2005 (too many projects, I guess). So SSIS won't be possible. Same
N> goes for the XML-typed column.
BCP or SQLXMLBulkLoad then. Not an easy problem to solve with those tools.
If you can use the .NET 2.0 framework, you might want to look into the SqlBulkCopy
class. That might help.
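For example, a rough sketch of that approach, assuming the XML can be read into a DataSet and that a destination table with matching columns already exists (all of the names below are placeholders):

// Minimal sketch: read an XML file into a DataSet and bulk copy one of its
// tables into SQL Server 2000 with SqlBulkCopy (.NET 2.0). Assumes the XML
// maps to a DataTable named "Order" and that dbo.Orders has matching columns.
using System;
using System.Data;
using System.Data.SqlClient;

class BulkCopyFromXml
{
    static void Main()
    {
        DataSet ds = new DataSet();
        // Infers a schema from the data; call ReadXmlSchema first if you have an XSD.
        ds.ReadXml(@"C:\data\orders.xml");

        const string connectionString =
            "Data Source=myServer;Initial Catalog=Test;Integrated Security=True";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
            {
                bulkCopy.DestinationTableName = "dbo.Orders";
                bulkCopy.WriteToServer(ds.Tables["Order"]);
            }
        }
    }
}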
Thanks!
Kent
Wednesday, March 28, 2012
importing users into membership database
Hi, I have a new site built in .NET using the login/membership system. I have a legacy database (Access) of users from the previous site with email, name, password, etc.
Is it possible to import this data straight into my new membership tables through some insert statement, software, or some other way?
You can do it in code. Use the method CreateUser.
That is: Membership.CreateUser(...
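A rough sketch of that call in a loop, assuming the legacy users have already been read into a DataTable and that the provider does not require a password question and answer (the column names are placeholders):

// Minimal sketch: create membership accounts from legacy rows with
// Membership.CreateUser. Adjust the column names to the real Access schema.
using System;
using System.Data;
using System.Web.Security;

public static class LegacyUserImport
{
    public static void Import(DataTable legacyUsers)
    {
        foreach (DataRow row in legacyUsers.Rows)
        {
            MembershipCreateStatus status;
            Membership.CreateUser(
                (string)row["Name"],       // user name
                (string)row["Password"],   // plain-text legacy password; the provider encodes it
                (string)row["Email"],      // email
                null, null,                // no password question/answer
                true,                      // approved
                out status);

            if (status != MembershipCreateStatus.Success)
                Console.WriteLine("Could not import {0}: {1}", row["Name"], status);
        }
    }
}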
Importing text file
Hi and TIA. I posted this to a vs.net NG but after looking thought it might
fit in here better.
Using VS.Net 2003 I built an ASP.Net Web application using
VB.Net. I connect to an SQL db(SQL Server 2000) and am setup to
write/read/edit data no problem. I'm having problems with the following
scenario. Any help, advice, or if you could point me in the right direction
is highly appreciated.
1. The user clicks a button which opens the Find file dialogue box.
2. They navigate to the text(or possible dbf file) and click ok.
3. A temp table is created and the text (dbf) file is imported into
the temp table.
4. A procedure is run that takes my active data (an existing table) and
archives(appends) that data to an archive table. Before this though the
data in the archive table needs to be deleted.
5. The data from my live table is deleted.
6. Data from my imported temp table is appended to the live table.
7. Temp table is deleted.
I read some stuff about DTS packages but I'm not real familiar with SQL
Server and don't know how exactly to call the DTS package from my
application and/or if DTS is actually the way to go. The text file is
delimited and contains field names that will map directly to my existing
tables. Not looking for a hand out so if you could get me pointed in the
right direction I'm not against doing more research, but at this time I'm
not exactly sure where to start. Thanks again for your time.
Hi
The following web site has lots of information on DTS. There is also a lot of
information in Books Online (which can be downloaded from
http://www.microsoft.com/downloads/...&displaylang=en
if you don't have it!). You may also want to look up BCP or Bulk Insert as an
alternative method.
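For example, a rough sketch of the Bulk Insert route for your steps 3-7 (the table names, columns, file path and delimiters below are only placeholders, and the file has to be readable by the SQL Server service account, hence the UNC path):

// Minimal sketch: load a delimited file into a staging table with BULK INSERT,
// archive the live data, then replace it from the staging table, all in one
// transaction. FIRSTROW = 2 skips the header row of field names.
using System;
using System.Data.SqlClient;

class ReloadLiveTable
{
    static void Main()
    {
        const string connectionString =
            "Data Source=myServer;Initial Catalog=MyDb;Integrated Security=True";

        const string sql = @"
            BEGIN TRANSACTION;

            CREATE TABLE #Staging (Field1 varchar(50), Field2 varchar(50));

            BULK INSERT #Staging
            FROM '\\fileserver\share\import.txt'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

            DELETE FROM dbo.ArchiveTable;
            INSERT INTO dbo.ArchiveTable SELECT * FROM dbo.LiveTable;

            DELETE FROM dbo.LiveTable;
            INSERT INTO dbo.LiveTable SELECT * FROM #Staging;

            COMMIT TRANSACTION;";

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}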
John
"Reggie" wrote:
> Hi and TIA. I posted this to a vs.net NG but after looking thought it might
> fit in here better.
> Using VS.Net 2003 I built an ASP.Net Web application using
> VB.Net. I connect to an SQL db(SQL Server 2000) and am setup to
> write/read/edit data no problem. I'm having problems with the following
> scenario. Any help, advice, or if you could point me in the right direction
> is highly appreciated.
> 1. The user clicks a button which opens the Find file dialogue box.
> 2. They navigate to the text(or possible dbf file) and click ok.
> 3. A temp table is created and the text (dbf) file is imported into
> the temp table.
> 4. A procedure is run that takes my active data (an existing table) and
> archives(appends) that data to an archive table. Before this though the
> data in the archive table needs to be deleted.
> 5. The data from my live table is deleted.
> 6. Data from my imported temp table is appended to the live table.
> 7. Temp table is deleted.
> I read some stuff about DTS packages but I'm not real familiar with SQL
> Server and don't know how exactly to call the DTS package from my
> application and/or if DTS is actually the way to go. The text file is
> delimited and contains field names that will map directly to my existing
> tables. Not looking for a hand out so if you could get me pointed in the
> right direction I'm not against doing more research, but at this time I'm
> not exactly sure where to start. Thanks again for your time.
>
>|||John, Thanks very much for your quick response. I will look at BOL some more.
You mentioned a website with lots of info but forgot the link. Take care!
Reggie
"John Bell" <jbellnewsposts@.h0tmail.com> wrote in message
news:4097B1F5-B1F6-434F-A14B-593AC227135F@.microsoft.com...
> Hi
> The following web site has lots of information on DTS. There is also a lot of
> information in Books Online (which can be downloaded from
> http://www.microsoft.com/downloads/...&displaylang=en
> if you don't have it!). You may also want to look up BCP or Bulk Insert as an
> alternative method.
> John
> "Reggie" wrote:
>|||Hi
Sorry about the missing link! http://www.sqldts.com/default.aspx?6
If you don't want the user to wait around for the file to load then you can
schedule your DTS package to run at intervals. Look at the looping example on
how to load all files in a given directory.
If you want to load immediately and know the destination file name, bulk
insert may also be an option. You may need to use UNC addresses to get/put the
file into an accessible location.
John
"Reggie" wrote:
> John, Thanks very much for your quick response. I will look at BOL some more.
> You mentioned a website with lots of info but forgot the link. Take care!
> Reggie
> "John Bell" <jbellnewsposts@.h0tmail.com> wrote in message
> news:4097B1F5-B1F6-434F-A14B-593AC227135F@.microsoft.com...
>
>
Friday, March 23, 2012
Importing old data into asp_tables...?
I have created an application that uses the login, create, etc. login components in .NET. How hard is it to convert all my old users, passwords, and user types into the new tables? It almost looks like I have to do them by hand and create a new guid (userid), along with the same guid in aspnet_UsersInRoles and aspnet_Membership. Is there a script to do this programmatically?
Hi,
You can use a T-SQL cursor to go record by record through your old table, inserting into the ASP.NET tables. The difficult part is the passwords. I think it would be too difficult to decode and encode passwords in T-SQL.
So, probably first import the data and then use ASP.NET to re-encode the passwords. As far as I know there are no utilities to do it automatically.
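A rough sketch of that second step, assuming the rows are already in the membership tables, that enablePasswordReset is on and requiresQuestionAndAnswer is off for the provider, and that oldUsers holds the legacy UserName/Password pairs (all names here are placeholders):

// Minimal sketch: after the import, push each legacy password back through the
// membership provider so it is stored in the provider's own encoded format.
using System.Data;
using System.Web.Security;

public static class PasswordReencoder
{
    public static void Reencode(DataTable oldUsers)
    {
        foreach (DataRow row in oldUsers.Rows)
        {
            MembershipUser user = Membership.GetUser((string)row["UserName"]);
            if (user == null)
                continue;

            // ResetPassword gives us a known current password; ChangePassword
            // then sets the legacy one, letting the provider hash/encrypt it.
            string temporary = user.ResetPassword();
            user.ChangePassword(temporary, (string)row["Password"]);
        }
    }
}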
Wednesday, March 21, 2012
Importing lots of rows :-)
Try using SqlBulkCopy object (ado.net 2.0).
AMB
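A rough sketch of that idea, assuming the business objects can be flattened into a DataTable and that the whole copy must be one atomic operation (the Order type and the table and column names below are placeholders):

// Minimal sketch: copy ~10,000 objects into SQL Server 2005 in one atomic
// operation using SqlBulkCopy inside an explicit transaction.
using System;
using System.Data;
using System.Data.SqlClient;

public class Order
{
    public int Id;
    public string Customer;
    public decimal Amount;
}

public static class OrderImporter
{
    public static void Import(Order[] orders, string connectionString)
    {
        // Flatten the objects into a DataTable that mirrors the target table.
        DataTable table = new DataTable("Orders");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Customer", typeof(string));
        table.Columns.Add("Amount", typeof(decimal));
        foreach (Order o in orders)
            table.Rows.Add(o.Id, o.Customer, o.Amount);

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlTransaction transaction = connection.BeginTransaction())
            {
                try
                {
                    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
                        connection, SqlBulkCopyOptions.Default, transaction))
                    {
                        bulkCopy.DestinationTableName = "dbo.Orders";
                        bulkCopy.WriteToServer(table);
                    }
                    transaction.Commit();   // all or nothing
                }
                catch
                {
                    transaction.Rollback();
                    throw;
                }
            }
        }
    }
}

SqlBulkCopy accepts an external SqlTransaction, so the 10,000 rows either all commit or all roll back, which covers the atomicity requirement.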
"Simon Harvey" wrote:
> Hi All,
> I'm not sure how to approach this problem so if anyone could advise me
> how to do it I would be very grateful.
> I have an app that calls a web service. The webservice returns an array
> of business objects.
> So I have a collections of objects that might be, say, 10000 objects in
> size.
> I need to import all the business objects into an SQL Server 2005
> database. The only way I know how to do this is to loop through each
> object and call a stored procedure repeatedly that takes the appropriate
> parameters. The whole import must be atomic. That is, if something goes
> wrong, the whole import needs to be aborted.
> I guess I have two questions:
> 1. Is repeatedly calling a stored procedure the best way to do this sort
> of stuff? I know doing imports like this must be a pretty common
> operation, but I don't know if using an SPROC is a dumb idea.
> 2. Is it ok to start a transaction and repeatedly call a SPROC, like,
> 10000 times? Is there a better way to go about this?
> Maybe I'm worrying about nothing - but it just "feels" wrong to be
> repeatedly calling an SPROC 10,000 times! :-) I was wondering if there
> is a more elegant way?
> Any advice anyone could offer would be very much appreciated.
> Kindest Regards
> Simon
You're right. You don't want to call the same stored procedure 10,000 times
if there is a way to accomplish the task as one unit of work. First off,
calling the stored procedure 10,000 times, all inside a transaction, will
consume and hold more resources longer than necessary.
Look into SQLBulkCopy as Alejandro offered.
Arnie Rowland, YACE*
"To be successful, your heart must accompany your knowledge."
*Yet Another certification Exam
"Simon Harvey" <nothanks@.hotmail.com> wrote in message
news:ezy9dt4mGHA.732@.TK2MSFTNGP04.phx.gbl...
> Hi All,
> I'm not sure how to approach this problem so if anyone could advise me how
> to do it I would be very grateful.
> I have an app that calls a web service. The webservice returns an array of
> business objects.
> So I have a collections of objects that might be, say, 10000 objects in
> size.
> I need to import all the business objects into an SQL Server 2005
> database. The only way I know how to do this is to loop through each
> object and call a stored procedure repeatedly that takes the appropriate
> parameters. The whole import must be atomic. That is, if something goes
> wrong, the whole import needs to be aborted.
> I guess I have two questions:
> 1. Is repeatedly calling a stored procedure the best way to do this sort
> of stuff? I know doing imports like this must be a pretty common
> operation, but I don't know if using an SPROC is a dumb idea.
> 2. Is it ok to start a transaction and repeatedly call a SPROC, like,
> 10000 times? Is there a better way to go about this?
> Maybe I'm worrying about nothing - but it just "feels" wrong to be
> repeatedly calling an SPROC 10,000 times! :-) I was wondering if there is
> a more elegant way?
> Any advice anyone could offer would be very much appreciated.
> Kindest Regards
> Simon
|||If your data is coming in as one large XML document, XML Bulk Load might be
another alternative. The final result is the same as SQLBulkCopy but the
input is an XML document.
This posting is provided "AS IS" with no warranties, and confers no rights.
Use of included script samples are subject to the terms specified at
http://www.microsoft.com/info/cpyright.htm
"Arnie Rowland" <arnie@.1568.com> wrote in message
news:eyYZQd5mGHA.1204@.TK2MSFTNGP03.phx.gbl...
> You're right. You don't want to call the same stored procedure 10,000
> times if there is a way to accomplish the task as one unit of work. First
> off, calling the stored procedure 10,000 times -while inside a
> transaction, will consume and hold more resources longer than necessary.
> Look into SQLBulkCopy as Alejandro offered
> --
> Arnie Rowland, YACE*
> "To be successful, your heart must accompany your knowledge."
> *Yet Another certification Exam
>
> "Simon Harvey" <nothanks@.hotmail.com> wrote in message
> news:ezy9dt4mGHA.732@.TK2MSFTNGP04.phx.gbl...
>|||Thanks guys!
That XML one seems even more interesting actually, although I'm guessing
it might be quite hard to set up the mappings. Especially seeing as the
data is quite hierarchical in nature.
Many thanks for your advice!
Simon
Roger Wolter[MSFT] wrote:
> If your data is coming in as one large XML document, XML Bulk Load might be
> another alternative. The final result is the same as SQLBulkCopy but the
> input is an XML document.
Monday, March 19, 2012
importing excel file into sql server 2005 in asp .net using c#
Dear friends, I want to import an Excel file into SQL Server 2005 in ASP.NET using C#. Can anyone help me?
rgds,
RK
try the link below [SqlBulkCopy]
Import / Export Excel Spreadsheet Data into SQL Server Database Table Using SqlBulkCopy
Just a little bit of code transfers the data from the Excel Spreadsheet into the SQL Server Database Table:
// Connection String to Excel Workbook
string excelConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=Book1.xls;Extended Properties=""Excel 8.0;HDR=YES;""";
// Create Connection to Excel Workbook
using (OleDbConnection connection = new OleDbConnection(excelConnectionString))
{
    OleDbCommand command = new OleDbCommand("Select ID,Data FROM [Data$]", connection);
    connection.Open();
    // Create DbDataReader to Data Worksheet
    using (DbDataReader dr = command.ExecuteReader())
    {
        // SQL Server Connection String
        string sqlConnectionString = "Data Source=.; Initial Catalog=Test;Integrated Security=True";
        // Bulk Copy to SQL Server
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnectionString))
        {
            bulkCopy.DestinationTableName = "ExcelData";
            bulkCopy.WriteToServer(dr);
        }
    }
}
Good Luck./.
|||Hi
You have to first read the XL content, then insert it into SQL Server.
You can transfer your XL content to a DataSet with the OleDb provider using this code:
string connectionString = "Provider=Microsoft.Jet.OLEDB.4.0; Data Source=YOUR XL PATH; Extended Properties='Excel 8.0;IMEX=1;HDR=YES;'";
OleDbDataAdapter adp = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", connectionString);
DataSet ds = new DataSet();
adp.Fill(ds);
Now your XL content would be in the DataSet ds; you can then read the DataSet line by line and insert into SQL.
Add namespace: using System.Data.OleDb;
|||Dear, what is Sheet1$? Can you tell me in detail?
|||sorry to interrupt
ramu778:
Dear, what is Sheet1$? Can you tell me in detail?
In the ConnectionString, for the Data Source we give the XLS file path...
and in the code line below... we have to give the sheet name in the Select query from which we have to read the data, as there can be more than one sheet in an Excel file...
OleDbDataAdapter adp = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", connectionString);
hope it clears./.
Hi
Sheet1 is nothing but the name of the XL sheet; there are names at the bottom of the XL workbook like Sheet1, Sheet2 and so on, and we have to give that name.
Wednesday, March 7, 2012
Importing data
My DTS package is now saved in a file called Shortages.vb.
What I want to do now is add this to an existing ASP.NET project and be able to call this DTS function by the click of a button.
Is this possible and how can I go about it? Can I just add it to the Click event of a button?
You could write a simple stored procedure to call your DTS, and then execute your sproc from the web page button.
|||I have considered that.
I tried to run it by calling it in the format below, but ran into a problem.
CREATE PROCEDURE doPublish AS
exec master..xp_cmdshell 'dtsrun /Ssql1 /NPublish /E'
This is what I got when I tried to run the DTS package within SQL Server Query Analyser.
If I put the Excel file on the desktop it works fine, but it has a problem when it's on the H: drive (a shared drive).
DTSRun: Executing...
DTSRun OnStart: Create Table [GE].[dbo].[tester] Step
DTSRun OnFinish: Create Table [GE].[dbo].[tester] Step
DTSRun OnStart: Copy Data from Results to [GE].[dbo].[tester] Step
DTSRun OnError: Copy Data from Results to [GE].[dbo].[tester] Step, Error = -2147467259 (80004005)
Error string: The Microsoft Jet database engine cannot open the file 'H:\Book1.xls'. It is already opened exclusively by another user, or you need permission to view its data.
Error source: Microsoft JET Database Engine
Help file:
Help context: 5003051
Error Detail Records:
Error: -2147467259 (80004005); Provider Error: -534709256 (E020FBF8)
Error string: The Microsoft Jet database engine cannot open the file 'H:\Book1.xls'. It is already opened exclusively by another user, or you need permission to view its data.
Error source: Microsoft JET Database Engine
Help file:
Help context: 5003051
DTSRun OnFinish: Copy Data from Results to [GE].[dbo].[tester] Step
DTSRun: Package execution complete.
NULL
How can I make it run so that I can read from the H: drive?
It works if I run the package immediately when I create it with the DTS Import/Export Wizard.|||Does anyone have an answer tothis? I get the same error when I try to call my DTS package toaccess an excel file on my network.
Friday, February 24, 2012
Importing CVS to SQLServer 2005 in a web application
Importing CVS to SQLServer 2005 in a web application:
I need to import a CSV file to a table. This is a .NET 2.0 web application running on the web server, which has .NET 2.0 installed. SQL Server 2005 is a database on another machine. What are my options? Would SSIS work on the web server if no database is installed on that machine?
Thanks
>>Importing CVS to SQLServer 2005 in a web application:<< Do you mean CSV?
>>Would SSIS work on the web server if no database is installed on that machine?<< It would if the SQL client tools and SQL Server Management Studio were installed on the web server and you could log on to the web server.
>>What are my options?<< You haven't specified if this is a one-off import or a regular job. One way of handling it would be to convert the CSV to either a series of inserts to the table or calls to a stored procedure to do the job.
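For example, a rough sketch of the "series of inserts" approach from the web application; the file layout, table and column names here are assumptions, and a real CSV parser is safer if the data can contain quoted commas:

// Minimal sketch: read a simple comma-separated file and insert each row with
// a parameterized command, all inside one transaction. Assumes two columns
// and no quoted fields.
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class CsvImporter
{
    static void Main()
    {
        const string connectionString =
            "Data Source=dbServer;Initial Catalog=MyDb;Integrated Security=True";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlTransaction transaction = connection.BeginTransaction())
            using (SqlCommand command = new SqlCommand(
                "INSERT INTO dbo.Target (Col1, Col2) VALUES (@c1, @c2)",
                connection, transaction))
            {
                command.Parameters.Add("@c1", SqlDbType.VarChar, 100);
                command.Parameters.Add("@c2", SqlDbType.VarChar, 100);

                using (StreamReader reader = new StreamReader(@"C:\data\import.csv"))
                {
                    string line;
                    while ((line = reader.ReadLine()) != null)
                    {
                        string[] fields = line.Split(',');
                        command.Parameters["@c1"].Value = fields[0];
                        command.Parameters["@c2"].Value = fields[1];
                        command.ExecuteNonQuery();
                    }
                }
                transaction.Commit();
            }
        }
    }
}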
importing an excel file and save data into a database in my web account
Hi everyone, sorry if this message is not supposed to be posted here.
I'm learning ASP.NET, and would like to know how I can insert data from an Excel file into a database on my web account. Pretty much insert/update information in the database using an Excel file or an Access file.
Thanks a lot in advance
Checkout thses links:
http://support.microsoft.com/kb/311731
http://www.msdner.com/forum/thread302753.html
http://support.microsoft.com/kb/316934
http://www.vbdotnetheaven.com/UploadFile/mahesh/AccessExcelDatabase04252005014321AM/AccessExcelDatabase.aspx
I hope they will help.
Good luck.
|||
Thanks a lot. I found an interesting article that talks about this. Again, thanks for your assistance.
|||If my post was helpful for you then please mark it as the answer.
If not, please share the answer with us in a post and mark it as the answer.
Remember we have to cooperate to help each other.
Thanks, and you are welcome.
Sunday, February 19, 2012
Importing Access database into SQLExpress
I am trying to complete a lab in the Microsoft self-paced training kit, Developing Web Applications with Microsoft Visual C# .NET.
They have provided an Access database named Contacts.mdb and a batch file named InstContacts.bat that uses InstContacts.sql to import the data.
I have installed SQLExpress, but when I run the batch file I get the following:
C:\Microsoft Press\...\data>rem The following command line installs the Contact SQL database
C:\Microsoft Press\...\data>osql -i InstContacts.Sql -E
[SQL Native Client]Named Pipes Provider: Could not open a connection to SQL Server [2].
[SQL Native Client]Login timeout expired
[SQL Native Client]An error has occurred while establishing a connection to the server. When connecting to SQL Server 2005, this failure may be caused by the fact that under the default settings SQL Server does not allow remote connections.
Nevermind, I found the answer here.
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=421593&SiteID=1