Friday, March 30, 2012

Improving Performance

Can anybody tell me what measures I have to take to improve the performance of my SQL queries and stored procedures, or anything else related to SQL?

Look up "Query Tuning" in Books Online.

blindman|||We can definitely identify optimization opportunities for a specific query, but for all of your queries you either have to do it yourself or hire a consultant to do it for you.

Exporting table structures from SQL 2000 into Excel

Hi.
Is there any way to export the table structure (data type, length, nullability, description) into an Excel file using MS SQL Server?

Or do I need to do it manually?
Thank you in advance.
Sincerely

Agustina

Run this in Query Analyzer (this covers the common data types; add to the CASE statement for more):


select name,
    case xtype
        when 56 then 'Int'
        when 127 then 'BigInt'
        when 167 then 'VarChar'
        when 175 then 'Char'
        when 60 then 'Money'
        when 58 then 'SmallDateTime'
        when 104 then 'Bit'
        when 173 then 'TimeStamp'
        when 61 then 'DateTime'
        when 48 then 'TinyInt'
        else 'Other'
    end as data_type,
    length
from syscolumns
where id = (
    select id
    from sysobjects
    where name = 'TheTableName')
order by colid
|||You could also query the INFORMATION_SCHEMA views. Run this in Query Analyzer and adjust accordingly:

SELECT *
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_CATALOG = '<DATABASE NAME>'
    AND TABLE_SCHEMA = '<DB OWNER>'
    AND TABLE_NAME = '<YOUR TABLE NAME>'
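To land either query's output in Excel, one approach (a Python sketch, not from the thread; the sample rows below stand in for an actual query result fetched over any DB connection) is to write the rows out as CSV, which Excel opens directly:

```python
import csv

def rows_to_csv(rows, headers, path):
    """Write query result rows to a CSV file that Excel can open."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(headers)
        writer.writerows(rows)

# Hypothetical sample data mirroring INFORMATION_SCHEMA.COLUMNS; in practice
# these rows would come from executing the query above via ODBC/OLE DB.
headers = ["COLUMN_NAME", "DATA_TYPE", "CHARACTER_MAXIMUM_LENGTH", "IS_NULLABLE"]
rows = [("id", "int", None, "NO"), ("name", "varchar", 50, "YES")]
rows_to_csv(rows, headers, "table_structure.csv")
```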

imports microsoft.sqlserver.dts.pipeline does not work

I have been trying to follow/implement the examples in the following help topics (thanks to Jamie for these links).

Building Packages Programmatically

(http://msdn2.microsoft.com/en-us/library/ms345167.aspx)

Connecting Data Flow Components Programmatically

(http://msdn2.microsoft.com/en-us/library/ms136086.aspx)

The problem I am having is that MainPipe is not recognized as a valid type in my Script task, even though I have the Imports statements that are listed in the example. I get the message "Error 30002: Type 'MainPipe' is not defined". The other, related problem is that when I type "Imports Microsoft.SqlServer.Dts", IntelliSense offers only two choices: {}Runtime and {}Tasks. I don't see any choice for Pipeline. Can anyone tell me what I am missing? It seems to be some kind of configuration/installation issue, but I have no idea how to resolve it. I have tried this on 3 different machines, with both the RTM SQL 2005 Standard Edition and with SP2 installed, all with the same result. Any help is appreciated.

Here is my code:

' Microsoft SQL Server Integration Services Script Task
' Write scripts using Microsoft Visual Basic
' The ScriptMain class is the entry point of the Script Task.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Runtime
Imports Microsoft.SqlServer.Dts.Pipeline
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper

Public Class ScriptMain

    Public Sub Main()

        Dim package As Microsoft.SqlServer.Dts.Runtime.Package = _
            New Microsoft.SqlServer.Dts.Runtime.Package()
        Dim e As Executable = package.Executables.Add("DTS.Pipeline.1")
        Dim thMainPipe As Microsoft.SqlServer.Dts.Runtime.TaskHost = _
            CType(e, Microsoft.SqlServer.Dts.Runtime.TaskHost)
        Dim dataFlowTask As MainPipe = CType(thMainPipe.InnerObject, MainPipe)

        Dts.TaskResult = Dts.Results.Success

    End Sub

End Class

Make sure you have added a reference (under Project > References) to Microsoft.SqlServer.DTSPipelineWrap.dll.

|||That was it. Thanks for your help.

importing XML/ASCII files to SQl database using VB express

Hi everyone,

I have to write a program in VB to receive the data read from an RFID reader for my graduation project. The problem is that I am not a computer science student, so I have only general programming knowledge.

I created my DB in VB Express, but I couldn't find out how to send the read data (which will be in either XML or ASCII format) to my database. The data will be transferred to my computer by the RFID reader's software, but after that I don't know how to transfer it to my DB. As far as I know I have to use commands like ReadXml etc., but I have no idea how to write the complete program.

I checked the forum and couldn't find the answer; sorry if someone has already answered my question and I missed it.

Thanks...

Can

I suspect that you will find greater success if you post your question here: http://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=38&SiteID=1|||

OK, thanks, I'll do it now...

Can

Importing Xml with SqlBulkCopy

I am looking into importing Xml into SQL Server 2005. The Xml files are about 30MB each, and I need the import to work fast.

The Xml is only records with fields (no real hierarchy) i.e.

<books>
<book>
<title>Harry Potter</title>
<author>JK Rowling</author>
etc etc
</book>
</books>

I am looking into doing this in a C# application that will use SqlBulkCopy. However, as far as I can see, the WriteToServer method accepts only a DataRow[], a DataTable, or an IDataReader.

Obviously it is very easy to convert my Xml into a DataTable or DataRow[], but these are both in-memory structures, so they will be memory intensive and slow.

Is there an easy way to expose my Xml through an IDataReader object, like a wrapper over XmlTextReader? I have thought I could write a custom wrapper over XmlTextReader that supports the IDataReader interface, but I keep thinking that what I am doing is pretty basic and I must be missing an easier solution. Any ideas?
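The streaming idea in the question can be sketched in miniature. This is Python rather than C#, purely to illustrate parsing the XML incrementally and yielding one record at a time, the way an IDataReader would feed rows to SqlBulkCopy; the element names come from the sample document above, and everything else is illustrative:

```python
import xml.etree.ElementTree as ET
from io import StringIO

def iter_books(source):
    """Yield (title, author) tuples without loading the whole document."""
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == "book":
            yield (elem.findtext("title"), elem.findtext("author"))
            elem.clear()  # discard the finished element so memory stays flat

sample = """<books>
  <book><title>Harry Potter</title><author>JK Rowling</author></book>
  <book><title>The Hobbit</title><author>JRR Tolkien</author></book>
</books>"""

rows = list(iter_books(StringIO(sample)))
```

In a real importer the consumer would pull from the generator in batches instead of collecting everything into a list, which is the property that makes the IDataReader-style wrapper attractive for 30MB files.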

If not, can anybody recommend an alternative solution for getting these records in extremely quickly programmatically?

Andrew

You can use the OPENXML command to achieve this. You would either have to pass the XML as a string to a procedure, or have the file in a place SQL Server could access.|||I looked into OPENXML but have received several warnings regarding its performance. I wanted the fastest way of importing without consuming a lot of resources.

In the end I opted to use SQLXMLBulkLoad, and it works very nicely and quite elegantly (using a schema to map XML elements to SQL fields).

Andrew|||

I thought there was an XML bulk load option; I just couldn't find it.

Would you care to post your schema and code so this question has a complete answer?

|||

You could use SQLXMLBULKLOAD. Sample code (C#):

public bool BulkLoad(string ConnectionStr, string MyXsdFile, string MyXMLFile)
{
    try
    {
        SQLXMLBulkLoad4 objBL = new SQLXMLBulkLoad4();
        objBL.ConnectionString = @"provider=SQLOLEDB;data source=SHA-WKS1333\SQL2005;database=MatrixTest;integrated security=SSPI;";
        objBL.ErrorLogFile = @".\error.log";
        objBL.CheckConstraints = true;
        objBL.XMLFragment = true;
        objBL.TempFilePath = @".\";
        objBL.SchemaGen = true;
        objBL.SGDropTables = true;
        objBL.Execute(MyXsdFile, MyXMLFile);
        return true;
    }
    catch (Exception e)
    {
        MessageBox.Show("Exception caught: " + e);
    }
    return false;
}

You can find more info about SQLXMLBULKLOAD on MSDN.

|||Could you please post the code you used to achieve this?

