Friday, March 30, 2012
Importing XML in SQL Server 2000 in .Net 2.0
I realized I am working with SQL Server 2000 for this problem, not 2005
(too many projects, I guess). So SSIS won't be possible. The same goes for
an XML-typed column.
Is there an option for bulk upload in SQL Server 2000?
Kent Tegels wrote:
> Hello Neo,
> Okay, so let me ask this: are you simply interested in loading the XML into
> the database in some batch operation, or do you both load the original data
> into SQL Server and modify stored values from XML?
> If this is the simple ETL (former) case, SSIS would be my tool of choice.
> If you're looking to both load and update from XML, SSIS is still usable,
> but the solution might look a little different.
> If you're simply putting the XML into an XML-typed column, consider using
> openrowset(bulk) instead.
> More details helpful. :)
> Thanks!
> Kent Tegels
> DevelopMentor
> http://staff.develop.com/ktegels/
Hello Neo,
N> I realized I am working with SQL Server 2000 for this problem, and not
N> 2005 (too many projects, I guess). So SSIS won't be possible. Same
N> goes for an XML-typed column.
BCP or SQLXMLBulkLoad, then. Not an easy problem to solve with those tools.
If you can use the .NET 2.0 framework, you might want to look into the SqlBulkCopy
class. That might help.
Thanks!
Kent
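For completeness, the OPENXML route is also available on SQL Server 2000 once the document is in a variable; a rough sketch, where the table, element names, and the @xmlText variable are all hypothetical:

```sql
-- Shred an XML document already held in @xmlText (e.g. loaded by the
-- application) into rows. All object and element names here are made up
-- for illustration.
DECLARE @hDoc int
EXEC sp_xml_preparedocument @hDoc OUTPUT, @xmlText
INSERT INTO dbo.Orders (OrderId, CustomerId)
SELECT OrderId, CustomerId
FROM OPENXML(@hDoc, '/Orders/Order', 2)   -- 2 = element-centric mapping
WITH (OrderId int, CustomerId int)
EXEC sp_xml_removedocument @hDoc          -- release the document handle
```
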
Importing XML Files to SQL Database using SSIS
I need to import an XML file into a SQL database using SSIS. Please let me know
how I achieve this. I am quite new to SSIS, so if you can point me to some
samples on importing XML files using SSIS, that would be great. Any additional
references, if any, are most welcome.
Thanks in Advance,
Sachin R. Chavan.
Hello Sachin,
> I need to import an XML file into a SQL database using SSIS. Please let
> me know how I achieve this. I am quite new to SSIS, so if you can point
> me to some samples on importing XML files using SSIS, that would be
> great. Any additional references, if any, are most welcome.
a. Get this book and read it: http://www.wrox.com/WileyCDA/WroxTitle/productCd-0764584359.html
b. Use the XML source.
Thanks!
Kent Tegels
DevelopMentor
http://staff.develop.com/ktegels/
Importing XML File using SSIS (DTS)
I can use MS Access to import an XML file and it builds the columns and loads the data. I am trying to use SQL Server 2005 to do the same thing and there doesn't seem to be a way to easily accomplish the same task that works so well in Access.
Has anyone tried this? If so and you were successful, what were the steps to set up the package?
Thanks.
Have you tried the XML Source adapter?
- Grab a Data Flow task from the toolbox
- Double-click on the added task
- Double-click on the XML Source in the data flow toolbox
- Double-click the added component on the diagram
- Set up your XML source
HTH.
|||Hi,
After reading your post, I was trying to do the same thing using the XML below:
<country
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="country.xsd">
<name>France</name>
<population>59.7</population>
</country>
and I even generated the XSD using SSIS and got this:
<?xml version="1.0"?>
<xs:schema xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:element name="country">
<xs:complexType>
<xs:sequence>
<xs:element minOccurs="0" name="name" type="xs:string" />
<xs:element minOccurs="0" name="population" type="xs:decimal" />
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:schema>
and now, after pointing things correctly in my connection manager, when I go to my columns all I get is one column which says Name.
Why is that? Ideally I should get Name and Population, right? Can you advise?
|||
Try wrapping the xml in a root node:
<root>
<country
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:noNamespaceSchemaLocation="country.xsd">
<name>France</name>
<population>59.7</population>
</country>
</root>
Thanks
Mark
Mark,
Thanks, it worked.
So the impression that I get is that all XML files should be wrapped in a root node. Is that true?
Friday, March 23, 2012
Importing Pervasive SQL into SQL2005 via SSIS
In SQL 2000, I had a working DTS package that would import a Pervasive SQL database into SQL 2000 (There is a good reason, provided on request). The column type definitions came over just fine in SQL 2000 with a few minor changes.
In SQL 2005 (SSIS), I create a Data Source via the Connection Manager (Provider: .Net Providers\Odbc Data Provider) to the Pervasive database (System DSN, <database_odbc>). I then create a Data Destination via the Connection Manager (Provider: Native OLE DB\SQL Native Client) to the SQL database. Both databases reside on the same machine.
I've created a DataSource Reader and used the SQL command, "select * from ARCustomer", as an example. The issue is with the data types for the columns. They don't come close to resembling the results that I had in SQL 2000 DTS.
Is there another method or parameter setting that will preserve the data types for the columns being imported from the Pervasive database?
This has been a real stumbling block and any help would be truly appreciated. Thanks in advance for your assistance ... Bernie
Connect a Data Conversion task before inserting into the destination, or set the metadata-validation property to false in the properties of both tasks.|||The issue is still with the Data Conversion. I've been reading many of the forums and it seems that, like myself, there are a lot of frustrated developers out there trying to make simple SQL 2000 DTS packages work in SQL 2005 SSIS. Almost all of them point to the data conversion issue. What a real pain in the arse. I surely hope that Microsoft does something about this, SOON...
Something simple that would take less than an hour could turn into weeks and weeks of fruitless effort.
The client has a Pervasive SQL database that I did many tests on using SQL 2000 DTS. There are approximately 150 tables. My testing went very well. I recommended to the client that they purchase a new server (Win2003 R2) along with SQL 2005, advising them that my development time in SQL 2005, converting the tables with DTS, would be a much better process with the upgrade (is it really an upgrade when you have all this bickering over Unicode (data conversion) issues?).
I have spent countless hours trying to get this working in SQL 2005 SSIS, with no success.
I wish in retrospect that I'd just had them get Win2000 & SQL 2000. I would have been done with the project by now and I would have had a happy client. I feel like a damn fool, recommending a package that I thought would be a significant developer improvement, and I have yet to show any results. This does not bode well for others like me at all.
|||I have been testing and trying for days, almost a week, trying to create a simple, single-table SSIS package that I created in DTS in minutes. I'm very frustrated with how the whole process gets hung up in the data conversion.
I have a DataReader Source (an ODBC connection (System DSN) to a Pervasive SQL database). It pulls the table in just fine. I have a Data Flow task, using a Data Conversion. I have a SQL Server Destination (the table already exists in the SQL 2005 database).|||Please do this; I did the same thing in my package:
Double-click on the Data Flow task,
then click on the source task and go to its Properties window.
There is a property called ValidateExternalMetadata: set it to False.
Repeat for the destination task's properties.
|||
Your recommendation seems to work for numerical conversions.
BUT...
Not for string transformations. The strings come in as Unicode DT_WSTR.
In the Data Conversion Transformation Editor, I have to manually create a transformation for each and every instance, from DT_WSTR to DT_STR. Once I've done this, it passes muster when running in Debug. This would still require lots of tedious, repetitive manual work.
Is there a way to force DT_WSTR to DT_STR for all of these so I don't have to manually create a transformation for each?
Or...
Do you have another suggestion on the STR issue that i might try?
Thanks too very much for your assistance!!!
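One way around the per-column DT_WSTR-to-DT_STR clicking is to land the source rows in a staging table with Unicode (nvarchar) columns and do the narrowing once in T-SQL; a minimal sketch, where all table and column names are hypothetical:

```sql
-- The staging table uses nvarchar columns (which match DT_WSTR with no
-- conversion needed); the final table uses varchar. One INSERT..SELECT
-- converts every column at once. Object names are made up for illustration.
INSERT INTO dbo.ARCustomer (CustomerCode, CustomerName)
SELECT CAST(CustomerCode AS varchar(20)),
       CAST(CustomerName AS varchar(100))
FROM staging.ARCustomer;
```
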
Importing Online XML file into SQL Server 2005 tables – no data transferred
Does anyone have any great suggestions on how I can import an online XML file into an SQL 2005 table?
So far I tried to accomplish it with SSIS Data Flow Task where my source is XML Source (Data access mode: XML file location; XML location: URL, Use inline schema = True). This set up properly identified the columns to be imported.
I used Copy Column data flow transformation task to load data to OLE DB destination table that has same structure.
When I run the task it does execute with no errors; however, the table remains empty. It looks like I am failing to read the actual data.
Do you have any suggestions? I am willing to go around this approach with stored procs/COM/you name it, just make it work!
Thanks a lot for your help!
LoveDanger
First of all, does any data enter the pipeline from the XML source? Use a data-flow viewer to find out.
-Jamie
|||Nope. No rows affected during the execution. When I work with the data flow viewer and point to the URL in the XML Source, it does read the columns of the XML output. It doesn't read the data though.|||
Jamie,
Thanks for the reply. Here is more: I copied the XML and tried to import it into my table using sp_xml_preparedocument ... OPENXML.
I still got the same result: 0 rows affected. Any suggestions? The table structure is identical to the XML output. The OPENXML query uses the following syntax:
FROM OPENXML(@xmlHndAdd, '/NewDataSet/Table1', 1)
WITH MyTbl
The XML file format is:
<NewDataSet xmlns="">
<Table1 diffgr:id="Table11" msdata:rowOrder="0">
<program_id>1-2-3-4-5</program_id>
<object_name />
</Table1>
<Table1 diffgr:id="Table12" msdata:rowOrder="1">
<object_id>6-7-8-9-0</object_id>
<object_name>ABC</object_name>
<objectproperty_id>1-3-5-7-9</objectproperty_id>
</Table1>
</NewDataSet>
Thank you!
|||I have the same setup, and it works without the diffgr:id="TableN" msdata:rowOrder="..." attributes; if I simply use <Table1> the rows can be inserted just fine... Any more suggestions?|||
Oh, it was a matter of syntax. Here is the answer:
DECLARE @xmlHndAdd INT
EXEC sp_xml_preparedocument @xmlHndAdd OUTPUT, @availabilityXml
TRUNCATE TABLE MyTbl
INSERT MyTbl SELECT * FROM OPENXML(@xmlHndAdd, '//NewDataSet/Table1', 2)
WITH MyTbl
EXEC sp_xml_removedocument @xmlHndAdd -- release the document handle when done
Importing non-hidden rows from Excel?
I have a need to import only non-hidden rows from an excel spreadsheet. When I create the package in SSIS, it imports everything. Due to the use of the data on the spreadsheet, we cannot simply delete the data.
Is there a special setting in the Excel Source or Connection manager that can be set to "only import non-hidden rows"?
Also, how do I go about setting the sheet with an index instead of the actual sheet name? The user changes the sheet name at random, but I know I only need the first two sheets in the file.
Thanks!
Matt Michuta
No, there is no setting like this. You are at the mercy of the Excel OLE DB provider here, and it doesn't possess the functionality you are requesting. You will have to filter those rows out in the pipeline.
Not sure about the sheet index problem - I don't ever use the Excel provider. I suspect you can't do that either. Perhaps try the data access forum if no-one here knows: https://forums.microsoft.com/MSDN/ShowForum.aspx?ForumID=87&SiteID=1
-Jamie
|||You should be able to do this with a little bit of creative VBA coding in Excel. You can write a macro that will copy all the data to a sheet that has a static name and link the action to an event or button. Since you can use sheet indexes and row properties in VBA you can get exactly what you need, and since the sheet you're copying to has a static name you can use the Excel source in SSIS to import it.
Importing multiple flat files to multiple tables in SSIS
I have a couple of hundred flat files to import into database tables using SSIS.
The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.
However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.
Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow Destination item accept its table name dynamically, which seems to prevent me doing this.
I suppose I could make a different Data Flow Destination item for each file, in the Data Flow. Would that be a reasonable solution, or is there a simpler solution, or should I just resign myself to making a separate Data Flow for every single file?
In an OLE DB destination you can select "Table or view name from variable"; in this way the destination can be dynamic.
|||Thanks, I missed that.|||Any suggestions on the best way to assign the destination table name variable?
I imagine could do it in a Script item in the Foreach Loop before the Data Flow item is executed, but is there somewhere to do it more in-line, like building it into the Data Flow item's properties somehow?
|||Use some logic to derive the table name based on the format, which I guess must be derivable from the filename?
You have the filename, so how do you expect to transform this into a table name? I would expect an expression to be used somewhere, for the table-name variable for example, but you may need to refer to some logic table. An Execute SQL Task inside the loop could query a SQL table that gives you the destination table for a filename. The Execute SQL Task result could then be assigned to the table-name variable.
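The filename-to-table lookup described above might be a one-row query run by the Execute SQL Task inside the loop, with the single-row result bound to the table-name variable; a sketch, using a hypothetical mapping table:

```sql
-- dbo.FileTableMap is a hypothetical table mapping file-name patterns to
-- destination tables; '?' is the Execute SQL Task parameter placeholder
-- that receives the current file name from the Foreach loop variable.
SELECT DestinationTable
FROM dbo.FileTableMap
WHERE FileNamePattern = ?;
```
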
Wednesday, March 21, 2012
Importing Foxpro Database
I am new to SSIS and want to import a FoxPro database from a CD into my SQL Server 2005 database.
Is there a basic tutorial that can help me navigate the import wizard or can someone provide the proper data source connection information to use?
Have a look at this post:
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=76237&SiteID=1
Monday, March 19, 2012
Importing excel file into Database
Hi, I am very new to SSIS. I am trying to set up a package that I can set to run every hour or so; what it will do is look for all Excel files in a certain folder and import them into a table on a SQL server.
I managed to get it working, but my problem is that my data is survey answers and some columns contain a comment.
I get these files on a weekly basis, and some weeks the length of the longest comment makes SSIS want to use a different data type for the comment column (sometimes it wants DT_NTEXT, other times it wants DT_WSTR).
As this column is filled out by a human, I guess apostrophes and other characters may affect this as well.
When I made the data flow task, I used the wizard on a file which contained about 8 weeks' worth of data. When I use 1 week's worth of data where the comment length is very low, the task gives a validation error saying the metadata is out of sync. If I go back and set the data type for that column to DT_WSTR and rerun the task, it works, but then when it tries to process a different week's worth of data it will fail again.
here is an example of an error I get when it thinks the data type is wrong.
[Source - 'raw data$' [1]] Error: The output column "Question ID50# (Verbatim)" (439) on the error output has properties that do not match the properties of its corresponding data source column.
I played around with the data types for a while and managed to get it to process the first file and then try to process the second file; in the second file it got around the validation but then got this error:
[Source - 'raw data$' [1]] Error: Failed to retrieve long data for column "Question ID3# (Verbatim)".
Is there a way to make it recalculate the data types by itself for each Excel file?
I am stuck trying to figure this one out.
Sorry if I haven't provided enough information; I am not sure which direction to head with this.
First, you have to implement an administrative policy: the same type of Excel file must be used in your activities, with the same list of columns. Then you can use an SSIS package created with the DT_NTEXT data type (this way it can accept small or big text strings) and run it whenever you need; you don't have to rebuild this SSIS package periodically or recreate it with the wizard as you say.
In addition, using a "for each loop container" you can automatically load a series of Excel files from a folder.
|||Excel reports the metadata based on the actual data in the column. Unfortunately, the only workaround for the DT_WSTR / DT_NTEXT issue that I am aware of is to create two data flows, one with DT_WSTR and one with DT_NTEXT. Then use a precedence constraint to determine which one to execute.
You might check out this thread for some additional detail.
http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=2043158&SiteID=1
Importing Excel File in SSIS
Okay... I am now about to pull my hair out: something that worked VERY EASILY in SQL Server 2000 doesn't seem to work at all in 2005. I am trying to pump an Excel table into a 2005 database. I go into the Visual Studio Integration Services project (this is so much easier... cynicism) and set up a project. I have my data source (Excel), I have my destination (SQL Native Client, database). I set it up the same way that worked (perfectly) in DTS and I run it... it grinds away and reports back that all is well... no errors. I go looking for the table... not there. I try with an SA login vs. Windows authentication... not there. I try with a different table... not there. I try with a different database... not there.
I am certain you can imagine the frustration... that is, if you are a user, not a programmer at Microsoft.
Okay, okay... I won't launch into abuse here... but hey, how do I make this very complicated process work now?
Thanks...
Use the OLE DB Destination. The SQL Server Destination is for databases local to the package.|||
Thanks Phil... I did figure it out with more fiddling (as is usually the case: when you break down and ask for help, you discover the answer and just how much of a dolt you are): you have to SAVE the package before it runs correctly. [So what was previously a ONE-STEP operation is now 8 steps... joy, joy, joy]
Thanks, and adios...
Importing Excel
Hi,
I know this issue exists in DTS but I need to check whether it is still in SSIS. Also, you guys may have a better solution for it.
Issue: when I try to import a column from Excel which has data like A,B,C,D,E,4,5 and the destination table has the data type varchar, it imports only A,B,C,D,E, and 4 & 5 come in as nulls. How do I fix this?
Set the Excel connection Extended Property, IMEX=1. http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1294377&SiteID=1|||Thanks, it worked. Appreciated.|||
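For reference, the IMEX=1 setting mentioned above goes in the connection string's Extended Properties; a sketch of a Jet connection string for an .xls file (the file path here is hypothetical):

```
Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\survey.xls;
Extended Properties="Excel 8.0;HDR=YES;IMEX=1"
```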
IMEX=1 worked for all the cells except a few; this column has data like below.
Even in this column some values have been imported correctly and a few are imported like 7.15E+...
Note: my destination column data type is varchar.
|||Make sure that Excel doesn't have any formatting for the problem cells or the problem column.|||In the above example, the second row's data was formatted as text because it has the leading zero, whereas the third is not formatted because it doesn't have any leading zero. The issue was the third row converting to 7.15E+... while running the SSIS package.|||Yes, I've got just the same problem.
Incredibly, it didn't happen the first time I ran the DTS, but now...
|||...and I just got it...
I selected the whole worksheet, converted the cells into numeric type (Format/Cells/Numeric) and saved. Then I converted the cells into General type again, saved and ran the DTS. And it works now.
Better not to use Text types when this happens. When you've got a Text type it doesn't work properly, and if you change to General it doesn't work either. But if you had Numeric types and change to General then it works.
|||When I change this to Numeric then I will lose the leading zero in rows 2 & 3.
|||? No, you won't...
It's just an Excel failing; it doesn't catch properly that there's a number, not text, when you had text previously.
Importing dBase files with the SSIS Import/Export Wizard
I saw this post by dterrie in the Wishlist thread and I just wanted to second it:
"How about bringing back a simple dBase import? The SSIS guys are clearly FAR out of touch with reality if they think people who handle data no longer need to work with dbf files. I've seen a lot of dumb stuff in my day, but this is just sheer brilliance. I just love the advice of first importing into Access and then importing the Access table. Gee, why didn't I think of such a convenient solution. I could have had a V-8."
I've been struggling with this the last couple days and finally decided to import the dBase III file into Access and then import that into SQL Server 2005. Imagine my surprise when I discovered this was the current recommended method.
That's just ridiculous. Can someone tell me why they would reduce some of the functionality of SQL Server from 2000 to 2005? This was a very easy process in SQL Server 2000...
Philip,
Could you record your request here:
http://lab.msdn.microsoft.com/productfeedback/default.aspx
That way a request will be passed directly to our bug system. It will increase a chance to address it sooner, and you will be informed about the progress.
Thanks.
|||Thanks Bob! That's a great idea.
I know you guys have been catching some flack over the anemic ODBC support.
Here's hoping there is a service pack for it soon!
Monday, March 12, 2012
Importing DATE with Timestamp(In a Flat file) Column using SSIS
SSIS is brand new for me... I've been playing with it for a few hours.
I am trying to import a flat file into the SQL Server DB using SSIS.
One of the columns is in this format: "YYYYMMDDHH24MISS".
How do I get around this to import the data in a readable fashion into the destination?
Thanks!
MKR
Hi MKR,
What data type are you wanting the result to be?
You can use a derived column component to parse the format of the column and create anything you like -- a DT_DBTIMESTAMP, a string with your own format, etc...
You could turn the string in the format you have above into a string with this format, "YYYY-MM-DD HH:MM:SS", with an expression like this in derived column (where I am assuming the string is in a column called 'Col'; in a 14-character YYYYMMDDHHMMSS value the minutes start at position 11 and the seconds at 13):
SUBSTRING(Col, 1, 4) + "-" + SUBSTRING(Col, 5, 2) + "-" + SUBSTRING(Col, 7, 2) + " " + SUBSTRING(Col, 9, 2) + ":" + SUBSTRING(Col, 11, 2) + ":" + SUBSTRING(Col, 13, 2)
Is that the sort of thing you are looking for?
Thanks
Mark
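To double-check the offsets: "HH24MISS" is Oracle-style notation (HH24 = hours, MI = minutes, SS = seconds), so in the 14-character value the pieces sit at positions 1, 5, 7, 9, 11, and 13. The same slicing can be verified in T-SQL, whose SUBSTRING is 1-based like the SSIS expression; the sample value below is made up:

```sql
-- Verify the substring offsets against a hypothetical sample value.
DECLARE @Col char(14)
SET @Col = '20120330174501'  -- i.e. 2012-03-30 17:45:01
SELECT SUBSTRING(@Col, 1, 4) + '-' + SUBSTRING(@Col, 5, 2) + '-' +
       SUBSTRING(@Col, 7, 2) + ' ' + SUBSTRING(@Col, 9, 2) + ':' +
       SUBSTRING(@Col, 11, 2) + ':' + SUBSTRING(@Col, 13, 2)
-- yields '2012-03-30 17:45:01'
```
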
But as I was telling you earlier, my knowledge of SSIS is very limited.
Now that I know we can manipulate the string...
Where do I do this? I mean, where do I add this SUBSTRING manipulation?
|||You want to add it in the data flow, using a derived column transformation.|||Thanks! Welch n Mark
importing database from access to sql2005
I am working on SQL 2005. I am trying to import a database from Access, including relations, by using SSIS. Can anyone make a suggestion or give me some links?
Thanks in advance.
I am moving this to the forum.
Friday, March 9, 2012
Importing data from various data sources with non standard formats
Hi all :)
I'm wondering if SSIS will be the solution to the problem I'm working on.
Some of our customers give us an Excel sheet with data they want to insert or update in the database.
I've created a package that will take an Excel sheet, do some data conversion so the data types match up, and after that I use a Slowly Changing Dimension component to create the insert/update commands.
This works great. If a customer adds a new row to the Excel sheet or updates an existing row changes are nicely reflected in the database.
But now I've got the following problem. The column names and the order of the columns in the Excel sheet are not standard, and in the future it could happen that a customer doesn't even use an Excel sheet but something totally different.
Can I use SSIS for this? Is it possible to let the user set the mappings through some sort of user interface? I've looked at programmatically creating the package, but I've got to say that's quite hard to do... It would be easier to write the whole thing myself than to create the package through code ;)
If not I thought about transforming the data in code before I pass it on to the SSIS package in something like XML. That way I can use standard column names and data types.
So how should I solve this problem? Use SSIS or not?
Thnx :)
Wouter de Kort
A question and comment:
1. Why are you using SCD for insert/updates on non-dimension data? It doesn't sound like you have a warehouse and a simple insert/update scenario should work (lookup data, if row not found, then insert, else update). But as long as it's working for you, that's ok.
2. You will have various problems with non-standard columns or file formats. You need to get your customer to send a defined format. One way around this is to create a staging package, loading the data into staging tables and then running the insert/updates on it from a standardized interface. There are numerous ways to set up staging, and I use VIEWS for mine as they are easy to replicate.
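The lookup-then-insert-or-update pattern Wes describes in point 1 can be sketched in plain code. This is only an illustration -- in SSIS it would be a Lookup transformation routing rows to an insert or update path, and the table and key names here are hypothetical:

```python
# Sketch of the lookup / insert-else-update pattern, using an in-memory
# dict as a stand-in for the destination table (keyed on a business key).

def upsert(table, rows, key):
    for row in rows:
        k = row[key]
        if k in table:        # lookup hit -> update the existing row
            table[k].update(row)
        else:                 # lookup miss -> insert a new row
            table[k] = dict(row)

stores = {}
upsert(stores, [{"store_id": 1, "name": "Main St"}], key="store_id")
upsert(stores, [{"store_id": 1, "name": "Main Street"},
                {"store_id": 2, "name": "Elm Ave"}], key="store_id")
print(stores)
# -> {1: {'store_id': 1, 'name': 'Main Street'}, 2: {'store_id': 2, 'name': 'Elm Ave'}}
```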
Wes|||
1. I already thought that SCD was a little bit overkill for this problem, but it has the logic built in to check if a row is already present -> update it, otherwise insert it -- and that was just the thing I needed. Would it be easier to duplicate this functionality by using a lookup?
2. Of course it would be easiest to let the customer send a defined format... but we're not in a position to demand that from them (maybe in the future :))
I've started building code to convert the format the customer sends into a standard XML format which I can feed to my package, and the user can supply the mapping between the data and the XML schema. So they can send in whatever they want... the only 'problem' is that I will have to build code that manually reads an Excel/text/... file and parses it to create the XML structure, so I'm replicating some things that are already in SSIS :(
Could you tell me somewhat more about your staging package? Is that just an Insert into a custom build table that will hold the data? And then use a View to map it to the structure the SSIS package wants?
Wouter :)
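A mapping-driven converter like the one Wouter describes might look roughly like this. The column names, the mapping, and the XML element names are all invented for illustration; a real version would also need to handle Excel input rather than just delimited text:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Sketch: read a delimited file and emit a standard XML structure, using a
# customer-supplied mapping from their column names to our standard names.

def to_standard_xml(text, mapping):
    root = ET.Element("rows")
    for record in csv.DictReader(io.StringIO(text)):
        row = ET.SubElement(root, "row")
        for source_col, standard_col in mapping.items():
            ET.SubElement(row, standard_col).text = record[source_col]
    return root

data = "StoreNr,StoreName\n1,Main St\n2,Elm Ave\n"
mapping = {"StoreNr": "store_id", "StoreName": "name"}
xml_text = ET.tostring(to_standard_xml(data, mapping), encoding="unicode")
print(xml_text)
```

Because the mapping is just data, each customer's layout only needs a new mapping, not new code.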
|||The lookup method I'm talking about does a similar function as the SCD. It just sounds like you are doing constant updates which violates the idea of "slowly" changing dimensions. I'm not too keen on the SCD stage in SSIS so I prefer to build my own lookup/update mechanism unless it truly is a type 2 SCD, but like I said whatever is working for you that meets the requirements is sufficient. (look at the sticky at the top of the forums referencing Jamie Thompson's blog and how he handles this)
Before I explain my staging, how often do the clients change their file formats? I don't want to lead you on the wrong track.
Thanks.
|||Some of our clients send in an Excel sheet once a week with approximately 600 records. The strange thing is their system doesn't tell them if some stores are new or some are updated... it just gives the whole list.
But as I already said this could be changing in the future. Some other clients have requested Import functionality so they can connect their system/data to ours.
But what do you understand as slowly changing? My data has some modifications and some new records on each update, so I thought this would be handy because it nicely encapsulates the logic?
Wouter :)
|||
Razorblade wrote:
Some of our clients send in an Excel sheet once a week with approximately 600 records. The strange thing is their system doesn't tell them if some stores are new or some are updated... it just gives the whole list.
But as I already said this could be changing in the future. Some other clients have requested import functionality so they can connect their system/data to ours.
But what do you understand as slowly changing? My data has some modifications and some new records on each update, so I thought this would be handy because it nicely encapsulates the logic?
Wouter :)
Your input list is fine if it's mixed data. As stated, you can use the lookup function to process the list and sort it into new/inserted rows, and update or insert accordingly.
Nothing you have sounds slowly changing dimension(al), which is why I questioned your use of the SCD wizard. SCD (in theory) is for dimensions that don't change much, say a customer's home address or an employee's position. But if the SCD wizard is working for you, use it.
Go ahead and mark your question as answered if you are satisfied.
Thanks.
Importing data from txt file
I have 30 text files with the same format, for which I have to make an SSIS package so that I can populate my permanent tables. Because they are just record-keeping data, we are not very concerned about validating it. We just want to copy it and dump it in a permanent table, which should mean the least amount of work. I used one file as a sample and did advanced formatting to give column names. Then I built one simple data flow task that takes the flat file and transfers the data to an OLE DB SQL Server database table. Unfortunately, it keeps giving me an error. I believe this is because of NULLs. I want to take care of this without making staging tables and validating data. Any ideas?
Are you running this manually from within your SSIS package? If so, you should get reasonably good information about the problem in the Execution Results tab.
Are you sure your incoming ASCII fields are set wide enough to capture the data? If not, truncation errors will kill the process.
Are your table fields set up to accept NULLs?
Seth J Hersh
|||Well, we couldn't find a ready-made solution, so as a workaround we used a VB.Net script to pad each row to a set width. This seems to have solved the problem for now.
Thank You
Anjali
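The pad-to-fixed-width workaround Anjali mentions (done there in a VB.Net script) amounts to something like the sketch below; the record width is a placeholder, not a value from the thread:

```python
# Sketch: pad every line of a flat file to a fixed record width so that
# ragged-right rows don't break a fixed-width flat file source.

LINE_WIDTH = 80  # hypothetical record width

def pad_lines(lines, width=LINE_WIDTH):
    return [line.ljust(width) for line in lines]

padded = pad_lines(["short row", "a somewhat longer row"])
assert all(len(line) == LINE_WIDTH for line in padded)
```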
|||anj755 wrote:
I have 30 text files with the same format, for which I have to make an SSIS package so that I can populate my permanent tables. Because they are just record-keeping data, we are not very concerned about validating it. We just want to copy it and dump it in a permanent table, which should mean the least amount of work. I used one file as a sample and did advanced formatting to give column names. Then I built one simple data flow task that takes the flat file and transfers the data to an OLE DB SQL Server database table. Unfortunately, it keeps giving me an error. I believe this is because of NULLs. I want to take care of this without making staging tables and validating data. Any ideas?
Ummm, it might help if you posted the error. This is a real simple task you are trying to do. (I think.)