
Wednesday, March 28, 2012

Importing to SQL Server from a flat file: PK problem

I am trying to import into SQL Server from a .csv flat file I made in Excel. The problem is the primary key: it is set to auto-increment.
When I leave the column empty I get:
"Directcopyxform conversion error: destination does not allow null"

When I just omit the column entirely, I get a wrong-datatype error because the import basically shifts all the columns one position to the left.

How am I supposed to represent an auto-increment PK in a .csv file? Thanks
Use a staging table: load the file into it, perform any processing you need, insert from there into your target table, and finally clean out the staging table.
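A minimal T-SQL sketch of the staging approach, with made-up table and file names (dbo.Staging_Customers has the same columns as the .csv, dbo.Customers has the identity primary key):

BULK INSERT dbo.Staging_Customers FROM 'C:\import\customers.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

-- the identity column on dbo.Customers fills itself in
INSERT INTO dbo.Customers (Name, City, Phone)
SELECT Name, City, Phone FROM dbo.Staging_Customers

TRUNCATE TABLE dbo.Staging_Customers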

Importing text from a flat text file.

I have a DOS script that echoes some PC data to a local log file, but instead
of writing directly to this log file I'd like the output of this script to
populate a SQL 2000 database. How would I go about redirecting the output
from a flat text file to a SQL database?
Regards
John
>> How would I go about redirecting the output from a flat text file to a SQL database?
Some options include Bulk Insert, BCP IN or DTS. Simply search the index in
SQL Server Books Online & you'll find all the information you need to use
them. Also, for DTS, check out www.sqldts.com
Anith|||"ValleyBoy" <ValleyBoy@.discussions.microsoft.com> wrote in message
news:D54A35D3-BE24-458A-B371-27B0A239B043@.microsoft.com...
> I have a DOS scripts that echo's some PC data to a local log file but
instead
> of writing directly to this log file i'd like the output of this script to
> populate an SQL 2000 database. How would i go about redirecting the output
> from a flat text file to a SQL database.
> Regards
> John
Take a look at bcp, BULK INSERT and osql. I'm not sure which approach will
work with your given situation, or if it's even possible to have it
redirected. You may have to dump it to the flat file and then bcp or BULK
INSERT the data into SQL Server as part of another batch.
Another possible option is to have DTS run the DOS jobs and do something
with the input. I'm not a DTS expert, so I can't say whether or not this
approach is feasible either.
HTH
Rick Sawtell
MCT, MCSD, MCDBA
|||For this you might check DTS, available in BOL.
Alternatively, check the osql command in BOL, e.g.:
osql /U alma /P /i stores.qry
Or you can use BULK INSERT.
best Regards,
Chandra
http://chanduas.blogspot.com/
http://groups.msn.com/SQLResource/
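A hedged sketch of the batch approach Rick describes, with made-up file, table, and server names (and assuming the log is comma delimited):

REM append the script's output to a log file, then bulk load the log
myscript.cmd >> C:\logs\pcdata.txt
bcp MyDb.dbo.PCLog in C:\logs\pcdata.txt -c -t"," -S MYSERVER -T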

Friday, March 23, 2012

Importing multiple flat files to multiple tables in SSIS

I have a couple of hundred flat files to import into database tables using SSIS.

The files can be divided into groups by the format they use. I understand that I could import each group of files that have a common format at the same time using a Foreach Loop Container.

However, the example for the Foreach Loop Container has multiple files all being imported into the same database table. In my case, each file needs to be imported into a different database table.

Is it possible to import each set of files with the same format into different tables in a simple loop? I can't see a way to make a Data Flow Destination item accept its table name dynamically, which seems to prevent me doing this.

I suppose I could make a different Data Flow Destination item for each file, in the Data Flow. Would that be a reasonable solution, or is there a simpler solution, or should I just resign myself to making a separate Data Flow for every single file?

In an OLE DB destination you can select the "Table or view name from variable" data access mode; this way the destination can be dynamic.

|||Thanks, I missed that.|||

Any suggestions on the best way to assign the destination table name variable?

I imagine I could do it in a Script Task in the Foreach Loop before the Data Flow task is executed, but is there somewhere to do it more in-line, like building it into the Data Flow task's properties somehow?

|||

Use some logic to derive the table name based on the format, which I guess must be derivable from the filename?

You have the filename, so how do you expect to transform it into a table name? I would expect an expression to be used somewhere, on the tablename variable for example, but you may need to refer to some lookup table. An Execute SQL Task inside the loop could query a SQL table that gives you the destination table for a filename, and the Execute SQL Task result could then be assigned to the tablename variable.
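A minimal sketch of that lookup, assuming a hypothetical mapping table dbo.FileTableMap with FileName and DestinationTable columns (the Execute SQL Task's single-row result set is then mapped to the tablename variable):

SELECT DestinationTable
FROM dbo.FileTableMap
WHERE FileName = ?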

Wednesday, March 21, 2012

Importing from a Flat File

I have a flat file which has five fields, and I need to import it into a table that has six fields, the first of which will be data from a variable.

I have used a Data Flow task which has a Flat File source. I need help getting the data from there into the table.

Table: field1 (= @Variable), field2, field3, field4, field5, field6

File:  Col0, Col1, Col2, Col3, Col4 (these map to field2 through field6)

Can anyone explain how I can achieve this?

I would suggest using a derived column task to add a column to the pipeline that can be set to your variable.

Then I would use an OLE DB destination. You can set the target table in the OLE DB Destination.

Place the derived column task right after the flat file source then pipe the data into the destination.
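A small sketch of that derived column, assuming a package variable named User::Field1Value (the variable name is made up):

Derived Column Name: field1
Expression:          @[User::Field1Value]

In the OLE DB destination mappings, field1 then maps to the table's first column, and Col0 through Col4 map to the remaining five.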

HTH

Mark

http://spaces.msn.com/mgarnerbi

importing flat files to many tables

I'm trying to input a few thousand flat files into a few thousand tables in a SQL database, using SQL Server Business Intelligence Development Studio.

I'm using a Foreach Loop to read all the files in a directory.

The problem is I can only insert the data from all the files into one table.

Does anyone know a way to do multiple tables? Maybe using some sort of variable?


Use the multicast in the dataflow.

Kirk Haselden
Author "SQL Server Integration Services"

importing flat files to many tables

I'm trying to input a few thousand flat files into a few thousand tables in a SQL database.

I'm using Integration Services with a Foreach Loop to read all the files in a directory.

The problem is I can only insert the data from all the files into one table.

Does anyone know a way to do multiple tables? Maybe using some sort of variable?

Yes, you can, although the approach differs based on what you are trying to accomplish. If each file goes to a different table, use "table name from variable" as your data access mode. If you want every file to go to every table, use nested Foreach loops: for each file, for each table, input the data.
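A hedged sketch of the first option, assuming the Foreach enumerator returns just the file name into a User::FileName variable and the table name can be derived from it (all names here are made up): put an expression like

"dbo." + REPLACE(@[User::FileName], ".txt", "")

on a User::TableName variable, and point the destination's table-name-from-variable setting at User::TableName.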

Monday, March 12, 2012

importing datetime data

The flat file I import to the table is comma delimited with values like 2004/06/16 09:40:07.994, and I import this using DTS into a datetime field. But when I run a query on the table, the only thing I see is 2004/06/16; how come I can't see the time?
Thanks for your help!
What tool are you using to query the table? It may be set to display DATETIME values as only the date (this is common if using the regional settings on the client with an ODBC-based tool).

-PatP

Importing DATE with Timestamp(In a Flat file) Column using SSIS

Hi

SSIS is brand new for me; I've been playing with it for a few hours.

I am trying to import a flat file into a SQL Server database using SSIS.
One of the columns is in this format: "YYYYMMDDHH24MISS".

How do I get around this to import the data in a readable fashion into the destination?

Thanks!
MKR

Hi MKR,

What data type are you wanting the result to be?

You can use a derived column component to parse the format of the column and create anything you like -- a DT_DBTIMESTAMP, a string with your own format, etc...

You could turn the string in the format you have above into a string with this format, "YYYY-MM-DD HH:MM:SS", with an expression like this in a derived column (where I am assuming the string is in a column called 'Col'):

SUBSTRING(Col, 1, 4) + "-" + SUBSTRING(Col, 5, 2) + "-" + SUBSTRING(Col, 7, 2) + " " + SUBSTRING(Col, 9, 2) + ":" + SUBSTRING(Col, 11, 2) + ":" + SUBSTRING(Col, 13, 2)
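If a true date/time column is wanted rather than a string, the same expression can, as a rough sketch, be cast in the derived column:

(DT_DBTIMESTAMP)(SUBSTRING(Col, 1, 4) + "-" + SUBSTRING(Col, 5, 2) + "-" + SUBSTRING(Col, 7, 2) + " " + SUBSTRING(Col, 9, 2) + ":" + SUBSTRING(Col, 11, 2) + ":" + SUBSTRING(Col, 13, 2))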

Is that the sort of thing you are looking for?

Thanks
Mark

|||Thanks Mark.

But as I was telling you earlier, my knowledge of SSIS is very limited.
Now that I know we can manipulate the string...

Where do I do this? I mean, where do I add this SUBSTRING manipulation?

|||You want to add it in the data flow, using a derived column transformation.|||Thanks, Welch and Mark!

Importing Date Values from a flat file into the database

Hi

I am trying to import a flat file into a table in my database. I get all the values right except for the date; it keeps on inserting NULL values into the date fields.

The date format in the flat file is '20070708' etc.

Does anyone know what i can do to fix this?

I've tried to change the datatype values that it imports, but it still ignores it and inserts NULL values

Any help will be greatly appreciated

Kind Regards

Carel Greaves

Search this forum for "YYYYMMDD" and you'll find your answer along with examples.

Basically, you need to substring the date field into the various date parts and then assemble a date in a format such as mm/dd/yyyy before converting it into a datetime field.
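A rough derived-column sketch of that idea, assuming the flat-file column is called DateCol (a made-up name) and holds values like '20070708':

(DT_DBTIMESTAMP)(SUBSTRING(DateCol, 1, 4) + "-" + SUBSTRING(DateCol, 5, 2) + "-" + SUBSTRING(DateCol, 7, 2))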

|||Thanks

Importing Data Through ASP

Hello,
I've been trying to work with code to import data from a "flat file" (.csv)
into a SQL 2000 table.
I'm not that well versed in SQL and was wondering if it was possible using
OPENROWSET or something similar?
I've tried using OPENROWSET and encountered the error "Ad hoc access to OLE
DB provider 'MSDASQL' has been denied. You must access this provider through
a linked server."
Any help or resources to explain the process would be fantastic.
-Steve
If you're trying to upload through an ASP page you don't want to use
OPENROWSET. That's really for accessing OLE DB data sources. For an example
of uploading a file via ASP check out:
http://msdn.microsoft.com/library/d...tml/asp0900.asp
Mike O.
"Anubis" <anubis2003@.bluestreem.com.au> wrote in message
news:ucZJcoO1DHA.208@.TK2MSFTNGP12.phx.gbl...
quote:

> Hello,
> I've been trying to work with code to import data from a "flat file"

(.csv)
quote:

> into a SQL 2000 table.
> I'm not that well versed in SQL and was wondering is it was possible using
> OPENROWSET or something similar?
> I've tried using OPENROWSET and encountered the error "Ad hoc access to

OLE
quote:

> DB provider 'MSDASQL' has been denied. You must access this provider

through
quote:

> a linked server."
> Any help or resources to explain the process would be fantastic.
> -Steve
>
|||Hi Mike,
No, actually I wanted to upload data from a flat file (.csv) into a SQL
database like the following:
Flat File (x.csv) contains:
Col1, Col2, Col3
Value 0, Value1, Value 2,
Value 0, Value1, Value 2
Then do a command something like the following:
INSERT INTO dbo.table
SELECT Col1, Col2, Col3
FROM x.csv
Basically I want to open a rowset to insert into a SQL table but I cannot
open the rowset from the .csv source through an ASP command.
Any ideas?
Thanks
-Steve
"Michael Otey" <mikeo@.teca.com> wrote in message
news:eEOx90a1DHA.2336@.TK2MSFTNGP09.phx.gbl...
quote:

> If you're trying to upload through an ASP page you don't want to use
> OPENROWSET. That's really for accessing OLE DB data sources. For an

example
quote:

> of uploading a file via ASP check out:
>

http://msdn.microsoft.com/library/d...tml/asp0900.asp
quote:
[
color=darkred]
> Mike O.
>
> "Anubis" <anubis2003@.bluestreem.com.au> wrote in message
> news:ucZJcoO1DHA.208@.TK2MSFTNGP12.phx.gbl...
> (.csv)
using[QUOTE]
> OLE
> through
>
|||If you're running this on the SQL Server system and just want to import the
text file, then probably the simplest thing is to use the BULK INSERT
statement, something like the following:
BULK INSERT mydb..mytable FROM 'c:\mycsvfile.csv'
WITH (DATAFILETYPE = 'char',FIELDTERMINATOR = ',',ROWTERMINATOR = '\n')
Mike O.
"Anubis" <anubis2003@.bluestreem.com.au> wrote in message
news:%23eXL8Kb1DHA.2396@.TK2MSFTNGP09.phx.gbl...
quote:

> Hi Mike,
> No, actually I wanted to upload data from a flat file (.csv) into a SQL
> Database like to following
>
> Flat File (x.csv) contains:
> Col1, Col2, Col3
> Value 0, Value1, Value 2,
> Value 0, Value1, Value 2
> Then do a command something like the following:
> INSERT INTO dbo.table
> SELECT Col1, Col2, Col3
> FROM x.csv
> Basically I want to open a rowset to insert into a SQL table but I cannot
> open the rowset from the .csv source through an ASP command.
> Any idea's?
> Thanks
> -Steve
>
> "Michael Otey" <mikeo@.teca.com> wrote in message
> news:eEOx90a1DHA.2336@.TK2MSFTNGP09.phx.gbl...
> example
>

http://msdn.microsoft.com/library/d...tml/asp0900.asp
quote:
[
color=darkred]
> using
to[QUOTE]
>
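As a hedged aside on the original OPENROWSET question: if ad hoc access is allowed on the server, a rowset can be opened over a .csv through the Jet text provider, roughly like this (the folder, file, and table names are placeholders):

INSERT INTO dbo.mytable (Col1, Col2, Col3)
SELECT Col1, Col2, Col3
FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
    'Text;Database=C:\import\;HDR=YES',
    'SELECT * FROM x.csv')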

Wednesday, March 7, 2012

Importing Data From a Flat File

Hi,

I am trying to import data from a flat file in SQL Server 2005; this used to be a simple task in SQL2K but for some reason isn't in SQL2K5.

I am only trying to import part of the record, not the whole thing, and it keeps giving me this error:

**************************************************************

Messages
Warning 0x80047076: Data Flow Task: The output column "Column 3" (19) on output "Flat File Source Output" (2) and component "Source - consumer_0_dat" (1) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.
(SQL Server Import and Export Wizard)

Warning 0x80047076: Data Flow Task: The output column "Column 3" (19) on output "Flat File Source Output" (2) and component "Source - consumer_0_dat" (1) is not subsequently used in the Data Flow task. Removing this unused output column can increase Data Flow task performance.
(SQL Server Import and Export Wizard)

******************************************************************************

If I delete the columns and only keep those that I want inserted, then I get column length errors.

Can someone please provide instructions for importing a flat file into my database?

Thanks

These are just warnings; they should not stop your package from executing, unless there is an error happening somewhere else.

Importing data from a flat file

I have a flat file data source - call it "order". It's a text file that looks something like this:

ORDERNAME| Example1

CUSTOMER|Acme Industries
COST|11611
ITEMS
B1|550S162-43(33)|35.708|1|636
T1|550S162-43(33)|20.967|1|636
T2|550S162-43(33)|20.967|1|636
W1|350S162-43(33)|1.330|2|501
W21|350S162-43(33)|1.330|1|911
W2|350S162-43(33)|3.044|2|501
W20|350S162-43(33)|3.044|1|911

I would like to write the metadata to an [order header] table and the ITEMS to an [order detail] table. Can someone direct me to an example of something similar?

There are at least two ways to do this: either by using a custom script transform or by reading the source file twice, once for header rows, once for details. I posted a quick comparison of the options and a sample of the second approach on my blog (pardon the site, it's still under construction: http://agilebi.com/cs/blogs/jwelch/default.aspx). You didn't specify if the source file contains multiple orders, which does have some impact on the solution.

I'm sure there are other approaches for this as well. I'd be interested to hear about other techniques, as I've run into this problem a few times myself, and am not completely satisfied with the options I've explored.

|||

jwelch wrote:

There are at least two ways to do this, either by using a custom script transform [...]

You can use a conditional split instead of a custom script.
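A rough sketch of such a split, assuming the whole record is read into a single column called FullRow (a made-up name): in the sample file above, header rows contain at most one "|", so a condition like

FINDSTRING(FullRow, "|", 2) == 0

sends the ORDERNAME, CUSTOMER, COST, and ITEMS lines to a header output, while the default output keeps the detail rows.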

|||

I ran into a problem with the conditional split when my header and detail rows had a different number of columns. How'd you work around that with the conditional split?

|||

jwelch wrote:

I ran into a problem with the conditional split when my header and detail rows had a different number of columns. How'd you work around that with the conditional split?

Normally, I read in the full record as one big text string and work with it that way.|||Yeah, I touched on that in the blog post. Guess I just don't like string manipulation. |||Since my original post, I've tried writing a class that reads the file into a StreamReader object and then moves the data into a DataSet. Although not complete, this approach may turn out to be the simplest. If successful, I'll post the code once it's complete.|||

JohnBGood wrote:

Since my original post -- I've tried writing a class that reads the file into a StreamReader Object and then moves the data into a DataSet. Although not complete, this approach may turn out to be the simplest. If successful, I'll post the code once its complete.

The only problem with this approach is this:
Building a class (i.e., programming) isn't really simple for those who don't write code. It might be simple for you, but perhaps not for others; just something to consider.|||

I do understand your points. Writing code from scratch to do ETL instead of using an ETL tool such as SSIS does seem counterintuitive.

With that said, I did finally manage to get my C# ETL utility working correctly. To me it's an easier approach simply because I'm more comfortable with C# file I/O than SSIS. My initial thought was to use SSIS, but after reading up a little on working with nested data, it seemed to me that an SSIS approach would require more time (at least for me) to implement.

As far as the raw data file goes... it comes from a 3rd party system, so we don't have a lot of control over what they provide.

Thanks for taking the time to answer my questions. Very much appreciated. I'll post the code for my C# routine tomorrow AM.

|||

Here's the code in C#...

|||

using System.Data;
using System.IO;

public class BOM
{
    public DataSet GetOrderInfo(string logFileName) // bring the data file into a dataset
    {
        DataTable tblOrderInfo = CreateOrderInfo();      // helper that creates the header datatable (defined elsewhere in the class)
        DataTable tblConnectors = CreateConnectorInfo(); // created but not used below
        DataTable tblMembers = CreateMemberInfo();
        DataSet bom = new DataSet();

        if (File.Exists(logFileName))
        {
            FileStream fs = new FileStream(logFileName, FileMode.Open, FileAccess.Read, FileShare.None);
            StreamReader sr = new StreamReader(fs);
            string Line;
            string[] items = new string[50];
            int i = 0;

            // header lines: everything up to the ITEMS marker goes into the order info table
            while ((Line = sr.ReadLine()) != "ITEMS")
            {
                items[i] = Line;
                string[] columns = Line.Split(new char[1] { '|' });
                DataRow newrow = tblOrderInfo.NewRow();
                int j = 0;
                foreach (string Data in columns)
                {
                    newrow[j] = Data;
                    j++;
                }
                tblOrderInfo.Rows.Add(newrow);
                i++;
            }

            // detail lines: everything after ITEMS goes into the members table
            while ((Line = sr.ReadLine()) != null)
            {
                items[i] = Line;
                string[] columns = Line.Split(new char[1] { '|' });
                DataRow newrow = tblMembers.NewRow();
                int j = 0;
                foreach (string Data in columns)
                {
                    newrow[j] = Data;
                    j++;
                }
                tblMembers.Rows.Add(newrow);
                i++;
            }

            bom.Tables.Add(tblOrderInfo);
            bom.Tables.Add(tblMembers);
            fs.Close();

            return bom;
        }
        else
        {
            return null;
        }
    }
}

Friday, February 24, 2012

importing data

we are trying to import data from a flat file using an uptick (`) as a column separator and {CR/LF} as a record terminator. There is a variable number of columns for each record. The initial record in the flat file has 3 columns. Upon processing this record, the import sets all records to 3 columns and does not read the column separators past the second column (even though there may be up to 7 columns in the record).

This method worked ok in DTS2000 and it works with Excel. Any suggestions?

Thank You

You'll need to set up as many columns as you can have in the data. It's tricky because SSIS doesn't handle variable columns very well without implementing some workarounds (read the entire row in as one field and later split the field up using substrings, or other solutions; see the sketch below).

Take a look at this forum using the search feature. This has been discussed quite a bit recently.
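A rough derived-column sketch of the split-by-substrings idea, assuming the whole row is read into a single column called FullRow (a made-up name) with the uptick as delimiter; the first two columns would look something like:

Col1: SUBSTRING(FullRow, 1, FINDSTRING(FullRow, "`", 1) - 1)
Col2: SUBSTRING(FullRow, FINDSTRING(FullRow, "`", 1) + 1, FINDSTRING(FullRow, "`", 2) - FINDSTRING(FullRow, "`", 1) - 1)

Rows with fewer delimiters need a FINDSTRING test (or a conditional split) first, since FINDSTRING returns 0 when the requested occurrence is missing.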

|||You may try DataDefractor. It handles variable columns quite nicely.

Regards,
Ivan

Ivan Peev | http://www.cozyroc.com/


Sunday, February 19, 2012

Importing a Sybase database into SQL Server

Hi

Has anyone done this before? I have been given some flat files for a Sybase database to import into SQL Server, yet SQL Server doesn't want to know. I have a .db and a .log, but SQL Server doesn't like either of them.

However, I'm not sure what to do. I have been given a .db and a .log, but SQL Server doesn't like either of these; is there something I should do to make them compatible? First time I've ever looked at a Sybase server, so please use short words. Thanks, Ed
You can't restore a Sybase database onto an MS-SQL server. Your best bet is to restore the files to another Sybase server of exactly the same version/service pack (they are really persnickety about that), then use either DTS or a linked server to move the schema/data to an empty MS-SQL database.

-PatP|||Right, okay, I think we will have to look at talking to the people who have provided the database.

Thanks for the info!|||Microsoft has a Sybase to SQL Server migration tool. I think it's available in the SQL Server 2000 Resource Kit. It works really well.|||Excellent point, Peter! Ed really should check out Chapter 6 (http://www.microsoft.com/resources/documentation/sql/2000/all/reskit/en-us/part2/c0661.mspx) of the SQL Resource Kit before he goes too far with this process!

-PatP|||I'd use the DTS utility just like if you were importing an Excel file into Access.|||Excellent, thanks for the help.

I like the sound of the tool in the "SQL Server 2000 Resource Kit". The only thing is I'm not sure how to get hold of it. Do you have to order the CD or is it something I can download?|||Most of the larger bookstores stock the SQL Server Resource Kit. Drop by your friendly neighborhood Barnes and Noble, Borders, etc. and you should have no problem. If push comes to shove, you can always order it online (http://search.barnesandnoble.com/booksearch/isbnInquiry.asp?userid=ZYw1fKKpGI&isbn=0735612668&itm=3) for delivery tomorrow.

-PatP