
Wednesday, March 28, 2012

Importing text file removes decimal separator

Hi,

I'm trying to import a semicolon-separated text file into a SQL db. I have a field in the text file that contains decimal numbers. A comma is used as the decimal separator (15,35). When I use a DTS package to create a destination table and import all rows, the field is created as a float field. In this field the decimal comma is removed, so the number in SQL becomes 1535. If I change the decimal separator to (.) it works OK, but I need to get it to work with a comma as the decimal separator. In the DTS package the field from the text file is recognised as varchar(8000). Any ideas?

Ingar

You have to choose an appropriate locale that treats the comma as a decimal delimiter. There is a drop-down control for choosing locales on the first page of the Flat File Connection Manager UI.

Thanks.
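
For illustration, here is a minimal VB.NET sketch (not from the thread) of what a comma-aware locale does during conversion. The "nb-NO" culture is only an example of a culture whose decimal separator is a comma:

Imports System.Globalization

Module CommaDecimalDemo
    Sub Main()
        ' Example value from the question: "15,35", with a comma as the decimal separator.
        Dim raw As String = "15,35"

        ' A culture whose decimal separator is a comma ("nb-NO" is used here
        ' purely as an example) parses the value correctly.
        Dim value As Decimal = Decimal.Parse(raw, New CultureInfo("nb-NO"))

        ' Prints 15.35
        Console.WriteLine(value.ToString(CultureInfo.InvariantCulture))
    End Sub
End Module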

|||

Hi, thanks for your answer. I can see that I didn't specify which SQL version I was using. I use SQL 2000, not 2005. I know that the theme in this forum is SSIS, but it also states "transforms/data flow" and I thought DTS packages from SQL 2000 were included here. I apologize if I have misunderstood this, but English is not my first language.

So as far as I know, the Flat File Connection Manager is not available in SQL 2000.

But thanks anyway.

Ingar

Monday, March 19, 2012

Importing EBCDIC file with COMP3 fields

Hi All,

I have a file with several columns in Comp-3.

I have downloaded the UnPack Decimal component and, as it needs a byte stream (DT_BYTES) as input, I set up the Flat File Source columns accordingly.

But I get this error:

[Flat File Source [2201]] Error: Data conversion failed. The data conversion for column "DTCDC" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".

[Flat File Source [2201]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "output column "DTCDC" (2250)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "DTCDC" (2250)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.

DTCDC is the first of the packed columns. These are mostly date columns packed into 5 bytes that should be unpacked to a normal SQL date.

I've tried different locales, as suggested in other threads, but it didn't help.

Can anybody help me with this issue? How can I do it (in a VB.NET script, importing as String)?

Thanks in advance

Michal

Did you remember to go into the advanced editor and mark the fields as binary with the correct column width?

|||

Hi EWisdahl,

Indeed it helped :)

UnPackDecimal does a pretty good job too.

Thanks,

Michal
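
For reference, a rough VB.NET sketch of how COMP-3 (packed decimal) bytes unpack into a number. This is only an illustration of the format with made-up sample bytes and scale, not the UnPack Decimal component's actual code:

Module UnpackComp3Demo
    ' Each COMP-3 byte holds two BCD digits; the low nibble of the last byte is
    ' the sign (&HC or &HF = positive, &HD = negative).
    Function UnpackComp3(packed As Byte(), scale As Integer) As Decimal
        Dim digits As New System.Text.StringBuilder()
        For i As Integer = 0 To packed.Length - 1
            digits.Append(CInt(packed(i) >> 4))           ' high nibble
            If i < packed.Length - 1 Then
                digits.Append(CInt(packed(i) And &HF))    ' low nibble (not the sign byte)
            End If
        Next

        Dim value As Decimal = Decimal.Parse(digits.ToString()) / CDec(Math.Pow(10, scale))

        ' The low nibble of the last byte carries the sign.
        If (packed(packed.Length - 1) And &HF) = &HD Then value = -value
        Return value
    End Function

    Sub Main()
        ' Made-up sample: bytes 01 23 45 67 8C with an assumed scale of 2 gives 123456.78.
        Dim sample As Byte() = {&H1, &H23, &H45, &H67, &H8C}
        Console.WriteLine(UnpackComp3(sample, 2))
    End Sub
End Module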

|||

Just as a reminder, if your question is answered, mark the reply as the answer. It helps out those who are looking for answers to the same type of questions later, as well as those who try to look through all of the unanswered posts...

|||

If your file gets too complicated, there is a custom source component at www.aminosoftware.com that is supposed to do conversion to ASCII on the fly, including COMP-3 with redefines, occurs, and other nasty EBCDIC nuances.

Importing Decimal Data Types into SQL Server 2005

I have a simple Integration Services project and the problem is that decimal fields are importing as real (I'm losing the digits behind the decimal point).

The project contains a data flow task importing a flat file (.csv) to an SQL Server destination. My .csv file has two decimal type fields. Example:

Field 1: 12345.67

Field 2: .123456

My database table that I'm importing to has two fields. The only way that I can get this data to import is to define the fields as "float" in both the text file and database table. I want the database table fields to be defined as decimal or numeric, not float:

Field 1: decimal(7,2)

Field 2: decimal(6,6)

When all fields are defined as decimal (in both the flat file connection and the database table), I get the following results:

Field 1: 12345.00

Field 2: .000000

How does one import decimal data from a flat file (.csv)?

Thank you in advance!

I answered my own question. My eyes were just not seeing the DataScale property of the Advanced connection manager editor window. Once I plugged in the scale value my decimals imported perfectly.

jn
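
As a purely illustrative VB.NET sketch (this is not the SSIS code path), here is why a missing scale produces results like the ones above: with a scale of 0 the fractional digits are simply truncated away:

Imports System.Globalization

Module ScaleDemo
    Sub Main()
        Dim field1 As Decimal = Decimal.Parse("12345.67", CultureInfo.InvariantCulture)
        Dim field2 As Decimal = Decimal.Parse(".123456", CultureInfo.InvariantCulture)

        ' With a scale of 0 the fraction is cut off, which matches the
        ' 12345.00 / .000000 results reported above.
        Console.WriteLine(Decimal.Truncate(field1))   ' 12345
        Console.WriteLine(Decimal.Truncate(field2))   ' 0

        ' With the scales the question asks for (2 and 6), the values survive.
        Console.WriteLine(Decimal.Round(field1, 2))   ' 12345.67
        Console.WriteLine(Decimal.Round(field2, 6))   ' 0.123456
    End Sub
End Module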

Friday, March 9, 2012

Importing data from Oracle to SQL losing data after the decimal point

I have created a simple package that uses a SQL command to pull data from an Oracle database and inserts the data into a SQL 2005 table. Some of the data fields that I am pulling from contain two digits after the decimal point; however, these digits are lost when the data gets into SQL. I have even tried putting the data into a flat file, and still the data is lost.

In the package I have an OLE DB source connection, which is the Oracle database, and when I do the preview I see all the data I need. I am very confused and have tried a number of things to get the data into SQL, but none work. Any ideas would be very helpful.

thanks

What is the data type of the SSIS column that you are storing the Oracle results in? Also double check that you have a "scale" defined for that data type.

|||

I have tried money, decimal(38,2), float and numeric(18,2), with no luck.

What is really weird is that when I use a flat file destination the same thing happens: I lose all digits after the decimal.

|||

Are you just going source to destination? No other transformations?

|||

That's correct.

|||

Do you receive any truncation warnings when you run the package? (Or ANY warnings?)

|||

I don't get any warnings if I use numeric(38,2).

I let the package create the table. The package wants to use numeric(38), but I changed it to numeric(38,2) in the CREATE TABLE statement in the destination window.

Sunday, February 19, 2012

Importing a TXT file to MSSQL

Hi, I have a TXT file with several fields separated by semicolons.

But after the import is done, the decimal field doesn't appear with a comma.

In other words, if I type 123,45678 into a database using Enterprise Manager I get exactly 123,45678, but from the TXT file I get 1234568.

What am I missing?
Tks
Carlos Lages
PS: the TXT file has "12345678"; even if I put "123,45678" in the TXT file, I get the same result in the database.

In your table definition, is the field a character field or numeric? What data type is it? This is just a thought, but maybe during the import job (DTS?) the field is being treated as some form of numeric field and the "," is being ignored as not relevant.

Hope this helps, or am I on the wrong track?
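
A minimal VB.NET sketch (hypothetical, not from the thread) of the behaviour the reply describes: when a string like "123,45678" is parsed under a locale where the comma is only a group separator, the comma is silently dropped; under a comma-decimal locale the fractional part is kept:

Imports System.Globalization

Module CommaSeparatorDemo
    Sub Main()
        Dim raw As String = "123,45678"

        ' Invariant culture: the comma is a thousands separator, so it is
        ' ignored and the digits run together.
        Dim ignored As Decimal = Decimal.Parse(raw, NumberStyles.Number, CultureInfo.InvariantCulture)
        Console.WriteLine(ignored)   ' 12345678

        ' A culture whose decimal separator is a comma ("pt-BR" is only an
        ' example) keeps the fractional part.
        Dim kept As Decimal = Decimal.Parse(raw, NumberStyles.Number, New CultureInfo("pt-BR"))
        Console.WriteLine(kept.ToString(CultureInfo.InvariantCulture))   ' 123.45678
    End Sub
End Module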
