Declare @BadDecimal varchar(5)
Set @BadDecimal = '4.5'
Declare @GoodDecimal Decimal
Set @GoodDecimal = @BadDecimal
Select @GoodDecimal

--Outputs 5

Why?

digiguru
4 Answers


Try

Declare @GoodDecimal Decimal(2,1)

edit: changed to (2,1) after request.
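Plugging that declaration into the question's script gives the expected result (a sketch; variable names follow the question):

```sql
Declare @BadDecimal varchar(5)
Set @BadDecimal = '4.5'

-- Decimal(2,1): at most 2 digits total, 1 of them to the right of the point
Declare @GoodDecimal Decimal(2,1)
Set @GoodDecimal = @BadDecimal

Select @GoodDecimal -- 4.5, no rounding
```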

Coentje

Coentje beat me to it, arrggg...

You need to declare your decimals using the format Decimal(p,s):

p = The maximum total number of decimal digits that can be stored, both to the left and to the right of the decimal point. The precision must be a value from 1 through the maximum precision of 38. The default precision is 18.

s = The maximum number of decimal digits that can be stored to the right of the decimal point. Scale must be a value from 0 through p. Scale can be specified only if precision is specified. The default scale is 0; therefore, 0 <= s <= p. Maximum storage sizes vary, based on the precision.
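Those defaults are exactly why the question's code outputs 5: with no (p,s) given, Decimal means Decimal(18,0), so the fractional part is rounded away on assignment. A small comparison (my own illustration):

```sql
-- Bare Decimal defaults to Decimal(18,0): scale 0, no fractional digits
Declare @Implicit Decimal
Set @Implicit = '4.5'
Select @Implicit   -- 5: rounded to the default scale of 0

-- Giving a non-zero scale keeps the fraction
Declare @Explicit Decimal(18,1)
Set @Explicit = '4.5'
Select @Explicit   -- 4.5
```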

Gareth

Try Set @BadDecimal = '4,5'

user48864

Casting or converting to an explicit decimal(precision, scale) also works:

SELECT CONVERT(DECIMAL(5, 2), '4.5')
SELECT CAST('4.5' AS DECIMAL(5,2))

SQL Server Books Online states that each combination of precision and scale is treated as a separate data type.
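The declared scale also controls rounding on conversion; a short illustration (the input values here are my own examples):

```sql
SELECT CAST('4.5' AS DECIMAL(5,2))    -- 4.50: padded to two fractional digits
SELECT CAST('4.567' AS DECIMAL(5,2))  -- 4.57: rounded to the declared scale
SELECT CAST('4.5' AS DECIMAL(5,0))    -- 5: scale 0 rounds to a whole number
```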

devio