Should you choose the MONEY or DECIMAL(x,y) datatypes in SQL Server?


You should never use money. It is not precise, and it is pure garbage; always use decimal/numeric.

Run this to see what I mean:

DECLARE
    @mon1 MONEY,
    @mon2 MONEY,
    @mon3 MONEY,
    @mon4 MONEY,
    @num1 DECIMAL(19,4),
    @num2 DECIMAL(19,4),
    @num3 DECIMAL(19,4),
    @num4 DECIMAL(19,4)

SELECT
    @mon1 = 100, @mon2 = 339, @mon3 = 10000,
    @num1 = 100, @num2 = 339, @num3 = 10000

SET @mon4 = @mon1/@mon2*@mon3
SET @num4 = @num1/@num2*@num3

SELECT @mon4 AS moneyresult,
       @num4 AS numericresult

Output:

moneyresult           numericresult
--------------------- ---------------------
2949.0000             2949.8525
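If you are stuck with money columns, the workaround is to cast to decimal before dividing, so the intermediate quotient is computed with decimal arithmetic. A minimal sketch (not part of the original script; the values are repeated for self-containment):

DECLARE @mon1 MONEY = 100,
        @mon2 MONEY = 339,
        @mon3 MONEY = 10000

-- Casting the first operand to DECIMAL promotes the whole expression
-- to decimal arithmetic, so the quotient keeps its fractional digits.
SELECT CAST(@mon1 AS DECIMAL(19,4)) / @mon2 * @mon3 AS castresult  -- 2949.8525...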

To those who said that you don't divide money by money:

Here is one of my queries that calculates correlations; changing its types to money gives wrong results.

select t1.index_id, t2.index_id,
       (avg(t1.monret * t2.monret)
           - (avg(t1.monret) * avg(t2.monret)))
       / ((sqrt(avg(square(t1.monret)) - square(avg(t1.monret))))
        * (sqrt(avg(square(t2.monret)) - square(avg(t2.monret))))),
       current_timestamp,
       @MaxDate
from Table1 t1
join Table1 t2 on t1.Date = traDate
group by t1.index_id, t2.index_id
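A minimal sketch of why this breaks, with hypothetical values: money-times-money keeps only 4 decimal places, so the small squared returns inside the averages above are rounded away:

DECLARE @a MONEY = 0.0050,
        @b MONEY = 0.0050

SELECT @a * @b AS money_product,  -- 0.0000: the true product 0.000025 is rounded away
       CAST(@a AS DECIMAL(10,4)) * CAST(@b AS DECIMAL(10,4)) AS decimal_product  -- 0.00002500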


SQLMenace said money is inexact. But you don't multiply/divide money by money! How much is 3 dollars times 50 cents? 150 dollarcents? You multiply/divide money by scalars, which should be decimal.

DECLARE
    @mon1 MONEY,
    @mon4 MONEY,
    @num1 DECIMAL(19,4),
    @num2 DECIMAL(19,4),
    @num3 DECIMAL(19,4),
    @num4 DECIMAL(19,4)

SELECT
    @mon1 = 100,
    @num1 = 100, @num2 = 339, @num3 = 10000

SET @mon4 = @mon1/@num2*@num3
SET @num4 = @num1/@num2*@num3

SELECT @mon4 AS moneyresult,
       @num4 AS numericresult

This produces the correct result:

moneyresult           numericresult
--------------------- ---------------------
2949.8525             2949.8525

money is good as long as you don't need more than 4 decimal digits, and you make sure your scalars - which do not represent money - are decimals.
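To see that 4-digit limit concretely, here is a small check with illustrative values:

DECLARE @m MONEY         = 1.00006,
        @d DECIMAL(19,5) = 1.00006

SELECT @m AS money_value,   -- 1.0001: money silently rounds to its fixed scale of 4
       @d AS decimal_value  -- 1.00006: decimal keeps the declared scale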


Everything is dangerous if you don't know what you are doing

Even high-precision decimal types can't save the day:

declare @num1 numeric(38,22)
declare @num2 numeric(38,22)
set @num1 = .0000006
set @num2 = 1.0
select @num1 * @num2 * 1000000

1.000000 <- Should be 0.6000000
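What happens here: the product of two numeric(38,22) values would need precision 77 and scale 44; SQL Server caps the result at precision 38 and cuts the scale down to 6, so the intermediate 0.0000006 is rounded to 0.000001 before the final multiplication. Declaring no more precision and scale than you actually need avoids the truncation; a sketch:

declare @num1 numeric(18,7)
declare @num2 numeric(18,7)
set @num1 = .0000006
set @num2 = 1.0
-- numeric(18,7) * numeric(18,7) yields numeric(37,14), which fits within
-- the precision-38 cap, so no intermediate rounding occurs.
select @num1 * @num2 * 1000000  -- 0.600000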


The money types are integers

The text representations of smallmoney and decimal(10,4) may look alike, but that doesn't make them interchangeable. Do you cringe when you see dates stored as varchar(10)? This is the same thing.

Behind the scenes, money/smallmoney are just a bigint/int. The decimal point in the text representation of money is visual fluff, just like the dashes in a yyyy-mm-dd date. SQL Server doesn't actually store either internally.
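You can see this yourself; a quick sketch with an arbitrary value (casting money to binary exposes the underlying 8-byte integer of ten-thousandths):

DECLARE @m MONEY = 123.4567

SELECT CAST(@m AS BINARY(8)) AS raw_bytes,         -- 0x000000000012D687
       CAST(@m * 10000 AS BIGINT) AS scaled_units  -- 1234567 ten-thousandths of a unit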

Regarding decimal vs money, pick whatever is appropriate for your needs. The money types exist because storing accounting values as integer multiples of 1/10000th of a unit is very common. Also, if you are dealing with actual money and calculations beyond simple addition and subtraction, you shouldn't be doing that at the database level! Do it at the application level with a library that supports banker's rounding (IEEE 754).
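As a quick illustration of why: T-SQL's built-in ROUND uses round-half-away-from-zero, not the round-half-to-even ("banker's rounding") that financial code usually wants:

SELECT ROUND(2.5, 0) AS round_a,  -- 3.0; banker's rounding would give 2
       ROUND(3.5, 0) AS round_b   -- 4.0; banker's rounding agrees here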