Is there any difference between DECIMAL and NUMERIC in SQL Server?
This is what the SQL:2003 standard (§6.1 Data Types) says about the two:
    <exact numeric type> ::=
        NUMERIC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
      | DECIMAL [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
      | DEC [ <left paren> <precision> [ <comma> <scale> ] <right paren> ]
      | SMALLINT
      | INTEGER
      | INT
      | BIGINT
      ...

    21) NUMERIC specifies the data type exact numeric, with the decimal precision and scale specified by the <precision> and <scale>.

    22) DECIMAL specifies the data type exact numeric, with the decimal scale specified by the <scale> and the implementation-defined decimal precision equal to or greater than the value of the specified <precision>.
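Read as a grammar, that just says both types take an optional precision and, optionally, a scale. The only difference the standard allows is in clause 22: an implementation may give DECIMAL *more* precision than you asked for, whereas NUMERIC must use exactly what you specify. SQL Server doesn't take advantage of that latitude, so in practice both look like this (a minimal sketch; the table and column names are made up for illustration):

    -- Both types accept (precision [, scale]), per the grammar above.
    CREATE TABLE PriceList (
        ListPrice  DECIMAL(10, 2),  -- up to 10 digits total, 2 after the decimal point
        UnitWeight NUMERIC(8, 3)    -- up to 8 digits total, 3 after the decimal point
    );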
To my knowledge there is no difference between the NUMERIC and DECIMAL data types in SQL Server. They are synonyms, and either one can be used. Both are exact numeric data types with fixed precision and scale.
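A quick sanity check in SQL Server bears this out; the following is a minimal sketch (the variable names are arbitrary):

    -- Both types store the value identically and compare equal.
    DECLARE @d DECIMAL(9, 2) = 123.45;
    DECLARE @n NUMERIC(9, 2) = 123.45;
    SELECT CASE WHEN @d = @n THEN 'equal' ELSE 'different' END AS comparison;

    -- sys.types lists them as two distinct system type ids,
    -- but their behavior is the same.
    SELECT name, system_type_id
    FROM sys.types
    WHERE name IN ('decimal', 'numeric');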
Edit:
Speaking to a few colleagues, maybe it has something to do with DECIMAL being the ANSI SQL standard and NUMERIC being the one Microsoft prefers, as it is more commonly found in programming languages. ...Maybe ;)