
How to add xml encoding <?xml version="1.0" encoding="UTF-8"?> to xml Output in SQL Server


You have to add it manually. SQL Server always stores xml internally as UCS-2, so it is impossible for SQL Server to generate a UTF-8 encoding header.

See "Limitations of the xml Data Type" on MSDN

The XML declaration PI, for example, <?xml version='1.0'?>, is not preserved when storing XML data in an xml data type instance. This is by design. The XML declaration (<?xml ... ?>) and its attributes (version/encoding/stand-alone) are lost after data is converted to type xml. The XML declaration is treated as a directive to the XML parser. The XML data is stored internally as ucs-2.
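
A minimal sketch of that behavior (an illustrative example, not from the documentation): whatever declaration you supply is dropped as soon as the value is stored as xml.

DECLARE @x XML = '<?xml version="1.0"?><root>hi</root>';
SELECT @x;                        -- returns <root>hi</root> (declaration dropped)
SELECT CAST(@x AS VARCHAR(MAX));  -- still no declaration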


When I read this I thought it was the "end of the line"... no solution... and I almost gave up on the approach. But there is in fact a way to work around this limitation: convert the XML to varchar(max) and then prepend the declaration to the string. The following post shows how:

Using SQL Server "FOR XML": Convert Result Datatype to Text/varchar/string whatever?

A simple example would look something like this:

SELECT 'MY DATA' AS MyColumn INTO #MyTable;

SELECT '<?xml version="1.0" encoding="UTF-8"?>'
       + CAST((SELECT MyColumn FROM #MyTable FOR XML PATH('')) AS VARCHAR(MAX)) AS XmlData;

DROP TABLE #MyTable;

The Output:

<?xml version="1.0" encoding="UTF-8"?><MyColumn>MY DATA</MyColumn>


The accepted answer of "add it manually", while technically correct, is incomplete and hence misleading. Simply adding the XML declaration with whatever "encoding" you want doesn't change the actual encoding of the string. This is sometimes ok. If you specify "UTF-8" and convert the XML data to VARCHAR, then as long as all of the characters are standard ASCII characters (values 1 - 127), then sure, it's UTF-8 (at least there is no noticeable difference). BUT, if there are any characters with values 128 or above, then you do not have a UTF-8 encoded XML document. And if you convert the XML data to NVARCHAR, then you have a UTF-16 encoded document, regardless of what you manually specify in the XML declaration. You should only be specifying an encoding IF it is the actual encoding being used.
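
A hedged sketch of that point (illustrative only; it assumes a database using a Latin1 code-page collation for VARCHAR): prepending encoding="UTF-8" does not change the bytes the string actually contains.

DECLARE @doc NVARCHAR(MAX) =
    N'<?xml version="1.0" encoding="UTF-8"?>'
    + CAST((SELECT N'café' AS MyColumn FOR XML PATH('')) AS NVARCHAR(MAX));

-- 'é' as a single code-page byte (0xE9), not the UTF-8 sequence 0xC3 0xA9:
SELECT CONVERT(VARBINARY(200), CAST(@doc AS VARCHAR(MAX)));
-- 'é' as UTF-16 bytes, regardless of what the declaration claims:
SELECT CONVERT(VARBINARY(200), @doc);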

And until SQL Server 2019 (currently in beta at CTP 2.1), there was no way to get the encoding to be UTF-8 within SQL Server, at least not without using SQLCLR. But in SQL Server 2019, you can now convert the XML to actual UTF-8:

DECLARE @XML XML;
SET @XML = N'<test attr="&#x1F60E;"/>';

SELECT @XML,
       CONVERT(VARBINARY(100), CONVERT(NVARCHAR(MAX), @XML)), -- UTF-16 / UCS-2
       CONVERT(VARBINARY(100),
               CONVERT(VARCHAR(MAX),
                       CONVERT(NVARCHAR(MAX), @XML) COLLATE Latin1_General_100_CI_AS_SC_UTF8)
              ); -- UTF-8

That returns:

Column 1: <test attr="😎" />
Column 2: 0x3C007400650073007400200061007400740072003D0022003DD80EDE22002F003E00
Column 3: 0x3C7465737420617474723D223F3F222F3E

Since many people won't be on SQL Server 2019 for a while yet, this is still possible via SQLCLR. You can use the .NET XML classes (e.g. XmlWriter) to export the XML with various options. In fact, I created a SQLCLR library of functions, SQL#, that includes such a function: XML_SaveToFile. It allows you to specify any valid encoding, and it will both set that encoding in the XML declaration and ensure that the file is actually saved with that encoding. It also has options for indenting, newlines, etc. Just FYI: while there are many functions available in the Free version, XML_SaveToFile is only available in the Full (paid) version.