Bulk Insert - Row Terminator for UNIX file + "\l" row terminator

The issue you are having is actually not due to the Row Terminator. I suspect, along with the End of File error, you also saw something similar to the following:

Msg 4864, Level 16, State 1, Line 1
Bulk load data conversion error (type mismatch or invalid character for the specified codepage) for row 1, column 4 ({column_name}).

While what I said below the line is still valid regarding the ROWTERMINATOR, the real issue is indicated by your statement of:

[the] table that has 7 columns. The data file only has the first 4 columns, the rest should be left NULL.

This is the issue. When using BULK INSERT, the data file has to have the same number of fields as the table being inserted into. If that is not the case, then you have to use the FORMATFILE = 'format_file_path' option, in which case you need to create a Format File and specify its location.
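For reference, a non-XML format file for a 4-column, comma-delimited file might look like the following (a sketch: the column names, lengths, and the version number are assumptions; match the version to your SQL Server release, and use SQLNCHAR instead of SQLCHAR if the file is Unicode / widechar):

```
12.0
4
1  SQLCHAR  0  100  ","    1  Col1  SQL_Latin1_General_CP1_CI_AS
2  SQLCHAR  0  100  ","    2  Col2  SQL_Latin1_General_CP1_CI_AS
3  SQLCHAR  0  100  ","    3  Col3  SQL_Latin1_General_CP1_CI_AS
4  SQLCHAR  0  100  "\n"   4  Col4  SQL_Latin1_General_CP1_CI_AS
```

Note that the final field's terminator is "\n" rather than the "\r\n" bcp would normally generate, since the file has UNIX line endings.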

I thought you could get away with the easier OPENROWSET(BULK...) so that you could INSERT...SELECT just those four columns. But that doesn't allow you to specify a ROWTERMINATOR without using a Format File either, so you need the Format File in either case.
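With a Format File in hand, the OPENROWSET(BULK...) form lets you target just the four populated columns (a sketch; the table name, column names, and file paths are hypothetical):

```sql
INSERT INTO dbo.STAGING_TABLE (Col1, Col2, Col3, Col4)
SELECT Col1, Col2, Col3, Col4
FROM OPENROWSET(BULK 'C:\import\data.txt',
                FORMATFILE = 'C:\import\data.fmt') AS src;
```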

OR, you could just import into a different staging table that only has 4 columns, and then either:

  • dump that into your current STAGING_TABLE, or

  • do an ALTER TABLE to add the 3 missing columns (it is more efficient to just add 3 NULLable fields than to transfer the data from one table to another :-).
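The staging-table route might look like this (a sketch; table names, column names, and datatypes are assumptions):

```sql
-- Narrow staging table matching the 4 fields in the data file
CREATE TABLE dbo.STAGING_4COL
(
    Col1 VARCHAR(100),
    Col2 VARCHAR(100),
    Col3 VARCHAR(100),
    Col4 VARCHAR(100)
);

-- Load the UNIX-terminated file (0x0A = linefeed)
BULK INSERT dbo.STAGING_4COL
FROM 'C:\import\data.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0A');

-- Dump into the 7-column table; the remaining 3 columns stay NULL
INSERT INTO dbo.STAGING_TABLE (Col1, Col2, Col3, Col4)
SELECT Col1, Col2, Col3, Col4
FROM dbo.STAGING_4COL;
```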

OR, as mentioned by @PhilipKelley in a comment on this answer, you could create a View with just those four fields and have that be the destination/target. One caveat: if you were taking the appropriate steps to make the operation minimally logged, the MSDN page for Prerequisites for Minimal Logging in Bulk Import does not say one way or the other what the effect of targeting a View will be.
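The View approach might look like this (a sketch; the view name, column names, and path are hypothetical):

```sql
-- View exposing only the 4 columns present in the data file
CREATE VIEW dbo.STAGING_TABLE_4Col AS
SELECT Col1, Col2, Col3, Col4
FROM dbo.STAGING_TABLE;
GO

-- BULK INSERT into the view; the 3 columns not in the view stay NULL
BULK INSERT dbo.STAGING_TABLE_4Col
FROM 'C:\import\data.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0A');
```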

Most likely the \l was just interpreted as those two literal characters, which is why the ROWTERMINATOR was not respected when you tried it.

The 0x0A terminator will work; I have tested it and it behaves as expected. Your statement should look like the following:
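(A sketch, assuming a comma-delimited file; the table name and file path are placeholders:)

```sql
BULK INSERT dbo.STAGING_TABLE
FROM 'C:\import\data.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '0x0A');
```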


I tried both with and without a 0x0A character at the end of the final line and both worked just the same.

I then removed one of the commas from one of the lines, leaving it with less than the full set of fields, and that is when I got the following error:

Msg 4832, Level 16, State 1, Line 2
Bulk load: An unexpected end of file was encountered in the data file.
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 2
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".

Make sure that all of the rows in the data file have the required number of field separators (commas, in this case). You mentioned having 4 columns in the file, so that should be 3 commas per row.
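One way to verify this before loading is a quick awk pass (a sketch; it assumes a comma-delimited file with no quoted fields, and the data.txt name is a placeholder):

```shell
# Create a small sample file; row 2 is missing a field.
printf 'a,b,c,d\ne,f,g\n' > data.txt

# Print the line number and field count of any row that does not
# have exactly 4 comma-separated fields (i.e. 3 commas).
awk -F',' 'NF != 4 { print NR ": " NF " fields" }' data.txt
# prints: 2: 3 fields
```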

I'd comment to ask these but my reputation is not high enough.

I believe "\l" is "linefeed", so that would mesh with your seeing 0A in the file encoding.

My first question would be: what character encoding are your data files in? And what are the datatypes on your table columns?

I would guess that this is going to be a character encoding issue. I see your DATAFILETYPE is 'widechar'. Did you confirm that your source file is Unicode? And when you insert data and select it back out, does it look as if the character encoding is being preserved?

The documentation seems to indicate that using a newline character as the row terminator, '\n', will get automatically translated to '\r\n'. It says that only applies to bcp, but clearly something else is going on.

Example C at the bottom of that page says to use this dynamic SQL for Unix line endings:

DECLARE @bulk_cmd varchar(1000);
SET @bulk_cmd = 'BULK INSERT AdventureWorks2012.Sales.SalesOrderDetail
   FROM ''<drive>:\<path>\<filename>''
   WITH (ROWTERMINATOR = '''+CHAR(10)+''')';
EXEC(@bulk_cmd);

That makes it seem like it's a known issue.

If you're retrieving the file from an FTP/SFTP site, can you transfer the file in ASCII mode? Alternately, can you run the file through any of the numerous line-ending changers, such as unix2dos or todos?
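If none of those tools are installed, the same conversion can be sketched with sed (this assumes GNU sed, which understands \r in the replacement; the data.txt name is a placeholder):

```shell
# Create a sample UNIX-terminated (LF-only) file.
printf 'a,b,c,d\ne,f,g,h\n' > data.txt

# Append a carriage return to each line, turning LF endings into CRLF
# so that BULK INSERT's default ROWTERMINATOR of '\r\n' matches.
# (unix2dos data.txt does the same thing in place, if installed.)
sed 's/$/\r/' data.txt > data_dos.txt
```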

I know SSIS allows you to specify just newlines for row terminators, as does the Import/Export Wizard. If that's an option, you might look at that. You have to define your columns in the data files very precisely and it's very tedious for files with a lot of columns, but you generally get a ton more options like quoted field identifiers, etc.

And I have no idea what control character \l represents. It doesn't seem to be documented anywhere.