How to use sqoop to export the default hive delimited output?


I've found the correct solution for handling that special character in bash:

#!/bin/bash
# ... your script
hive_char=$( printf "\x01" )
sqoop export --connect jdbc:mysql://mysqlm/site --username site --password site --table x_data --export-dir /x --input-fields-terminated-by "${hive_char}" --lines-terminated-by '\n'

The problem was getting the separator recognized correctly (it had nothing to do with types or the schema), and that is what hive_char achieves.

Another way to enter this special character on the Linux command line is to type Ctrl+V followed by Ctrl+A, which inserts a literal ^A (\x01) character.
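
If you want to confirm which delimiter the files in the export directory actually use before running the export, you can dump the first line in character form. This is a minimal sketch; it assumes the Hive output files sit directly under /x on HDFS:

# Hive's default field delimiter shows up as \001 (^A) in the od output
hdfs dfs -cat /x/* | head -n 1 | od -c | head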


Using

--input-fields-terminated-by '\001' --lines-terminated-by '\n'

as flags in the sqoop export command seems to do the trick for me.

So, in your example, the full command would be:

sqoop export --connect jdbc:mysql://mysqlm/site --username site --password site --table x_data --export-dir /x  --input-fields-terminated-by '\001' --lines-terminated-by '\n'
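
If the Hive files also contain \N for NULL values (Hive's default), you may want to add the null-handling flags as well. A sketch based on the same command, assuming the default \N null representation:

sqoop export --connect jdbc:mysql://mysqlm/site --username site --password site \
  --table x_data --export-dir /x \
  --input-fields-terminated-by '\001' --lines-terminated-by '\n' \
  --input-null-string '\\N' --input-null-non-string '\\N'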


I think it's a datatype mismatch with your RDBMS schema.

Try to find the column that the "9-2" value maps to and check its datatype in the RDBMS schema.

If it's int or numeric, Sqoop will try to parse the value before inserting it, and "9-2" is not a numeric value.
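
A quick way to check is to describe the target table on the MySQL side. A minimal sketch, reusing the connection details from the question (you will be prompted for the password):

# Show the column types of the target table
mysql -h mysqlm -u site -p -D site -e "DESCRIBE x_data;"
# If the column that should hold "9-2" is INT/NUMERIC, that value cannot be
# parsed; it would need a VARCHAR column (or the data cleaned up first).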

Let me know if this doesn't work.