
Run a sqoop job on a specific queue


I think you have an error in your command:

-Dmapreduce.job.queuename=NameOfTheQueue

Note that queuename is one word, and the order matters: based on the documentation, the generic -D arguments need to go directly after the tool name (import).

https://sqoop.apache.org/docs/1.4.3/SqoopUserGuide.html#_using_generic_and_specific_arguments

Generic Hadoop command-line arguments (must precede any tool-specific arguments). Generic options supported include:

-conf <configuration file>    specify an application configuration file
-D <property=value>           use value for given property

sqoop job -Dmapreduce.job.queuename=shortduration \
    --create myjob \
    -- import \
    --connect jdbc:teradata://RCT/DATABASE=MYDB \
    --driver com.teradata.jdbc.TeraDriver \
    --username DBUSER -P \
    --query "$query" \
    --target-dir /data/source/dest/$i \
    --check-column DAT_CRN_AGG \
    --incremental append \
    --last-value 2001-01-01 \
    --split-by NUM_CTR
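If the job saves without errors, you can inspect and run it with the standard saved-job subcommands (a minimal sketch; the job name myjob comes from the example above):

# List saved jobs and show the parameters stored for myjob
sqoop job --list
sqoop job --show myjob

# Execute the saved job
sqoop job --exec myjob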

You might want to try it with the import tool first to see whether it works correctly, and then create the job, i.e.:

sqoop import -Dmapreduce.job.queuename=shortduration \
    --connect jdbc:teradata://RCT/DATABASE=MYDB \
    --driver com.teradata.jdbc.TeraDriver \
    --username DBUSER -P \
    --query "$query" \
    --target-dir /data/source/dest/$i \
    --check-column DAT_CRN_AGG \
    --incremental append \
    --last-value 2001-01-01 \
    --split-by NUM_CTR
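Once the import is running, you can confirm from the Hadoop side that it was actually submitted to the intended queue (a quick check, assuming a YARN cluster; shortduration is the queue name from the examples above):

# Show the configured queues and their state
mapred queue -list

# Running applications are listed along with the queue they were submitted to
yarn application -list -appStates RUNNING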