override log4j.properties in hadoop



Modify the log4j file inside HADOOP_CONF_DIR. Note that a Hadoop job won't use the log4j file bundled with your application; it uses the one inside HADOOP_CONF_DIR.

If you want to force Hadoop to use some other log4j file, try one of these:

  1. You can try what @Patrice suggested, i.e.:

    -Dlog4j.configuration=file:/path/to/user_specific/log4j.xml

  2. Customize HADOOP_CONF_DIR/log4j.xml and set the logger level for "your" classes as you wish. Other users won't be affected unless they have classes with the same package structure. This won't work for core Hadoop classes, because all users would be affected.

  3. Create your own customized log4j file. Replicate the HADOOP_CONF_DIR directory, put your log4j file inside it, and export HADOOP_CONF_DIR to point to your conf directory. Other users will keep pointing to the default one.
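Option 3 above can be sketched as follows. The paths here are stand-ins: the script uses scratch directories so it runs anywhere, but in practice CLUSTER_CONF would be your real conf directory (e.g. /etc/hadoop/conf).

```shell
# Stand-in for the cluster's HADOOP_CONF_DIR (use your real path in practice).
CLUSTER_CONF=$(mktemp -d)
touch "$CLUSTER_CONF/core-site.xml" "$CLUSTER_CONF/log4j.properties"

# Replicate the conf directory somewhere under your control.
MY_CONF=$(mktemp -d)/my-hadoop-conf
cp -r "$CLUSTER_CONF" "$MY_CONF"

# Swap in your own log4j settings (a minimal example rootLogger line).
printf 'log4j.rootLogger=DEBUG,console\n' > "$MY_CONF/log4j.properties"

# Only this shell (and jobs launched from it) see the override;
# other users still resolve the default HADOOP_CONF_DIR.
export HADOOP_CONF_DIR="$MY_CONF"
echo "$HADOOP_CONF_DIR"
```

From this shell, `hadoop jar ...` invocations would now pick up the replicated conf directory, including your log4j file.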


If you use the default log4j.properties file, the logging settings are overridden by environment variables from the startup script. If you want to keep the default log4j and simply change the logging level, use $HADOOP_CONF_DIR/hadoop-env.sh.

For example, to set the root logger to the DEBUG level with the DRFA (daily rolling file) appender, use

export HADOOP_ROOT_LOGGER="DEBUG,DRFA"


  1. You could remove the log4j.properties from your Hadoop jar
  2. OR make sure your jar's log4j.properties is first on the classpath (log4j picks the first log4j.properties it finds on the classpath)
  3. OR specify the system property: -Dlog4j.configuration=PATH_TO_FILE
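For option 3, one common way to pass the system property through is via HADOOP_OPTS, which Hadoop's launcher scripts forward to the JVM. The jar name, class name, and file path below are hypothetical placeholders.

```shell
# Point log4j at an explicit config file instead of letting it scan the
# classpath; the file: prefix and path are examples, not real locations.
export HADOOP_OPTS="${HADOOP_OPTS:-} -Dlog4j.configuration=file:/path/to/log4j.properties"

# A job launched now would inherit the property, e.g.:
#   hadoop jar myjob.jar com.example.MyJob    (hypothetical jar and class)
echo "$HADOOP_OPTS"
```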

See the log4j documentation to learn how it locates its configuration.