
Hadoop 2.2.0 fails running start-dfs.sh with Error: JAVA_HOME is not set and could not be found


You can edit the hadoop-env.sh file and set JAVA_HOME for Hadoop.

Open the file and find the line below:

export JAVA_HOME=/usr/lib/j2sdk1.6-sun

Uncomment the line and update JAVA_HOME to match your environment.

This will solve the JAVA_HOME problem.
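For example, on a typical Ubuntu install with OpenJDK 7 (the path below is only an illustration and depends on your system), the edited line might look like:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64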


This is a weird out-of-the-box bug on Ubuntu. The current line

export JAVA_HOME=${JAVA_HOME}

in /etc/hadoop/hadoop-env.sh should pick up JAVA_HOME from the host environment, but it doesn't.

Just edit the file and hard-code JAVA_HOME for now.
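If you are not sure where Java is installed, one quick way to find out (assuming which and readlink are available) is:

readlink -f $(which java)

This prints something like /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java (the exact path will differ on your machine); strip the trailing /jre/bin/java and hard-code the rest in hadoop-env.sh:

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64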


Alternatively, you can edit /etc/environment to include:

JAVA_HOME=/usr/lib/jvm/[YOURJAVADIRECTORY]

This makes JAVA_HOME available to all users on the system and allows start-dfs.sh to see the value. My guess is that start-dfs.sh kicks off a process as another user somewhere that does not pick up the variable unless it is explicitly set in hadoop-env.sh.

Using hadoop-env.sh is arguably clearer -- just adding this option for completeness.
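As a sketch (the OpenJDK 7 path below is only an assumption; substitute your own JVM directory), /etc/environment would then contain a line such as:

JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

Note that /etc/environment is read at login, so log out and back in before expecting the new value to be visible to your session.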