ModuleNotFoundError: No module named 'py4j'

ModuleNotFoundError: No module named 'py4j'


If you can run Spark directly, you may need to fix the PYTHONPATH environment variable. Check the filename in the directory $SPARK_HOME/python/lib/. If your Spark version is 2.4.3, the file is py4j-0.10.7-src.zip:

export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.10.7-src.zip:$PYTHONPATH
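If you would rather handle this from inside Python than in the shell, here is a minimal sketch; it assumes SPARK_HOME is already set and uses a glob so the py4j version in the filename does not have to be hard-coded:

import glob
import os
import sys

# Assumes SPARK_HOME points at the Spark installation directory
spark_home = os.environ["SPARK_HOME"]

# Put PySpark itself and the bundled py4j source zip on the module search path;
# the glob avoids hard-coding the version (e.g. py4j-0.10.7-src.zip for Spark 2.4.3)
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.insert(0, glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip"))[0])

from pyspark import SparkContext  # should now import without ModuleNotFoundError

The findspark package (pip install findspark, then findspark.init()) does essentially the same path setup for you, if you prefer not to manage it by hand.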