
Spark Error - Unsupported class file major version


Edit: Spark 3.0 supports Java 11, so you'll need to upgrade Spark if you want to stay on Java 11

Spark runs on Java 8/11, Scala 2.12, Python 2.7+/3.4+ and R 3.1+. Java 8 prior to version 8u92 support is deprecated as of Spark 3.0.0
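If you installed Spark through pip, a minimal sketch of upgrading to a 3.x release (this assumes a pip-managed pyspark rather than a standalone distribution):

$ pip install 'pyspark>=3.0.0'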



Original answer

Until Spark supports Java 11 or higher (which will hopefully be mentioned in the latest documentation when it does), you have to add a flag to set your Java version to Java 8.

As of Spark 2.4.x:

Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.4 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x)

On Mac/Unix, see asdf-java for installing different Java versions.
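A sketch of the asdf workflow (the version string below is a placeholder; list the real ones with asdf list all java):

$ asdf plugin add java
$ asdf install java adoptopenjdk-8.0.292+10
$ asdf global java adoptopenjdk-8.0.292+10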

On a Mac, I can set this in my .bashrc:

export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)

On Windows, check out Chocolatey, but seriously, just use WSL2 or Docker to run Spark.


You can also set this in spark-env.sh rather than setting the variable for your whole profile; a sketch follows below.
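For example, a minimal sketch of conf/spark-env.sh inside your Spark installation (if the file doesn't exist yet, copy it from conf/spark-env.sh.template; swap in your Linux JDK path if you're not on a Mac):

# conf/spark-env.sh, sourced by Spark's launch scripts
export JAVA_HOME=$(/usr/libexec/java_home -v 1.8)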

And, of course, this all means you'll need to install Java 8 in addition to your existing Java 11.


I ran into this issue when running Jupyter Notebook and Spark with Java 11. I installed and configured Java 8 using the following steps.

Install Java 8:

$ sudo apt install openjdk-8-jdk

Since I had already installed Java 11, I then set my default Java to version 8 using:

$ sudo update-alternatives --config java
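(This opens an interactive menu. If you'd rather skip it, update-alternatives also accepts --set with an explicit path; the path below is the usual one for Ubuntu's openjdk-8 package, so verify it on your machine:)

$ sudo update-alternatives --set java /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java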

Select Java 8 from the menu (or use --set as shown above), then confirm your changes:

$ java -version

Output should be similar to:

openjdk version "1.8.0_191"
OpenJDK Runtime Environment (build 1.8.0_191-8u191-b12-2ubuntu0.18.04.1-b12)
OpenJDK 64-Bit Server VM (build 25.191-b12, mixed mode)

I'm now able to run Spark successfully in Jupyter Notebook. The steps above were based on the following guide: https://www.digitalocean.com/community/tutorials/how-to-install-java-with-apt-on-ubuntu-18-04
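Alternatively, if you'd rather not change the system-wide default, you can export JAVA_HOME only in the shell that launches Jupyter. A minimal sketch, assuming the standard Ubuntu install path for openjdk-8:

$ export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
$ jupyter notebook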


I found that the easiest solution was to set the Spark location through findspark and Java 8 through os.environ at the beginning of the script:

import findspark
import os

spark_location = '/opt/spark-2.4.3/'  # Set your own
java8_location = '/usr/lib/jvm/java-8-openjdk-amd64'  # Set your own

os.environ['JAVA_HOME'] = java8_location
findspark.init(spark_home=spark_location)
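Once findspark.init() has run, pyspark imports normally. A minimal sketch of starting a local session afterwards (the app name is just an illustration):

from pyspark.sql import SparkSession

# 'local[*]' runs Spark in-process using all available cores
spark = SparkSession.builder.master('local[*]').appName('example').getOrCreate()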