Checking the version of Databricks Runtime in Azure


In Scala:

dbutils.notebook.getContext.tags("sparkVersion")

In Python:

spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")

Either of these returns the Databricks Runtime and Scala version, e.g. 5.0.x-scala2.11.
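
For convenience, here is a minimal Python sketch that reads that tag and splits it into its runtime and Scala parts. It assumes the tag keeps the "<runtime>-scala<version>" format shown above; run it inside a Databricks Python notebook where spark is already defined.

# Minimal sketch: split the cluster tag into runtime and Scala versions.
# Assumes the "<runtime>-scala<version>" format, e.g. "5.0.x-scala2.11".
runtime_tag = spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")

dbr_version, scala_version = runtime_tag.split("-scala")
print("Databricks Runtime:", dbr_version)  # e.g. "5.0.x"
print("Scala version:", scala_version)     # e.g. "2.11"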


Databricks Runtime is the set of core components that run on the clusters managed by Azure Databricks. It includes Apache Spark but also adds a number of components and updates that substantially improve the usability, performance, and security of big data analytics.

You can choose from among many supported runtime versions when you create a cluster.
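
If you want to pin a specific runtime when creating a cluster programmatically rather than through the UI, the sketch below uses the Databricks Clusters REST API. The endpoint and field names are based on the Clusters API 2.0; the host, token, and cluster settings are placeholders you would replace with your own values.

# Hedged sketch: create a cluster pinned to a specific runtime via the
# Databricks Clusters REST API (API 2.0 endpoint and field names assumed).
import requests

host = "https://<your-workspace>.azuredatabricks.net"  # placeholder workspace URL
token = "<personal-access-token>"                       # placeholder access token

payload = {
    "cluster_name": "example-cluster",     # hypothetical cluster name
    "spark_version": "5.0.x-scala2.11",    # the runtime version to pin
    "node_type_id": "Standard_DS3_v2",     # example Azure VM size
    "num_workers": 2,
}

response = requests.post(
    host + "/api/2.0/clusters/create",
    headers={"Authorization": "Bearer " + token},
    json=payload,
)
response.raise_for_status()
print(response.json())  # contains the new cluster_id on success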


If you want to know the version of the Databricks Runtime in Azure after creation:

Go to the Azure Databricks portal => Clusters => Interactive Clusters => here you can find the runtime version.

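If you prefer to check this programmatically instead of through the portal, a sketch along the same lines lists clusters and their runtime versions via the Clusters REST API (the clusters/list endpoint and the spark_version field are assumptions based on the API 2.0 docs):

# Hedged sketch: list clusters and their runtime versions via the REST API.
import requests

host = "https://<your-workspace>.azuredatabricks.net"  # placeholder workspace URL
token = "<personal-access-token>"                       # placeholder access token

response = requests.get(
    host + "/api/2.0/clusters/list",
    headers={"Authorization": "Bearer " + token},
)
response.raise_for_status()

for cluster in response.json().get("clusters", []):
    # Each entry is expected to carry the cluster name and its runtime tag.
    print(cluster["cluster_name"], "->", cluster["spark_version"])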

For more details, refer to "Azure Databricks Runtime versions".

Hope this helps.


print(spark.version)

worked for me, but note that this returns the underlying Apache Spark version (e.g. 2.4.0), not the full Databricks Runtime version string.