Spark driver doesn't crash on exception
Well, that was an easy one.
I had to catch all exceptions to ensure that the Spark context is closed no matter what:
```scala
def main(args: Array[String]): Unit = {
  // some code
  implicit val sparkSession = SparkSession.builder().getOrCreate
  try {
    // application code with potential exceptions
  } catch {
    case exception: Exception =>
      sparkSession.close()
      throw exception
  }
  sparkSession.close()
}
```
That way all resources are freed and the driver pod transitions to the Error state, as expected.
EDIT - or in the Scala fashion:
```scala
import scala.util.{Failure, Success, Try}

def main(args: Array[String]): Unit = {
  // some code
  implicit val sparkSession = SparkSession.builder().getOrCreate
  Try {
    // application code with potential exceptions
  } match {
    case Success(_) => ()
    case Failure(exception) =>
      sparkSession.close()
      throw exception
  }
  sparkSession.close()
}
```
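Since `SparkSession` implements `Closeable`, the two `close()` call sites could also be collapsed into one with `scala.util.Using` (Scala 2.13+) — a minimal sketch, not the exact code from this post, assuming the same session setup as above:

```scala
import scala.util.Using

def main(args: Array[String]): Unit = {
  // some code
  implicit val sparkSession = SparkSession.builder().getOrCreate

  // Using.resource closes the session whether the block succeeds or throws,
  // and rethrows the original exception after close(), so the driver pod
  // still ends up in the Error state on failure
  Using.resource(sparkSession) { spark =>
    // application code with potential exceptions
  }
}
```

The trade-off is that `Using.resource` suppresses nothing silently: the exception that killed the block is the one that propagates, which is exactly the behavior the manual `catch`/`match` versions implement by hand.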