Spark: Create a temporary table by executing a SQL query on temporary tables
You need to save your results as a temp table:

tableQuery.createOrReplaceTempView("dbtable")
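A minimal sketch of the round trip, assuming a `SparkSession` named `spark` and an existing DataFrame `df` (both names are placeholders, not from the original answer):

```scala
// Register the DataFrame as a temp view so SQL can reference it
df.createOrReplaceTempView("dbtable")

// A SQL query returns a DataFrame, so its result can itself be
// registered as another temp view and queried again
val filtered = spark.sql("SELECT * FROM dbtable WHERE id > 10")
filtered.createOrReplaceTempView("filtered")
```

Temp views are session-scoped: they disappear when the SparkSession ends.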
For permanent storage in an external table, you can use JDBC:
val prop = new java.util.Properties
prop.setProperty("driver", "com.mysql.jdbc.Driver")
prop.setProperty("user", "vaquar")
prop.setProperty("password", "khan")

// jdbc mysql url - destination database is named "temp"
val url = "jdbc:mysql://localhost:3306/temp"

// destination database table
val dbtable = "sample_data_table"

// write data from spark dataframe to database
df.write.mode("append").jdbc(url, dbtable, prop)
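To verify the write, you can read the same table back into a DataFrame with the matching `read.jdbc` call (a sketch reusing the `url`, `dbtable`, and `prop` values defined above):

```scala
// Read the MySQL table back into a Spark DataFrame over JDBC
val readBack = spark.read.jdbc(url, dbtable, prop)

// Inspect the persisted rows
readBack.show()
```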
https://docs.databricks.com/spark/latest/data-sources/sql-databases.html
http://spark.apache.org/docs/latest/sql-programming-guide.html#saving-to-persistent-tables
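Per the "saving to persistent tables" section linked above, if a metastore (e.g. Hive) is configured you can also persist a DataFrame as a managed table without JDBC; a sketch, assuming a DataFrame `df`:

```scala
// saveAsTable writes the data into the metastore-managed warehouse;
// unlike a temp view, the table survives after the session ends
df.write.mode("overwrite").saveAsTable("sample_data_table")
```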
sqlContext.read.json(file_name_A).createOrReplaceTempView("A")
sqlContext.read.json(file_name_B).createOrReplaceTempView("B")

val tableQuery = "SELECT A.id, B.name FROM A INNER JOIN B ON A.id = B.fk_id"
sqlContext.sql(tableQuery).createOrReplaceTempView("C")
Try the above code; it should work.