
Create Spark DataFrame from Pandas DataFrame


Import and initialise findspark, create a SparkSession, then use it to convert the pandas DataFrame to a Spark DataFrame. Finally, register the new Spark DataFrame as a temporary view in the catalogue. Tested and runs in both Jupyter 5.7.2 and Spyder 3.3.2 with Python 3.6.6.

import findspark
findspark.init()
import pyspark
from pyspark.sql import SparkSession
import pandas as pd

# Create a spark session
spark = SparkSession.builder.getOrCreate()

# Create pandas data frame and convert it to a spark data frame
pandas_df = pd.DataFrame({"Letters": ["X", "Y", "Z"]})
spark_df = spark.createDataFrame(pandas_df)

# Add the spark data frame to the catalog
spark_df.createOrReplaceTempView('spark_df')

spark_df.show()
+-------+
|Letters|
+-------+
|      X|
|      Y|
|      Z|
+-------+

spark.catalog.listTables()
Out[18]: [Table(name='spark_df', database=None, description=None, tableType='TEMPORARY', isTemporary=True)]
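
Because the temporary view is registered in the catalogue, you can also query it with Spark SQL. A minimal follow-up sketch, assuming the same session and the spark_df view created above:

# Query the registered temporary view with Spark SQL
result_df = spark.sql("SELECT Letters FROM spark_df WHERE Letters != 'Y'")
result_df.show()
+-------+
|Letters|
+-------+
|      X|
|      Z|
+-------+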