How to write pandas dataframe into Databricks dbfs/FileStore?


Try this in your Databricks notebook:

import pandas as pd
from io import StringIO

data = """CODE,L,PS
5d8A,N,P60490
5d8b,H,P80377
5d8C,O,P60491"""

df = pd.read_csv(StringIO(data), sep=',')
# print(df)
df.to_csv('/dbfs/FileStore/NJ/file1.txt')
pandas_df = pd.read_csv("/dbfs/FileStore/NJ/file1.txt", header='infer')
print(pandas_df)
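One pitfall worth noting: `to_csv` does not create missing parent directories, so the write fails if the `NJ` folder does not yet exist under `/dbfs/FileStore`. A minimal sketch of creating the folder first (shown with a local `/tmp` path for illustration; on Databricks the same pattern works with `/dbfs/FileStore/NJ/file1.txt`):

```python
import os
import pandas as pd

def save_csv(df, path):
    # to_csv will not create missing parent folders, so create them first
    os.makedirs(os.path.dirname(path), exist_ok=True)
    df.to_csv(path, index=False)

# Illustration with a local path; on Databricks use '/dbfs/FileStore/NJ/file1.txt'
df = pd.DataFrame({"CODE": ["5d8A", "5d8b"], "L": ["N", "H"], "PS": ["P60490", "P80377"]})
save_csv(df, "/tmp/FileStore/NJ/file1.txt")
print(pd.read_csv("/tmp/FileStore/NJ/file1.txt"))
```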


This worked out for me:

outname = 'pre-processed.csv'
outdir = '/dbfs/FileStore/'
dfPandas.to_csv(outdir + outname, index=False, encoding="utf-8")

To download the file, add files/filename to your notebook URL, just before the question mark (?):

https://community.cloud.databricks.com/files/pre-processed.csv?o=189989883924552#

(Adapt this to your own workspace home URL; for me it is:

https://community.cloud.databricks.com/?o=189989883924552#)
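The URL edit described above can be sketched in Python. `download_url` is a hypothetical helper name, and the `o=` workspace id is taken from your own home URL:

```python
from urllib.parse import urlsplit, urlunsplit

def download_url(home_url, filename):
    # Insert 'files/<filename>' into the path of the workspace home URL,
    # keeping the existing query string (the o=<workspace id> part)
    parts = urlsplit(home_url)
    path = parts.path.rstrip('/') + '/files/' + filename
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

print(download_url("https://community.cloud.databricks.com/?o=189989883924552#",
                   "pre-processed.csv"))
```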

You can also browse and download files with a DBFS file explorer.