How to loop through Azure Datalake Store files in Azure Databricks


The answer turned out to be simple, even though it took two days of searching:

files = dbutils.fs.ls('/mnt/dbfolder1/projects/clients')
for fi in files:
    print(fi.path)
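
Note that dbutils.fs.ls only lists the immediate children of a folder. If the client folders are nested, a small recursive generator can flatten the tree. A minimal sketch, assuming the same example mount path as above (deep_ls is a hypothetical helper name, not part of dbutils):

def deep_ls(path):
    """Recursively yield every file under path (deep_ls is a made-up name)."""
    for fi in dbutils.fs.ls(path):
        # dbutils.fs.ls marks directories with a trailing slash in the name
        if fi.name.endswith('/'):
            yield from deep_ls(fi.path)
        else:
            yield fi

for fi in deep_ls('/mnt/dbfolder1/projects/clients'):
    print(fi.path)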


The Scala version of the same (with an ADLS path):

val dirList = dbutils.fs.ls("abfss://<container>@<storage_account>.dfs.core.windows.net/<DIR_PATH>/")

// option 1
dirList.foreach(println)

// option 2
for (dir <- dirList) println(dir.name)
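
The abfss:// path only resolves if the cluster can authenticate against the storage account. One common setup is a service principal configured through Spark conf; the following is a hedged sketch following the documented ADLS Gen2 OAuth keys, where the secret scope and key names are placeholders rather than anything from the original post:

storage_account = "<storage_account>"

# OAuth via an Azure AD service principal; secrets come from a Databricks secret scope
spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net",
               dbutils.secrets.get(scope="<scope>", key="<client-id-key>"))
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net",
               dbutils.secrets.get(scope="<scope>", key="<client-secret-key>"))
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               "https://login.microsoftonline.com/<tenant_id>/oauth2/token")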


Another way, one that translates seamlessly to a local Python installation, is:

import os

os.listdir("/dbfs/mnt/projects/clients/")
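
Because /dbfs is a FUSE mount of DBFS on the driver, the standard library also gives you recursion for free. A minimal sketch with os.walk, using the same example path:

import os

# Walk the mounted folder tree; /dbfs exposes DBFS as a regular filesystem on the driver
for root, dirs, files in os.walk("/dbfs/mnt/projects/clients/"):
    for name in files:
        print(os.path.join(root, name))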