Writing log with python logging module in databricks to azure datalake not working
You can use the azure_storage_logging handler:

```python
import logging
from azure_storage_logging.handlers import BlobStorageRotatingFileHandler

log = logging.getLogger('service_logger')
azure_blob_handler = BlobStorageRotatingFileHandler(
    filename, account_name, account_key, maxBytes, container)
log.addHandler(azure_blob_handler)
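If you want to verify the handler wiring before pointing it at blob storage, the same pattern works with the standard library's RotatingFileHandler as a local stand-in for BlobStorageRotatingFileHandler (the logger name and log messages below are just examples):

import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

# Stand-in for BlobStorageRotatingFileHandler: identical logging API,
# but it rotates files on local disk instead of uploading to blob storage.
log_dir = tempfile.mkdtemp()
log_path = os.path.join(log_dir, "service.log")

log = logging.getLogger("service_logger")
log.setLevel(logging.INFO)
handler = RotatingFileHandler(log_path, maxBytes=1024 * 1024, backupCount=3)
handler.setFormatter(logging.Formatter("%(asctime)s %(levelname)s %(message)s"))
log.addHandler(handler)

log.info("pipeline started")
log.info("processed 42 rows")
handler.flush()

with open(log_path) as f:
    contents = f.read()
```

Once this works locally, swapping in BlobStorageRotatingFileHandler only changes where the rotated files end up.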
Let me explain the steps for accessing or performing write operations on Azure Data Lake Storage using Python:
1) Register an application in Azure AD.
2) In Data Lake, grant permissions to the application you registered.
3) Get the client secret for that application from Azure AD.
4) Write code to mount the Data Lake directory, like below:
```python
dbutils.fs.mkdirs("/mnt/mountdatalake")

configs = {
    "dfs.adls.oauth2.access.token.provider.type": "ClientCredential",
    "dfs.adls.oauth2.client.id": "Registered_Client_Id_From_Azure_Portal",
    "dfs.adls.oauth2.credential": "Client_Secret_Obtained_From_Azure_Portal",
    "dfs.adls.oauth2.refresh.url": "https://login.microsoftonline.com/Your_Directory_ID/oauth2/token"
}

dbutils.fs.mount(
    source="adl://mydata.azuredatalakestore.net/mountdatabricks",
    mount_point="/mnt/mountdatalake",
    extra_configs=configs)
```
Once the configuration/mounting is done using the application's client credentials, you can access the directory and write logs to it.
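As a sketch of that last step: on Databricks the mount is also exposed to plain Python file APIs under /dbfs (e.g. /dbfs/mnt/mountdatalake), so a standard FileHandler can write straight into it. Below, a temporary directory stands in for the mount path so the snippet runs outside Databricks; the logger name and message are assumptions:

```python
import logging
import os
import tempfile

# On Databricks the mount is visible to local file APIs under /dbfs,
# e.g. /dbfs/mnt/mountdatalake. A temp dir stands in for it here.
mount_root = tempfile.mkdtemp()  # would be "/dbfs/mnt/mountdatalake"
log_file = os.path.join(mount_root, "application.log")

logger = logging.getLogger("datalake_logger")
logger.setLevel(logging.DEBUG)
fh = logging.FileHandler(log_file)
fh.setFormatter(logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s"))
logger.addHandler(fh)

logger.warning("low disk space on worker")
fh.flush()
```

On a real cluster the only change is replacing the temp dir with the /dbfs path of your mount.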
For example, below I have extracted a couple of records from SQL Server and stored them in Azure Data Lake.
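The original example showed that flow as a screenshot. A minimal stand-in of the same extract-and-store pattern is sketched below, with sqlite3 standing in for SQL Server and a temp dir for the mount path (on Databricks you would typically read via spark.read.jdbc and write with df.write instead; the table and column names are made up):

```python
import csv
import os
import sqlite3
import tempfile

# In-memory sqlite stands in for SQL Server; the table is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Alice"), (2, "Bob")])
rows = conn.execute("SELECT id, name FROM customers ORDER BY id").fetchall()

# A temp dir stands in for the mounted path "/dbfs/mnt/mountdatalake".
out_dir = tempfile.mkdtemp()
out_path = os.path.join(out_dir, "customers.csv")
with open(out_path, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["id", "name"])
    writer.writerows(rows)
```

The shape is the same either way: pull rows from the database, then write them under the mounted Data Lake directory.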
Hope this helps.