Showing posts from June, 2019

Databricks - accessing Blob storage using a secret

container = "raw"
storageAccount = "testarunacc"

# Spark config key for the storage account, and the account key fetched
# from a Databricks secret scope instead of being hard-coded
accountKey = "fs.azure.account.key.{}.blob.core.windows.net".format(storageAccount)
accessKey = dbutils.secrets.get(scope = "arunscope", key = "key1")

# Mount the drive for native python
inputSource = "wasbs://{}@{}.blob.core.windows.net".format(container, storageAccount)
mountPoint = "/mnt/" + container
extraConfig = {accountKey: accessKey}

print("Mounting: {}".format(mountPoint))
try:
  dbutils.fs.mount(
    source = inputSource,
    mount_point = str(mountPoint),
    extra_configs = extraConfig
  )
  print("=> Succeeded")
except Exception as e:
  if "Directory already mounted" in str(e):
    print("=> Directory {} already mounted".format(mountPoint))
  else:
    raise(e)

# Set the credentials to Spark configuration
spark.conf.set(accountKey, accessKey)
spark._jsc.hadoopConfiguration().set(accountKey, accessKey)
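The snippet above derives three strings from the container and storage account names: the Spark config key, the wasbs:// source URL, and the DBFS mount point. A minimal standalone sketch of just that string construction, runnable outside Databricks (the account and container names are the illustrative ones from the post, not anything you should reuse):

```python
# Illustrative names from the post; substitute your own account/container.
container = "raw"
storageAccount = "testarunacc"

# Spark/Hadoop config key that carries the storage account key
accountKey = "fs.azure.account.key.{}.blob.core.windows.net".format(storageAccount)

# wasbs:// URL for the container, and the DBFS mount point derived from it
inputSource = "wasbs://{}@{}.blob.core.windows.net".format(container, storageAccount)
mountPoint = "/mnt/" + container

print(accountKey)   # fs.azure.account.key.testarunacc.blob.core.windows.net
print(inputSource)  # wasbs://raw@testarunacc.blob.core.windows.net
print(mountPoint)   # /mnt/raw
```

Keeping all three derived from the same two variables means changing the container or account in one place updates the config key, the URL, and the mount point together.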

Azure - Accessing Blob storage from Databricks cluster using account key

container = "raw"
storageAccount = "arunacc"
accessKey = "<<>>"
accountKey = "fs.azure.account.key.{}.blob.core.windows.net".format(storageAccount)

# Set the credentials to Spark configuration
spark.conf.set(accountKey, accessKey)
spark._jsc.hadoopConfiguration().set(accountKey, accessKey)

# Mount the drive for native python
inputSource = "wasbs://{}@{}.blob.core.windows.net".format(container, storageAccount)
mountPoint = "/mnt/" + container
extraConfig = {accountKey: accessKey}

print("Mounting: {}".format(mountPoint))
try:
  dbutils.fs.mount(
    source = inputSource,
    mount_point = str(mountPoint),
    extra_configs = extraConfig
  )
  print("=> Succeeded")
except Exception as e:
  if "Directory already mounted" in str(e):
    print("=> Directory {} already mounted".format(mountPoint))
  else:
    raise(e)
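The try/except in both posts makes the mount idempotent: a "Directory already mounted" error is reported and swallowed, while any other error is re-raised. A sketch of that pattern with `dbutils.fs.mount` replaced by a stand-in (so it runs anywhere; the stand-in only mimics the already-mounted error message the code checks for):

```python
# Stand-in state tracking what has been "mounted" in this sketch
mounted = set()

def mount(mount_point):
    # Stand-in for dbutils.fs.mount: mimics the error raised on a re-mount
    if mount_point in mounted:
        raise Exception("Directory already mounted: {}".format(mount_point))
    mounted.add(mount_point)

def safe_mount(mount_point):
    # The idempotent-mount pattern from the post: swallow only the
    # already-mounted error, re-raise everything else
    try:
        mount(mount_point)
        return "=> Succeeded"
    except Exception as e:
        if "Directory already mounted" in str(e):
            return "=> Directory {} already mounted".format(mount_point)
        raise

print(safe_mount("/mnt/raw"))  # => Succeeded
print(safe_mount("/mnt/raw"))  # => Directory /mnt/raw already mounted
```

Matching on the error message text is what the original notebook does; it keeps re-running the cell safe without unmounting first.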