Accessing Databricks storage from AKS

Hi,

I’m looking to migrate a Databricks-hosted Shiny app to AKS. Is there any documentation on mounting Databricks `abfss` storage in AKS via the pod patches feature?

https://www.shinyproxy.io/documentation/configuration/#kubernetes-pod-patches

Thanks

Hi, it seems that `abfss` refers to the Azure Blob File System driver, which uses Azure Blob Storage under the hood, and that has good integration with AKS. As a starting point, see https://learn.microsoft.com/en-us/azure/aks/azure-blob-csi?tabs=NFS . You’ll probably have to re-use an existing container on your storage account; I did not find a good example in the Azure docs for this, but this blog explains it: https://medium.com/code-it-yourself/aks-persistent-volume-with-existing-storage-account-d2ec6aabe65e
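As a rough sketch of the static-provisioning part, a PersistentVolume/PersistentVolumeClaim pair pointing at an existing blob container could look like the following. Note that `mystorageaccount`, `mycontainer`, `azure-storage-secret` and the sizes are placeholders you'd replace with your own values, following the AKS blob CSI docs:

```yaml
apiVersion: v1
kind: PersistentVolume
metadata:
  name: pv-blob
spec:
  capacity:
    storage: 10Gi
  accessModes:
    - ReadWriteMany
  persistentVolumeReclaimPolicy: Retain
  storageClassName: azureblob-fuse-premium
  csi:
    driver: blob.csi.azure.com
    # must be unique across the cluster; account_container is a common convention
    volumeHandle: mystorageaccount_mycontainer
    volumeAttributes:
      containerName: mycontainer  # the existing blob container
    nodeStageSecretRef:
      # secret holding azurestorageaccountname / azurestorageaccountkey
      name: azure-storage-secret
      namespace: default
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: pvc-blob
spec:
  accessModes:
    - ReadWriteMany
  storageClassName: azureblob-fuse-premium
  volumeName: pv-blob  # bind to the static PV above
  resources:
    requests:
      storage: 10Gi
```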

You can convert this example from the blog:

kind: Pod
apiVersion: v1
metadata:
  name: mypod
spec:
  containers:
  - name: mypod
    image: mcr.microsoft.com/oss/nginx/nginx:1.15.5-alpine
    volumeMounts:
    - mountPath: "/mnt/azure"
      name: volume
  volumes:
    - name: volume
      persistentVolumeClaim:
        claimName: pvc-blob

to a ShinyProxy app:

id: my-app
container-image: openanalytics/shinyproxy-image
kubernetes-pod-patches: |
  - op: add
    path: /spec/volumes
    value:
      - name: volume
        persistentVolumeClaim:
          claimName: pvc-blob
  - op: add
    path: /spec/containers/0/volumeMounts
    value:
      - mountPath: /mnt/azure
        name: volume
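For context, in a full ShinyProxy `application.yml` this snippet sits under `proxy.specs` (the surrounding keys here are the standard ShinyProxy ones; the id and image are just the values from the example above):

```yaml
proxy:
  container-backend: kubernetes
  specs:
    - id: my-app
      container-image: openanalytics/shinyproxy-image
      kubernetes-pod-patches: |
        - op: add
          path: /spec/volumes
          value:
            - name: volume
              persistentVolumeClaim:
                claimName: pvc-blob
        - op: add
          path: /spec/containers/0/volumeMounts
          value:
            - mountPath: /mnt/azure
              name: volume
```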

I guess it could make sense to first get it working with a regular pod and then try to integrate it with ShinyProxy.
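For that first test with a regular pod, something along these lines should do (assuming the pod manifest above is saved as `mypod.yaml`, the PVC `pvc-blob` exists, and you have `kubectl` access to the cluster):

```shell
# create the test pod that mounts the claim
kubectl apply -f mypod.yaml

# wait until it is running, then check that the blob container is visible
kubectl wait --for=condition=Ready pod/mypod --timeout=120s
kubectl exec mypod -- ls /mnt/azure

# clean up afterwards
kubectl delete pod mypod
```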