How to create a task application in Spring Data Flow Server and mount a remote directory using Kubernetes?

How to create a task application in Spring Data Flow Server and mount a remote directory using Kubernetes?


Looks like the actual issue in your case is the name of the volumes you specified in the configuration properties. Since K8s doesn't allow uppercase letters in names (see here), you need to use lowercase values for the volume names (currently they are accessFilesDir and processedFilesDir).
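As a quick illustration (this volume snippet is only a sketch; the claim name apache-volume-claim is taken from your properties), Kubernetes validates volume names as lowercase DNS labels, so the first entry below is accepted while the commented-out camel-cased one would be rejected:

volumes:
  - name: accessfilesdir          # valid: lowercase DNS label
    persistentVolumeClaim:
      claimName: apache-volume-claim
  # - name: accessFilesDir        # invalid: uppercase letters are rejected by K8s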

I tried similar settings on Minikube (without the NFS mounting) just to see whether the task launch passes the volume and volume-mount K8s deployer properties through, and they seem to work fine:

dataflow:>task create a1 --definition "timestamp"

dataflow:>task launch a1 --properties "deployer.timestamp.kubernetes.volumes=[{name: accessfilesdir, persistentVolumeClaim: { claimName: 'apache-volume-claim' }},{name: processedfilesdir, persistentVolumeClaim: { claimName: 'apache-volume-claim' }}],deployer.timestamp.kubernetes.volumeMounts=[{name: 'accessfilesdir', mountPath: '/data/apache/access'},{name: 'processedfilesdir', mountPath: '/data/apache/processed'}]"
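Note that the launch above assumes a claim named apache-volume-claim already exists in the namespace. A minimal sketch of such a claim might look like the following (the access mode and storage size are assumptions, not taken from the original setup):

apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: apache-volume-claim        # referenced by the deployer.*.kubernetes.volumes property
spec:
  accessModes:
    - ReadWriteMany                # assumption: both mounts may need to write
  resources:
    requests:
      storage: 1Gi                 # assumption: adjust to the actual data size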

and this resulted in the following configuration when I describe the pod (kubectl describe pod <pod-name>) of the launched task:

Volumes:
  accessfilesdir:
    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  apache-volume-claim
    ReadOnly:   false
  processedfilesdir:
    Type:       PersistentVolumeClaim (a reference to a PersistentVolumeClaim in the same namespace)
    ClaimName:  apache-volume-claim
    ReadOnly:   false
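Since your original question is about mounting a remote directory, the same claim can be bound to an NFS-backed PersistentVolume. This is only a sketch under assumed values (the PV name, server address, export path, and capacity are placeholders, not from your setup):

apiVersion: v1
kind: PersistentVolume
metadata:
  name: apache-volume             # hypothetical PV name
spec:
  capacity:
    storage: 1Gi                  # assumption: match or exceed the claim's request
  accessModes:
    - ReadWriteMany
  nfs:
    server: 10.0.0.10             # placeholder: NFS server address
    path: /exports/apache         # placeholder: exported remote directory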