Unable to create SparkApplications on Kubernetes cluster using SparkKubernetesOperator from Airflow DAG



Here are the Kubernetes cluster role resources. Apply them with kubectl -n <namespace> apply -f <filename.yaml>

# Role for spark-on-k8s-operator to create resources on cluster
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: spark-cluster-cr
  labels:
    rbac.authorization.kubeflow.org/aggregate-to-kubeflow-edit: "true"
rules:
  - apiGroups:
      - sparkoperator.k8s.io
    resources:
      - sparkapplications
    verbs:
      - '*'
---
# Allow airflow-worker service account access for spark-on-k8s
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: airflow-spark-crb
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: spark-cluster-cr
subjects:
  - kind: ServiceAccount
    name: airflow-cluster
    namespace: airflow
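Before re-running the DAG, you can confirm the binding took effect by impersonating the service account with kubectl auth can-i (the account and namespace names below are the ones used in the manifests above):

```shell
# Check whether the Airflow service account may now create SparkApplications
# in the spark-apps namespace. Requires access to a live cluster.
kubectl auth can-i create sparkapplications.sparkoperator.k8s.io \
  --as=system:serviceaccount:airflow:airflow-cluster \
  -n spark-apps
```

Once the ClusterRoleBinding is in place, this should print "yes"; before that, it prints "no", matching the forbidden error.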

Notes:

  • The above is assuming the error message
    sparkapplications.sparkoperator.k8s.io is forbidden: User "system:serviceaccount:airflow:airflow-cluster" cannot create resource "sparkapplications" in API group "sparkoperator.k8s.io" in the namespace "spark-apps"
    • Airflow namespace: airflow
    • Airflow serviceaccount: airflow-cluster
    • Spark jobs namespace: spark-apps
  • You should also have spark-on-k8s-operator installed
    • Installed with helm --set webhook.enable=true if you want to use env in your spec.driver
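For reference, the Airflow side of the setup can be sketched as below. This is a minimal illustration, assuming the apache-airflow-providers-cncf-kubernetes provider is installed; the DAG id, application file name, and connection id are hypothetical placeholders, not names from the question.

```python
# Minimal sketch of a DAG that submits a SparkApplication via the
# spark-on-k8s-operator. Assumes apache-airflow-providers-cncf-kubernetes
# is installed; all names here are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.spark_kubernetes import (
    SparkKubernetesOperator,
)

with DAG(
    dag_id="spark_pi_example",          # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule=None,                      # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Creates a SparkApplication custom resource in the spark-apps namespace.
    # The worker runs as the airflow-cluster service account, which the
    # ClusterRoleBinding above authorizes to create sparkapplications.
    submit = SparkKubernetesOperator(
        task_id="submit_spark_pi",
        namespace="spark-apps",
        application_file="spark-pi.yaml",  # hypothetical SparkApplication manifest
        kubernetes_conn_id="kubernetes_default",
    )
```

Without the RBAC manifests applied, this task fails with exactly the forbidden error quoted above.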


Thanks for your suggestion. My issue was resolved after granting the appropriate permissions to the service account in the airflow namespace.