Spark / SPARK-24857

Need sample code to test a Spark Streaming job on Kubernetes and write data to a remote HDFS file system


    Details

    • Type: Bug
    • Status: Resolved
    • Priority: Major
    • Resolution: Invalid
    • Affects Version/s: 2.3.1
    • Fix Version/s: None
    • Component/s: Kubernetes, Spark Submit
    • Labels: None

      Description

      ./bin/spark-submit --master k8s://https://api.kubernates.aws.phenom.local --deploy-mode cluster --name spark-pi --class com.phenom.analytics.executor.SummarizationJobExecutor --conf spark.executor.instances=5 --conf spark.kubernetes.container.image=phenommurali/spark_new --jars hdfs://test-dev.com:8020/user/spark/jobs/Test_jar_without_jars.jar

      Error:

      Normal   SuccessfulMountVolume  2m  kubelet, ip-xxxxx.ec2.internal  MountVolume.SetUp succeeded for volume "download-files-volume"
      Warning  FailedMount            2m  kubelet, ip-xxxx.ec2.internal   MountVolume.SetUp failed for volume "spark-init-properties" : configmaps "spark-pi-b5be4308783c3c479c6bf2f9da9b49dc-init-config" not found
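
      For reference, a minimal sketch of what a complete submission could look like, reusing the reporter's API server URL, container image, and jar path; the main change is that the application jar is passed as the trailing primary resource rather than only through --jars:

      ./bin/spark-submit \
        --master k8s://https://api.kubernates.aws.phenom.local \
        --deploy-mode cluster \
        --name spark-pi \
        --class com.phenom.analytics.executor.SummarizationJobExecutor \
        --conf spark.executor.instances=5 \
        --conf spark.kubernetes.container.image=phenommurali/spark_new \
        hdfs://test-dev.com:8020/user/spark/jobs/Test_jar_without_jars.jar

      Note that in Spark 2.3.x, remote hdfs:// dependencies are fetched by an init-container, which appears to be what the "spark-init-properties" volume and the missing "...-init-config" ConfigMap in the error belong to.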


    People

    • Assignee: Unassigned
    • Reporter: kumpatla murali krishna (kmkrishna1223@gmail.com)
    • Votes: 0
    • Watchers: 3

    Dates

    • Created:
    • Updated:
    • Resolved: