Apache Airflow / AIRFLOW-7052

Spark 3.0.0 does not work with SparkSubmitOperator


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.10.9
    • Fix Version/s: None
    • Component/s: operators
    • Labels: None

      Description

From Slack:

If anyone runs into this in the future: I've found where the issue is, in spark_submit_hook.py, line 419:

```
match_exit_code = re.search(r'\s*Exit code: (\d+)', line)
```

In Spark 3.0 the log line that prints the exit code actually uses a lowercase "e" ("exit code:"), so this re.search will never find the value. To fix this you can simply switch the line to:

```
match_exit_code = re.search(r'\s*Exit code: (\d+)', line, re.IGNORECASE)
```

which should also be backwards compatible.

MattD
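
For illustration, here is a minimal standalone sketch of the case-sensitivity problem MattD describes. The two log lines are assumed approximations of the driver-pod status output in each Spark version, not verbatim Spark output:

```
import re

# Assumed log lines for illustration only: the Kubernetes driver-pod status
# line as Spark 2.x prints it, and the lowercase variant reported for Spark 3.0.
spark2_line = "\t Exit code: 0"
spark3_line = "\t exit code: 0"

old_pattern = re.compile(r'\s*Exit code: (\d+)')
new_pattern = re.compile(r'\s*Exit code: (\d+)', re.IGNORECASE)

for line in (spark2_line, spark3_line):
    print(bool(old_pattern.search(line)), bool(new_pattern.search(line)))

# Prints:
#   True True    (Spark 2.x line: both patterns match)
#   False True   (Spark 3.0 line: only the IGNORECASE pattern matches)
# i.e. the proposed fix covers both versions, so it is backwards compatible.
```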
I'm having some difficulty understanding why my spark-submit task is being marked as failed even though the Spark job completed successfully. I see these logs at the end of the job:

```
exit code: 0
termination reason: Completed
```

But right after that it also displays this:

```
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 966, in _run_raw_task
    result = task_copy.execute(context=context)
  File "/usr/local/lib/python3.7/site-packages/airflow/contrib/operators/spark_submit_operator.py", line 187, in execute
    self._hook.submit(self._application)
  File "/usr/local/lib/python3.7/site-packages/airflow/contrib/hooks/spark_submit_hook.py", line 403, in submit
    self._mask_cmd(spark_submit_cmd), returncode
airflow.exceptions.AirflowException: Cannot execute: spark-submit (spark submit args would be here) Error code is: 0.
```

I took a look at spark_submit_hook.py line 403, and it shouldn't be throwing that exception if the error code is 0. Anyone have any ideas? This only happens now that I've switched to Spark 3.0; I never ran into it with Spark 2.4.5. (Also running Airflow 1.10.9.)
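
Putting the two reports together: in Kubernetes mode the hook tracks the driver pod's exit code separately from the spark-submit return code. When the regex never matches (as with Spark 3.0's lowercase log line), that tracked value stays None, and None != 0 evaluates to True, so the exception fires even though returncode is 0; hence the confusing "Error code is: 0." message. Below is a simplified sketch of that logic, paraphrased from the Airflow 1.10.x spark_submit_hook.py (class and method names approximate, AirflowException replaced by RuntimeError to keep it self-contained):

```
import re


class SparkSubmitHookSketch:
    """Simplified paraphrase of airflow.contrib.hooks.spark_submit_hook
    (Airflow 1.10.x); not the verbatim source."""

    def __init__(self, is_kubernetes=True):
        self._is_kubernetes = is_kubernetes
        self._spark_exit_code = None  # stays None if no log line ever matches

    def _process_spark_submit_log(self, itr):
        for line in itr:
            if self._is_kubernetes:
                # Spark 2.x logs "Exit code: 0"; Spark 3.0 logs
                # "exit code: 0", so this pattern never matches there.
                match_exit_code = re.search(r'\s*Exit code: (\d+)', line)
                if match_exit_code:
                    self._spark_exit_code = int(match_exit_code.groups()[0])

    def check(self, returncode):
        # With Spark 3.0, _spark_exit_code is still None here, and
        # None != 0 is True, so the exception is raised even though the
        # spark-submit return code is 0, producing "Error code is: 0."
        if returncode or (self._is_kubernetes and self._spark_exit_code != 0):
            raise RuntimeError("Cannot execute: ... Error code is: %s." % returncode)


hook = SparkSubmitHookSketch()
hook._process_spark_submit_log(["\t exit code: 0"])  # lowercase: no match
hook.check(returncode=0)  # raises despite a successful job
```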

            People

    • Assignee: Unassigned
    • Reporter: toopt4 (t oo)
    • Votes: 0
    • Watchers: 2
