Apache Airflow / AIRFLOW-4719

Scheduler: 'airflow scheduler' fails to make opt directory


    Details

    • Type: Bug
    • Status: Open
    • Priority: Major
    • Resolution: Unresolved
    • Affects Version/s: 1.10.3
    • Fix Version/s: None
    • Component/s: scheduler
    • Labels:
      None
    • Environment:
      In a fedora 29 Singularity container

      Description

I hit a strange error when running `airflow scheduler`. Python fails with:

          Process DagFileProcessor0-Process:
          Traceback (most recent call last):
            File "/usr/lib64/python3.7/multiprocessing/process.py", line 297, in _bootstrap
              self.run()
            File "/usr/lib64/python3.7/multiprocessing/process.py", line 99, in run
              self._target(*self._args, **self._kwargs)
            File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/jobs.py", line 381, in helper
              set_context(log, file_path)
            File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 170, in set_context
              handler.set_context(value)
            File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
              local_loc = self._init_file(filename)
            File "/opt/venv/tensorflow-1.13/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 141, in _init_file
              os.makedirs(directory)
            File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
              makedirs(head, exist_ok=exist_ok)
            File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
              makedirs(head, exist_ok=exist_ok)
            File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 211, in makedirs
              makedirs(head, exist_ok=exist_ok)
            [Previous line repeated 5 more times]
            File "/opt/venv/tensorflow-1.13/lib64/python3.7/os.py", line 221, in makedirs
              mkdir(name, mode)
          OSError: [Errno 30] Read-only file system: '/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow/logs/scheduler/2019-05-31/../../../../../../../opt'

      That very last line shows the problem.

      Airflow is attempting to create a directory one level above the one I own: the relative path `../../../../../../../opt` resolves to `/remote/XXX/opt`. `/remote/XXX/rlugg` is my directory, but `/remote/XXX` is not writable by me.
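For context, here is a plausible reconstruction of how such a traversal path can arise. This is an assumption about the log handler's internals, not something verified against the Airflow 1.10.3 source: if the per-file log location is derived with `os.path.relpath` against the DAG folder, then any processed file living *outside* that folder (e.g. an example DAG shipped inside the virtualenv under `/opt`) produces a chain of `..` segments that escapes the log tree:

```python
import os

# Hypothetical reconstruction (paths assumed from the traceback, the
# example DAG filename is invented for illustration).
log_dir = ("/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2"
           "/airflow/logs/scheduler/2019-05-31")
dag_folder = ("/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2"
              "/airflow/dags")
# A file the scheduler processes that lives outside the DAG folder:
dag_file = ("/opt/venv/tensorflow-1.13/lib/python3.7/site-packages"
            "/airflow/example_dags/example_bash_operator.py")

# relpath climbs out of the DAG folder with ".." segments:
rel = os.path.relpath(dag_file, dag_folder)
print(rel)  # ../../../../../../../opt/venv/...

# Joining that onto the log directory yields a path outside the log tree:
directory = os.path.dirname(os.path.join(log_dir, rel))
print(os.path.normpath(directory))  # resolves under /remote/XXX/opt/...
# os.makedirs(directory) would then try to mkdir inside /remote/XXX,
# which matches the read-only location in the traceback above.
```

If that is what happens, changing `base_log_folder` would not help, because the `..` chain comes from the relative-path computation, not from the log folder setting.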

      I use AIRFLOW_HOME to point to `/remote/XXX/rlugg/machine_learning/20190530_airflow_learn2/airflow`.
      I've tried changing airflow.cfg as well as setting the environment variable `export AIRFLOW__CORE__BASE_LOG_FOLDER=/x/y/z`, yet the same error (with exactly the same directory) appears.

      I am running within a Singularity container if that's relevant.
