Errno 13 Permission denied when Airflow tries to write to logs


Solution 1

Permissions on a bind-mounted folder can also result in this error.

For example:

docker-compose.yml (abridged):

    service_name:
      ...
      volumes:
        - /home/user/airflow_logs:/opt/airflow/logs

Grant permissions on the local folder so that the Airflow container can write logs, create directories as needed, etc.:

 sudo chmod u=rwx,g=rwx,o=rwx /home/user/airflow_logs
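
This is equivalent to mode 777 on the folder. One quick way to confirm the container can now write there (service_name is the placeholder from the snippet above; substitute your real service name):

    # service_name is a placeholder; use your actual compose service
    docker-compose exec service_name touch /opt/airflow/logs/.write_test
    docker-compose exec service_name rm /opt/airflow/logs/.write_test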

Solution 2

I also had the same problem, using Apache Airflow 1.10.7.

Traceback (most recent call last):
  File "/usr/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/jobs/scheduler_job.py", line 135, in _run_file_processor
    set_context(log, file_path)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/logging_mixin.py", line 198, in set_context
    handler.set_context(value)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 65, in set_context
    local_loc = self._init_file(filename)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/site-packages/airflow/utils/log/file_processor_handler.py", line 148, in _init_file
    os.makedirs(directory)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 211, in makedirs
    makedirs(head, exist_ok=exist_ok)
  [Previous line repeated 5 more times]
  File "/home/radifar/.virtualenvs/airflow/lib/python3.7/os.py", line 221, in makedirs
    mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: '/media/radifar/radifar-dsl/Workflow/Airflow/airflow-home/logs/scheduler/2020-01-04/../../../../../../../home'

After checking how file_processor_handler.py works, I found that the error was caused by the example DAGs living in a different location than the configured DAG folder: the handler builds each log path relative to the DAG file, so a DAG outside the DAG folder produces a chain of ../ segments that climbs out of the logs directory. In my case, 7 folders above the 2020-01-04 folder is /media/radifar; in your case, 4 folders above the 2019-12-18 folder is /usr/local. Airflow then tries to create a log directory in a location it cannot write to, which is why the PermissionError was raised.

I was able to solve this by cleaning out the AIRFLOW_HOME folder, running airflow version (which regenerates a fresh airflow.cfg), setting load_examples to False in airflow.cfg, and then running airflow initdb. After that I could use Airflow without errors.
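
A minimal sketch of those steps on Airflow 1.10.x (assumes AIRFLOW_HOME is set and the folder contains nothing you need to keep):

    rm -rf "$AIRFLOW_HOME"/*        # clean out the old config, logs, and metadata DB
    airflow version                 # regenerates a fresh airflow.cfg
    sed -i 's/^load_examples = True/load_examples = False/' "$AIRFLOW_HOME/airflow.cfg"
    airflow initdb                  # Airflow 1.10.x; Airflow 2.x uses `airflow db init` instead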

Solution 3

Just for anyone with the same issue...

Surprisingly, I had to take a look at the Airflow documentation... and according to it:

On Linux, the mounted volumes in container use the native Linux filesystem user/group permissions, so you have to make sure the container and host computer have matching file permissions.

mkdir -p ./dags ./logs ./plugins                      # folders bind mounted into the container
echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env  # host UID/GID the container should run as
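
For context, the stock docker-compose.yaml consumes those two variables roughly like this (a paraphrased excerpt, not the complete file; the exact defaults vary between Airflow versions):

    # excerpt only; see the docker-compose.yaml shipped with your Airflow version
    x-airflow-common:
      &airflow-common
      user: "${AIRFLOW_UID:-50000}:${AIRFLOW_GID:-0}"
      volumes:
        - ./dags:/opt/airflow/dags
        - ./logs:/opt/airflow/logs
        - ./plugins:/opt/airflow/plugins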

Once you have matched file permissions:

docker-compose up airflow-init
docker-compose up
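
Once the scheduler has written something, files under ./logs should belong to your host user rather than root; a quick sanity check:

    ls -ln ./logs    # new entries should show your UID, not 0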

Solution 4

I solved the issue: in my case the problem was that the bind-mounted folders, logs and dags, didn't have write permission. I added it with

chmod -R 777 dags/
chmod -R 777 logs/

and in the docker-compose file they are mounted as

    volumes:
      - ./dags:/opt/bitnami/airflow/dags
      - ./logs:/opt/bitnami/airflow/logs
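
If you would rather not open the folders to everyone, a narrower alternative is to hand ownership to the container's user instead. The 1001 below is an assumption based on the usual Bitnami non-root UID, so verify it for your image first:

    docker-compose exec <service> id    # check which UID the container actually runs as
    sudo chown -R 1001 dags/ logs/      # then chown to that UID instead of chmod 777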

Author: phenderbender

Updated on June 04, 2022

Comments

  • phenderbender almost 2 years

    We're running into a permission error when using Airflow, receiving the following error:

    PermissionError: [Errno 13] Permission denied: '/usr/local/airflow/logs/scheduler/2019-12-18/../../../../home
    

    We've tried using chmod 777 -R on the /usr/local/airflow/logs/scheduler directory within the container, but this doesn't seem to have done the trick.

    We have this piece in our entrypoint.sh script:

    export AIRFLOW__CORE__BASE_LOGS_FOLDER="/usr/local/airflow/logs"

    Has anyone else run into this airflow log permission issue? Can't seem to find much about this one in particular online.

  • rwitzel about 4 years
    For me load_examples = False was enough to fix the problem.
  • cdabel about 4 years
    Turning off examples also resolved the error for me.
  • mockash almost 4 years
    I have set load_examples = False but I am still getting the Permission denied error. airflow initdb and the Airflow UI worked perfectly until yesterday, but today I am getting this error. Can someone please help?
  • mockash over 3 years
    @alex yes, I solved it. For me it was related to permission issues; I changed the permissions to 777 and that helped. What error are you getting now?
  • alex over 3 years
    Ok, interesting - so you changed permissions for the /scheduler logs folder? I posted my issue here: stackoverflow.com/questions/63510335/… - this is Airflow running on Kubernetes.
  • Lucas Thimoteo over 3 years
    I had the same issue and solved it with the command sudo chmod -R 777 /home/user/airflow_logs. It is also worth mentioning that this can be applied to any folder one is trying to export from the container.
  • Radek about 2 years
    Thank you, this worked for me. If you don't set AIRFLOW_UID, all files will be created as the root user, which causes permission issues.