Cannot connect to the Docker daemon at unix:///var/run/docker.sock in GitLab CI


Ahh, that's my favorite topic: using Docker for GitLab CI. The problem you are experiencing is better known as Docker-in-Docker.

Before configuring it, you may want to read this brilliant post: http://jpetazzo.github.io/2015/09/03/do-not-use-docker-in-docker-for-ci/

That will give you an understanding of what the problem is and which solution fits you best. Generally there are two major approaches: actually installing a Docker daemon inside Docker, and sharing the host's daemon with the containers. Which approach to choose depends on your needs.

In GitLab you can go several ways; I will just share our experience.

Way 1 - using docker:dind as a service.

It is pretty simple to set up. Just add docker:dind as a service in your .gitlab-ci.yml file and use the docker:latest image for your jobs:

image: docker:latest  # this sets default image for jobs
services:
  - docker:dind
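
For completeness, here is a rough sketch of what a fuller .gitlab-ci.yml for this approach can look like. The variables block and the need for a privileged runner are not from the original answer - they are the commonly documented settings for recent docker:dind images, so adjust them to your runner and dind version.

image: docker:latest

services:
  - docker:dind

variables:
  # docker:dind 19.03+ enables TLS by default; these are the usual documented values
  DOCKER_HOST: tcp://docker:2376
  DOCKER_TLS_CERTDIR: "/certs"
  DOCKER_TLS_VERIFY: 1
  DOCKER_CERT_PATH: "$DOCKER_TLS_CERTDIR/client"

build:
  stage: build
  script:
    - docker info                        # should now reach the dind service
    - docker build -t my-docker-image .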

Pros:

  • simple to set up
  • simple to run: your source code is available to your job in the working directory by default, because it is pulled directly onto your Docker runner

Cons: you have to configure a Docker registry for that service, otherwise your Dockerfiles get built from scratch every time the pipeline starts. For me that was unacceptable, because it can take more than an hour depending on how many images you build. A registry-based caching sketch follows below.
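
As an illustration only (not from the original answer), caching through the GitLab Container Registry usually looks roughly like this - the $CI_REGISTRY* variables are GitLab's predefined ones, and the :latest tag scheme is just an example:

build:
  stage: build
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    # pull the previously pushed image so its layers can be reused as build cache
    - docker pull "$CI_REGISTRY_IMAGE:latest" || true
    - docker build --cache-from "$CI_REGISTRY_IMAGE:latest" -t "$CI_REGISTRY_IMAGE:latest" .
    - docker push "$CI_REGISTRY_IMAGE:latest"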

Way 2 - sharing the host Docker daemon's /var/run/docker.sock

We set up our own Docker executor with a Docker daemon and shared the socket by adding it to the /etc/gitlab-runner/config.toml file. That way we made our machine's Docker daemon available to the Docker CLI inside the containers. Note - you DON'T need privileged mode for the executor in this case.
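
The answer does not include the actual runner config, but the socket is typically shared through the volumes option of the [runners.docker] section in config.toml; a rough sketch, with placeholder names and token:

concurrent = 1

[[runners]]
  name = "docker-socket-runner"          # placeholder
  url = "https://gitlab.example.com/"    # your GitLab instance
  token = "RUNNER_TOKEN"                 # placeholder
  executor = "docker"
  [runners.docker]
    image = "docker:latest"
    privileged = false                   # not needed when sharing the socket
    volumes = ["/var/run/docker.sock:/var/run/docker.sock", "/cache"]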

After that we can use both docker and docker-compose in our custom Docker images. Moreover, we don't need a special Docker registry, because in this case the executor's local image cache is shared among all containers.

Cons

You need to somehow pass the sources to your containers in this case, because they are checked out only in the job container on the executor, not in the containers launched from it. We settled on cloning them with a command like git clone $CI_REPOSITORY_URL --branch $CI_COMMIT_REF_NAME --single-branch /project


Comments

  • Majid Rajabi almost 2 years

    I looked at other questions but couldn't find a solution for my case. I am setting up CI in GitLab and use GitLab's shared runners. In the build stage I use the docker image as the base image, but when I run a docker command it says:

    Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?

    I looked at this topic but still don't understand what I should do.

    .gitlab-ci.yml:

    stages:
      - test
      - build
      - deploy
    
    job_1:
      image: python:3.6
      stage: test
      script:
        - sh ./sh_script/install.sh
        - python manage.py test -k
    
    job_2:
      image: docker:stable
      stage: build
      before_script:
        - docker info
      script:
        - docker build -t my-docker-image .
    

    I know that the GitLab runner must be registered to use Docker and share /var/run/docker.sock, but how do I do this when using GitLab's own runners?

  • Majid Rajabi over 5 years
    In the second way, how can I access /etc/gitlab-runner/config.toml?
  • grapes over 5 years
    As I mentioned, for the second approach we use our own Docker executor (on a VPS). If you use the runners provided by GitLab, you won't be able to share the socket, sorry.
  • Deepak Poojari about 4 years
    How do we make these configurations on a Kubernetes runner created using gitlab.com?
  • Ravindra Kushwaha over 3 years
    Hi @grapes, please help me with the same problem; please check this link: stackoverflow.com/q/64587625/3946958
  • thahgr about 2 years
    Sure, this is nice, but what if you are just a user and the admin refuses to do this?