Solution for Multiple Docker Images in one Gitlab Repository

This isn’t the most elegant solution, but it is a solution, and it’s working at the moment.

I try to keep my projects public where I can, so you can view the project here: https://gitlab.com/dxcker/docker-image-by-folder

How this helps

Essentially I have a group called dxcker (https://gitlab.com/dxcker), which is where I keep all of the docker images that I use elsewhere.

All of them use the project name to describe the purpose of the image (e.g. alpxne is an alpine-based utility image), whereas the tag denotes some more specific functionality (e.g. jekyll is for building jekyll pages, and doesn’t include terraform binaries).
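Concretely, an image reference under this scheme looks something like this (assuming the default Gitlab registry path, using the alpxne/jekyll pair from above):

registry.gitlab.com/dxcker/alpxne:jekyll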

The system has been built over time, with large gaps between efforts. It is a little chaotic but is slowly harmonizing.

Standardizing the build process is a big step toward fixing this all up.

Now for any docker image, I can dump the exact same .gitlab-ci.yml into each new project. Then, when I want to create a new ‘tag’, I can just create a new directory.

This saves me from playing with tags and the like. I can work entirely from the main branch, and not complicate my personal workflows too much.
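As a sketch, adding a new ‘tag’ to an existing image project is nothing more than (the terraform name here is hypothetical):

$ mkdir terraform
$ vi terraform/Dockerfile
$ git add terraform
$ git commit -m 'Add terraform variant'
$ git push origin main

On the next pipeline, CI builds and pushes ${CI_REGISTRY_IMAGE}:terraform without me touching anything else.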

The Central Automation Repo

I have one project that I consider the “Central Automation Repo”. This contains all of the generic scripts relevant to a specific purpose - in this case, building containers.

The automation for this makes some assumptions, but it’s not too bad. It will find all Dockerfiles, enter the containing directory, and build the image.

In the next step, it will deploy them to the Gitlab Container Registry.

variables:
  IMAGE_TAG: $CI_REGISTRY_IMAGE:$CI_COMMIT_REF_SLUG
  LATEST_TAG: $CI_REGISTRY_IMAGE:latest
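  # CRPAT is a personal access token stored as a group-level CICD variable (explained below)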
  GITLAB_TOKEN: $CRPAT

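# Hidden base job: the jobs below extend this to get the registry login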
.pre:
  before_script:
    - HERE=$(pwd)
    - docker login -u ${CI_REGISTRY_USER} -p ${GITLAB_TOKEN} $CI_REGISTRY

.build:
  extends: .pre
  image: docker:latest
  when: always
  script:
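    # Find every Dockerfile, step into its directory, and build an image tagged with the directory name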
    - find . -name 'Dockerfile' -exec realpath {} \; | while read -r df ; do dir=$(dirname "$df") ; ( cd "$dir" && docker build -t "${CI_REGISTRY_IMAGE}:$(basename "$dir")" . ) ; done

.deploy:
  extends: .pre
  image: docker:latest
  when: on_success
  script:
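    # Push each per-directory image; again, the directory name is the tag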
    - find . -name 'Dockerfile' -exec realpath {} \; | while read -r df ; do dir=$(dirname "$df") ; docker push "${CI_REGISTRY_IMAGE}:$(basename "$dir")" ; done

.build-latest:
  extends: .pre
  image: docker:latest
  when: always
  script:
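    # Build whatever is in the 'latest' directory; PACKAGE isn't defined in this file - it's presumably supplied by the including project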
    - cd latest
    - docker build -t ${CI_REGISTRY_IMAGE}:${PACKAGE} .


.deploy-latest:
  extends: .pre
  image: docker:latest
  when: always
  script:
    - docker push ${CI_REGISTRY_IMAGE}:${PACKAGE}
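One assumption worth flagging: these jobs need a Docker daemon to talk to. If your runner doesn’t already provide one (gitlab.com shared runners don’t), you’d also declare the Docker-in-Docker service on each job, something like:

  services:
    - docker:dind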

One important thing to note is that I have a Gitlab token generated for my account and added as a CICD variable to the group. My automation uses this token - and thus, my account - to upload/update/whatever things in my Gitlab repos automatically.

This isn’t the most secure way of doing things, but it’s working. Just make sure to mask it.
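A somewhat safer alternative, if a project only needs to push to its own registry, is the per-job token that Gitlab already injects - no personal token required:

    - docker login -u gitlab-ci-token -p ${CI_JOB_TOKEN} $CI_REGISTRY

That token is scoped to the running pipeline, so there’s nothing long-lived to leak.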

Docker Projects

I expect all of my other projects that build Docker images to contain the following structure:

.
├── README.md
├── java
│   └── Dockerfile
├── latest
└── papermc
    └── Dockerfile

You’ll notice that there are two Dockerfiles, each in a different directory. I treat the directory name as the ‘tag’ name.
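So with the tree above, a pipeline run would push something like this (the minecraft project name is hypothetical):

registry.gitlab.com/dxcker/minecraft:java
registry.gitlab.com/dxcker/minecraft:papermc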

The .gitlab-ci.yml that pulls in the central automation repo’s jobs is this:

$ cat .gitlab-ci.yml
include:
  - project: 'dxcker/docker-image-by-folder'
    ref: main
    file: 'includer.yml'

stages:
  - build
  - deploy

build:
  extends: .build
  stage: build

deploy:
  extends: .deploy
  stage: deploy
  only: 
    - main

This creates jobs in each stage by extending the templates from the central repo (I could consolidate this down further tbh), and runs them in the Docker project.
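For what it’s worth, that consolidation would just mean defining concrete build/deploy jobs (stages and all) in includer.yml itself, so each Docker project’s file would shrink to the include alone - a sketch, assuming the stage names stay as-is:

$ cat .gitlab-ci.yml
include:
  - project: 'dxcker/docker-image-by-folder'
    ref: main
    file: 'includer.yml'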

You just dump this .gitlab-ci.yml file into each Docker repo and it will do the rest. It’s pretty neat, and I’m not in a ‘wordsing’ mood, so consider this post as a stub for something better, which I’ll likely never do. glhf.