Digger has two main components: the CLI, which runs within your CI, and the orchestrator backend. The orchestrator enables parallelism, policies and other features that require centralisation.

This guide describes self-hosting the orchestrator backend with docker-compose, aka “rolling your own”. Self-hosting can come in handy when using the cloud-based version is not an option (for example, due to compliance requirements). More guides (binary, EC2, etc.) are coming soon.

This guide is about self-hosting Digger Community Edition, which is free and open source (Apache 2.0 license; repo on GitHub). You do not need a license or an account on Digger Cloud to self-host Digger CE. If you require enterprise-grade features like GitHub Enterprise Server support, SSO / SAML, or multi-team / multi-org support, talk to us.

Prerequisites

  • docker and docker-compose installed on your machine

Using docker-compose is the easiest way to try Digger quickly on a single machine. Note: this setup is not meant for production; it is a way to try out the Digger backend in a few minutes.

Set up the docker-compose file

Create a docker-compose.yml file in your directory with the following contents:

# docker-compose.yml
version: '3.7'

services:
  # Postgres database used by the Digger backend
  postgres:
    image: postgres:alpine
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_PASSWORD=root
    healthcheck:
      test: [ "CMD-SHELL", "pg_isready -U postgres" ]
      interval: 5s
      timeout: 5s
      retries: 5

  # Digger orchestrator backend; reads its configuration from .env
  web:
    links:
      - postgres
    depends_on:
      postgres:
        condition: service_healthy
    image: "registry.digger.dev/diggerhq/digger_backend:latest"
    env_file:
      - .env
    ports:
      - "3000:3000"
Set initial environment variables

Create a .env file in the same directory as docker-compose.yml (it is referenced by env_file above) and set the following variables:

GITHUB_ORG=your_github_org
HOSTNAME=your_public_digger_hostname
BEARER_AUTH_TOKEN=$(openssl rand -base64 12)
DATABASE_URL=postgres://postgres:root@postgres:5432/postgres?sslmode=disable
HTTP_BASIC_AUTH=1
HTTP_BASIC_AUTH_USERNAME=myorg
HTTP_BASIC_AUTH_PASSWORD=$(openssl rand -base64 12)
ALLOW_DIRTY=false # set to true if the database already has a schema configured
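
Note that docker-compose reads the values in a .env file literally: command substitutions like $(openssl rand -base64 12) are not evaluated there. One way around this, as a sketch, is to generate the secrets in your shell and write the already-expanded values into .env:

# Write .env with the random secrets expanded by the shell at write time
cat > .env <<EOF
GITHUB_ORG=your_github_org
HOSTNAME=your_public_digger_hostname
BEARER_AUTH_TOKEN=$(openssl rand -base64 12)
DATABASE_URL=postgres://postgres:root@postgres:5432/postgres?sslmode=disable
HTTP_BASIC_AUTH=1
HTTP_BASIC_AUTH_USERNAME=myorg
HTTP_BASIC_AUTH_PASSWORD=$(openssl rand -base64 12)
ALLOW_DIRTY=false
EOF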

Start the service

docker-compose up

You should now see the initial database migrations run and tables being created. Go to your_digger_hostname:3000/projects; you should see the dashboard. The setup is not complete yet!
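
If you prefer to verify from the command line, a quick probe with curl is one option. The snippet below is a sketch that assumes you are on the host itself and uses the HTTP basic auth credentials from your .env; replace the placeholder password with your generated value:

# Probe the dashboard endpoint with the basic auth credentials from .env;
# -i prints the response status so you can confirm the backend is up
curl -i -u myorg:your_generated_password http://localhost:3000/projects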

Create GitHub app

Go to your_digger_hostname:3000/github/setup and click the Setup button. Follow the instructions on GitHub to install the app to the repositories in your organisation.

After the setup is done, GitHub should redirect you to the summary page at your_digger_hostname:3000/github/exchange-code.

Copy the values from that page into the following environment variables in your .env file, then restart the service.

GITHUB_APP_ID=
GITHUB_APP_CLIENT_ID=
GITHUB_APP_CLIENT_SECRET=
GITHUB_APP_PRIVATE_KEY_BASE64=
GITHUB_WEBHOOK_SECRET=
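
As the name suggests, GITHUB_APP_PRIVATE_KEY_BASE64 takes the app's private key PEM base64-encoded into a single line. A sketch for producing that value and restarting the backend, assuming the key was downloaded as digger-app.private-key.pem (a placeholder filename):

# Base64-encode the GitHub App private key into a single line (GNU coreutils);
# on macOS, use: base64 -i digger-app.private-key.pem
base64 -w0 digger-app.private-key.pem

# After updating .env, recreate the web container so it picks up the new values
docker-compose up -d --force-recreate web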

Install the GitHub app

On the exchange-code page (see above), click the installation link at the top and select the repositories.

From here on, the steps are the same as in the Quickstart:

  • Create digger.yml (a minimal sketch follows this list)
  • Create workflow file
  • Create secrets
  • Create a PR to verify that Digger works
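
For reference only, a minimal digger.yml with a single project might be created as shown below; the project name production and directory prod are placeholders for your own repository layout:

# Minimal digger.yml with one project; adjust name and dir to your repository
cat > digger.yml <<'EOF'
projects:
  - name: production
    dir: prod
EOF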