Tags: postgresql, docker, docker-compose, psql

Docker container fails to connect to psql: "could not connect to server" with docker compose


I need to import the schema into the db in the container. I already had this whole thing working: I imported the db schema, made sure the app was up and running, made a commit, and started working on other aspects of the app. When I later tried to rebuild the image with docker compose up --build, I got this error:

psql: error: could not connect to server: No such file or directory
        Is the server running locally and accepting
        connections on Unix domain socket ...?

I tried to figure this out for a bit, then gave up and reset to the latest commit where it still worked... the issue persisted anyway...

docker-compose.yml:

version: '3'
services:
  db:
    tty: true
    image: postgres
    environment:
      POSTGRES_DB: fusionsolar
      POSTGRES_USER: fusionsolar
      POSTGRES_PASSWORD: fusionsolar
    ports: 
      - "5432:5432"
    extra_hosts:
    - "db:127.0.0.1"

  web:
    tty: true
    image: ghcr.io/a-lehmann-elektro-ag/fusionsolar_webplugin
    build: .
    volumes:
      - ./public:/mnt/public
    ports:
      - "3000:3000"
    depends_on:
      - db
    environment:
      DB_HOST: db
      POSTGRES_DB: fusionsolar
      POSTGRES_USER: fusionsolar
      POSTGRES_PASSWORD: fusionsolar
      POSTGRES_HOST_AUTH_METHOD: trust

volumes:
  dbdata:

Dockerfile:

RUN apt-get update -qq && apt-get install -y nodejs postgresql-client systemctl
WORKDIR /myapp

COPY . /myapp/

RUN bundle install


# Add a script to be executed every time the container starts.
COPY entrypoint.sh /usr/bin/
RUN chmod +x /usr/bin/entrypoint.sh
ENTRYPOINT ["entrypoint.sh"]
EXPOSE 3000

# Configure the main process to run when running the image
CMD ./start-dev.sh

start-dev.sh:

#!/bin/bash

DATABASE_NAME="fusionsolar"
DB_DUMP_LOCATION="./schema.sql"

echo "*** CREATING DATABASE ***"
psql "$DATABASE_NAME" < "$DB_DUMP_LOCATION";
echo "*** DATABASE CREATED! ***"


rm /src/tmp/pids/server.pid
bundle install --jobs 20 --retry 5

rails server -b 0.0.0.0 -p 3000

entrypoint.sh:

#!/bin/bash
set -e

# Remove a potentially pre-existing server.pid for Rails.
rm -f /myapp/tmp/pids/server.pid

# Then exec the container's main process (what's set as CMD in the Dockerfile).
exec "$@"

I tried deleting all the images and running docker system prune -a, but it didn't help. I also made sure that the db is actually running: I logged into the container and it was indeed up.
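
For reference, a check along these lines from the host (using the service name db from the compose file) is one way to confirm that:

# is the db container up and accepting connections?
docker compose exec db pg_isready -U fusionsolar
# can we actually open a session and list databases?
docker compose exec db psql -U fusionsolar -d fusionsolar -c '\l'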


Solution

  • When you run psql directly like that, it looks for a set of standard environment variables such as $PGHOST and $PGUSER; absent those variables, it will try to connect to the database over a Unix socket. Since you're in a separate Rails application container, that socket file won't exist.
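
    A quick way to see this from inside the web container (a diagnostic sketch; pg_isready ships with the postgresql-client package the Dockerfile already installs):

    # the socket-based default fails here, because no PostgreSQL server runs in this container:
    psql fusionsolar
    # pointing the client at the db service over TCP works instead:
    pg_isready -h db -p 5432 -U fusionsolar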

    The easiest thing to do here is to use the Rails Active Record migration system. It will use the same database configuration you already have in your config/database.yml file.

    #!/bin/sh
    # entrypoint.sh
    
    # remove a stale pid file (if any)
    rm -f tmp/pids/server.pid
    
    # run database migrations (automatically on every container startup)
    bundle exec rails db:migrate
    
    # run the main container CMD, in a Bundler context
    exec bundle exec "$@"
    # (example: `CMD rails server -b 0.0.0.0 -p 3000`)
    

    If you do need to use psql separately, your script can read the Compose environment variables directly; you'll need to pass them to psql either as command-line options or via the more standard $PG* environment variables.

    PGPASSWORD="$POSTGRES_PASSWORD" \
    psql \
      -h "$DB_HOST" \
      -d "$POSTGRES_DB" \
      -u "$POSTGRES_USER" \
      -f schema.sql
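
    Equivalently, the same connection can be expressed purely through libpq's standard environment variables, for example:

    # libpq reads these automatically; the right-hand values come from the Compose environment
    PGHOST="$DB_HOST" \
    PGUSER="$POSTGRES_USER" \
    PGPASSWORD="$POSTGRES_PASSWORD" \
    PGDATABASE="$POSTGRES_DB" \
    psql -f schema.sql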