Tags: python, docker, pipeline, docker-py

Pipeline in docker exec from the command line and from the Python API


What I am trying to do is invoke mysqldump inside a container and dump the database into the container's own directory.

At first I tried the command below:

$ docker exec container-name mysqldump [options] database | xz > database.sql.xz

That didn't work, so I tried another one:

$ docker exec container-name bash -c 'mysqldump [options] database | xz > database.sql.xz'

This time it worked, but it feels clumsy.

Then I tried the same thing through docker-py; the cmd option that worked looks like this:

cmd=['bash', '-c', 'mysqldump [options]  database | xz > database.sql.xz']

The logged event looks like this:

level="info" msg="-job log(exec_start: bash -c mysqldump [options]  database | xz > database.sql.xz, fe58e681fec194cde23b9b31e698446b2f9d946fe0c0f2e39c66d6fe68185442, mysql:latest) = OK (0)"
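For reference, the reason the `bash -c` wrapper is needed is that `docker exec` starts a single process and does no shell parsing of its own, so pipes and redirections only work inside an explicit shell. A small sketch of how that cmd list can be built (the `[options]` placeholder is kept from above, not a real flag):

```python
def wrap_pipeline(pipeline):
    """Wrap a shell pipeline so docker exec / docker-py can run it.

    docker exec runs one binary with an argv list; pipes and
    redirections are shell features, so the whole pipeline has to
    be handed to bash as a single -c argument.
    """
    return ["bash", "-c", pipeline]

cmd = wrap_pipeline("mysqldump [options] database | xz > database.sql.xz")
# cmd == ['bash', '-c', 'mysqldump [options] database | xz > database.sql.xz']
```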

My question:

is there a more elegant way to achieve my goal?


Solution

  • You are almost there; you just need to add the -i flag to make the pipe work:

    -i, --interactive    Keep STDIN open even if not attached
    
    docker exec -i container-name mysqldump [options] database > database.sql
    

    I replaced the pipe with a plain file redirection, but it works the same with a pipe. Just make sure not to use the -t option, as that will break it.
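    With -i keeping STDIN open, the dump streams cleanly through docker exec, so the same thing can be driven from Python by reading the process's stdout and compressing it on the host. A minimal sketch (not from the original answer; the container and database names in the usage comment are placeholders):

```python
import lzma
import subprocess

def stream_to_xz(argv, out_path):
    """Run a command and stream its stdout into an xz-compressed file."""
    proc = subprocess.Popen(argv, stdout=subprocess.PIPE)
    with lzma.open(out_path, "wb") as out:
        # Copy in fixed-size chunks so a large dump never has to
        # fit in memory all at once.
        for chunk in iter(lambda: proc.stdout.read(64 * 1024), b""):
            out.write(chunk)
    if proc.wait() != 0:
        raise RuntimeError("command failed: %r" % (argv,))

# For the dump itself (placeholder names):
# stream_to_xz(["docker", "exec", "-i", "container-name",
#               "mysqldump", "database"], "database.sql.xz")
```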


    Extra:

    To import the SQL dump back into MySQL (mysql reads plain SQL, so decompress an xz dump on the way in):

    xz -dc database.sql.xz | docker exec -i container-name mysql [options] database
    

    This little script detects whether mysql is being run in a pipe and picks the right flags:

    #!/bin/bash
    # Allocate a TTY (-t) only when stdin is a terminal;
    # otherwise keep plain -i so pipes and redirections work.
    if [ -t 0 ]; then
        docker exec -it container-name mysql "$@"
    else
        docker exec -i container-name mysql "$@"
    fi
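
    The same terminal check is available in Python, which is handy when building the docker exec argv programmatically. A small sketch of the flag selection only (the container name in the usage comment is a placeholder):

```python
import sys

def docker_exec_flags(stdin_is_tty=None):
    """Mirror the bash script: -it when stdin is a terminal, -i otherwise."""
    if stdin_is_tty is None:
        stdin_is_tty = sys.stdin.isatty()
    return ["-it"] if stdin_is_tty else ["-i"]

# e.g. ["docker", "exec"] + docker_exec_flags() + ["container-name", "mysql"]
```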