
Running multiple projects using Docker, each with its own docker-compose


You can do this by combining services from multiple files using the extends feature of docker-compose. Put your projects in some well-defined location, and refer to them using relative paths:

../
├── foo/
│   └── docker-compose.yml
└── bar/
    └── docker-compose.yml

foo/docker-compose.yml:

base:
    build: .
foo:
    extends:
        service: base
    links:
        - db
db:
    image: postgres:9

If you wanted to test this project by itself, you would run something like:

sudo docker-compose up -d foo

Creating foo_foo_1

bar/docker-compose.yml:

foo:
    extends:
        file: ../foo/docker-compose.yml
        service: base
    links:
        - db
bar:
    build: .
    extends:
        service: base
    links:
        - db
        - foo
db:
    image: postgres:9

Now you can test both services together with:

sudo docker-compose up -d bar

Creating bar_foo_1
Creating bar_bar_1


I'm not 100% sure what you're asking, so this will be a broad answer.

1) Everything can be in the same compose file if it's running on the same machine or server cluster.

# proxy
haproxy:
  image: haproxy:latest
  ports:
    - 80:80
# setup 1
ubuntu_1:
  image: ubuntu
  links:
    - db1:mysql
  ports:
    - 80
db1:
  image: ubuntu
  environment:
    MYSQL_ROOT_PASSWORD: 123
# setup 2
ubuntu_2:
  image: ubuntu
  links:
    - db2:mysql
  ports:
    - 80
db2:
  image: ubuntu
  environment:
    MYSQL_ROOT_PASSWORD: 123

It's also possible to combine several yml files, like:

$ docker-compose -f [File A].yml -f [File B].yml up -d
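As a sketch of how that combination behaves (the file names and values below are hypothetical, not from the question): when two -f files define the same service, single-value keys from the later file win, and new keys are merged into the earlier definition.

```yaml
# docker-compose.common.yml -- hypothetical base file
web:
  image: ubuntu
  ports:
    - 80
---
# docker-compose.prod.yml -- hypothetical second -f file; its keys are merged
# over the base definition of `web`, so the result keeps the port mapping and
# gains the environment variable
web:
  environment:
    MODE: production
```

Running `docker-compose -f docker-compose.common.yml -f docker-compose.prod.yml up -d` would start `web` with both the port mapping and the environment setting.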

2) Every container in the build can be controlled separately with compose.
$ docker-compose stop|start|build ubuntu_1

3) Running $ docker-compose build will only rebuild services where changes have been made.

Here is more information that could be useful https://docs.docker.com/compose/extends/#extending-services

If none of the above answers your question, please post an example of your build.


This is our approach, for anyone else having the same problem:

Now each of our projects has a docker-compose.yml and can be run standalone. We have another project called 'development-kit' which clones the needed projects and stores them in a directory. We can run our projects using a command similar to:

python controller.py --run projectA projectB
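A minimal sketch of what such a controller could look like (the function names and internals here are my assumptions, not the actual development-kit code): parse the --run flag and build one docker-compose invocation per cloned project.

```python
# Hypothetical sketch of a controller.py; names and structure are assumptions.
import argparse

def parse_args(argv):
    """Parse `--run projectA projectB ...` into a list of project names."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--run", nargs="+", default=[], metavar="PROJECT")
    return parser.parse_args(argv)

def build_commands(projects):
    """One `docker-compose up -d` invocation per cloned project directory."""
    return [["docker-compose", "-f", "%s/docker-compose.yml" % name, "up", "-d"]
            for name in projects]

# The real controller would then hand each command list to subprocess.call().
```

With this sketch, `--run projectA projectB` maps to two `docker-compose up -d` calls, one per project directory.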

It runs each project using the docker-compose up command. Then, once all projects are up and running, it adds each project's main container IP to the other projects by appending entries to their /etc/hosts files, using commands like these:

# getting container id of projectA and projectB
CIDA = commands.getoutput("docker-compose ps -q %s" % projectA)
CIDB = commands.getoutput("docker-compose ps -q %s" % projectB)
# getting ip of container projectA
IPA = commands.getoutput("docker inspect --format '{{ .NetworkSettings.IPAddress }}' %s" % CIDA)

Now, to send requests from projectB to projectA, we only need to define projectA's IP as "projectA" in projectB's settings.
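The /etc/hosts step above can be sketched as a small helper (the function name is mine, not the controller's): given the inspected container IPs, it renders the lines that get appended to each other project's /etc/hosts, so containers can reach one another by project name.

```python
# Hypothetical helper; the real controller does this with shell commands.
def hosts_entries(project_ips, exclude=None):
    """Map {project_name: ip} to /etc/hosts lines, skipping `exclude` itself."""
    return ["%s\t%s" % (ip, name)
            for name, ip in sorted(project_ips.items())
            if name != exclude]
```

With the IPA value obtained from docker inspect above, an entry like `172.17.0.2<TAB>projectA` would be appended to projectB's /etc/hosts (but not to projectA's own).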