
Deploy Docker Containers from Docker Cloud


To solve this problem, I followed advice from StackOverflow user @MazelTov and built the containers on my local OSX development machine, published the images to Docker Cloud, then pulled those images down to my production server (AWS EC2) and ran them there.

Install Dependencies

I'll try to outline the steps I followed below in case they help others. Please note these steps require you to have docker and docker-compose installed on both your development and production machines. I used the GUI installer to install Docker for Mac.
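If the install worked, a quick version check on the development machine should confirm both tools are available (the exact version numbers will of course vary):

docker --version
docker-compose --version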

Build Images

After writing a Dockerfile and docker-compose.yml file, you can build your images with docker-compose up --build.
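For reference, a minimal development docker-compose.yml might look something like the sketch below, assuming a single web service built from a Dockerfile in the project root (the service name and port here are illustrative, not the exact files used in this project):

version: '2'
services:
  web:
    # build the image from the Dockerfile in the current directory
    build: .
    ports:
      - '7082:7082'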

Upload Images to Docker Cloud

Once the images are built, you can upload them to Docker Cloud with the following steps. First, create an account on Docker Cloud.

Then store your Docker Cloud username in an environment variable, so your ~/.bash_profile should contain export DOCKER_ID_USER='yaledhlab' (use your own username, though).
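Assuming a standard bash setup, the lines to add and then reload look like this (substitute your own Docker ID):

# add to ~/.bash_profile
export DOCKER_ID_USER='yaledhlab'
# reload the profile in the current shell
source ~/.bash_profile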

Next, log in to your account from your development machine:

docker login
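If you'd rather skip the interactive username prompt, docker login also accepts the username as a flag:

docker login -u $DOCKER_ID_USER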

Once you're logged in, list your running containers so you can see the name of the image each one was built from:

docker ps

This will display something like:

CONTAINER ID        IMAGE                          COMMAND                  CREATED             STATUS              PORTS                      NAMES
89478c386661        yaledhlab/let-them-speak-web   "/bin/sh -c 'npm run…"   About an hour ago   Up About an hour    0.0.0.0:7082->7082/tcp     letthemspeak_web_1
5e9c75d29051        training/webapp:latest         "python app.py"          4 hours ago         Up 4 hours          0.0.0.0:5000->5000/tcp     heuristic_mirzakhani
890f7f1dc777        bitnami/tomcat:latest          "/app-entrypoint.sh …"   4 hours ago         Up About an hour    0.0.0.0:8080->8080/tcp     letthemspeak_tomcat_service_1
09d74e36584d        mongo                          "docker-entrypoint.s…"   4 hours ago         Up About an hour    0.0.0.0:27017->27017/tcp   letthemspeak_mongo_service_1
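Note that docker ps only shows running containers; if the container you want to publish isn't currently running, you can list the locally built images directly:

docker images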

For each of the images you want to publish to Docker Cloud, run:

docker tag image_name $DOCKER_ID_USER/my-uploaded-image-name
docker push $DOCKER_ID_USER/my-uploaded-image-name

For example, to upload mywebapp_web to your user's account on Docker Cloud, you can run:

docker tag mywebapp_web $DOCKER_ID_USER/web
docker push $DOCKER_ID_USER/web

You can then run open https://cloud.docker.com/swarm/$DOCKER_ID_USER/repository/list to see your uploaded images.

Deploy Images

Finally, you can deploy your images on EC2 with the following steps. First, install Docker and Docker Compose on the Amazon Linux EC2 instance:

# install docker
sudo yum install docker -y
# start docker
sudo service docker start
# allow ec2-user to run docker
sudo usermod -a -G docker ec2-user
# get the docker-compose binaries
sudo curl -L https://github.com/docker/compose/releases/download/1.20.1/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose
# change the permissions on the source
sudo chmod +x /usr/local/bin/docker-compose
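As an optional sanity check, you can confirm the compose binary is executable and the daemon responds; the group change above only applies to new login sessions, so docker itself still needs sudo at this point:

docker-compose --version
sudo docker info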

Log out, then log back in to update your user's groups. Then start a screen session and run the server inside it: screen. Once the screen session starts, you can add a new docker-compose config file that specifies the path to your deployed images. For example, I needed to fetch the let-them-speak-web container housed within yaledhlab's Docker Cloud account, so I changed the docker-compose.yml file above to the file below, which I named production.yml:

version: '2'
services:
  tomcat_service:
    image: 'bitnami/tomcat:latest'
    ports:
      - '8080:8080'
    volumes:
      - docker-data-tomcat:/bitnami/tomcat/data/
      - docker-data-blacklab:/lts-app/lts/
  mongo_service:
    image: 'mongo'
    command: mongod
    ports:
      - '27017:27017'
  web:
    image: 'yaledhlab/let-them-speak-web'
    # gain access to linked containers
    links:
      - mongo_service
      - tomcat_service
    # explicitly declare service dependencies
    depends_on:
      - mongo_service
      - tomcat_service
    # set environment variables
    environment:
      PYTHONUNBUFFERED: 'true'
    ports:
      - '7082:7082'
    volumes:
      - docker-data-tomcat:/tomcat_webapps
      - docker-data-blacklab:/lts-app/lts/

volumes:
  docker-data-tomcat:
  docker-data-blacklab:

Then the production compose file can be run with docker-compose -f production.yml up. Finally, ssh in from another terminal and detach the screen with screen -D.
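If you haven't used screen much, the detach/reattach cycle looks roughly like this (the session name is arbitrary, and docker-compose's -d flag is an alternative to using screen at all):

# start a named screen session and launch the stack inside it
screen -S deploy
docker-compose -f production.yml up
# detach from inside the session with Ctrl-a d, or from another shell:
screen -d deploy
# reattach later
screen -r deploy
# alternatively, skip screen and run compose in detached mode
docker-compose -f production.yml up -d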


Yeah, that's true. Docker Cloud uses Docker Hub as its native registry for storing both public and private repositories. Once you push your images to Docker Hub, they are available in Docker Cloud.

Pulling images from Docker Hub is the opposite of pushing them. This works for both private and public repositories.

To download your images locally, I always export my Docker username to the shell session:

# export DOCKER_ID_USER="username"

In fact, I have this in my .bashrc profile.

Replace the value of DOCKER_ID_USER with your Docker Cloud username.

Then log in to Docker Cloud using the docker login command:

 $ docker login

This logs you in using your Docker ID, which is shared between Docker Hub and Docker Cloud.

You can now run the docker pull command to download your images locally:

$ docker pull image:tag
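For example, to pull the image pushed earlier in this thread (pushing without an explicit tag uses latest by default):

$ docker pull $DOCKER_ID_USER/web:latest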

This is applicable to any Cloud Platform, not specific to AWS.

As you're new to Docker, here is my recommendation of the best Docker guides, including Docker vs. VMs and advanced topics like working with Docker Swarm and Kubernetes.