
Writing csv files to local host from a docker container


Ok, gotcha. With the edit clarifying that you expect the CSV to end up on the host, we do have a problem with how this is set up.

You've got two VOLUMEs declared in your Dockerfile, which is fine. Without an explicit mapping at runtime, though, those become anonymous volumes managed by Docker: they persist data on the Docker host between containers going up and down, but you can't easily browse them from the host like a normal filesystem.
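If you want to see where those end up, a quick sketch (the volume names on your machine will be auto-generated hashes):

    # list the volumes Docker has created, including the anonymous
    # ones made by the VOLUME instructions in your Dockerfile
    docker volume ls

    # show where a given volume actually lives on the host; the name
    # here is a placeholder for one of the hashes from the listing
    docker volume inspect <volume_name> --format '{{ .Mountpoint }}'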

If you want the file to show up on your host, you can create a bind mount at runtime, which maps a path on your host filesystem to a path in the container's filesystem.

docker run -v $(pwd):/home/ec2-user/docker_test/data docker_test will do this. $(pwd) is shell syntax that expands to the current working directory you're running the command from, on a *nix system. Take care with that and adjust as needed (for example, if your host is Windows).

With a volume set up this way, when the CSV is created at the intended location in the container's filesystem, it will also be accessible on your host at whatever path you've mapped it to.
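Concretely, assuming your script writes its CSV under /home/ec2-user/docker_test/data inside the container (the host-side output directory name here is just an example):

    # make a host directory to receive the file
    mkdir -p output

    # bind mount it over the container's data directory
    docker run -v "$(pwd)/output:/home/ec2-user/docker_test/data" docker_test

    # once the container has written its CSV, it shows up on the host
    ls output/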

Read up on volumes. They're vital to using Docker, and not hard to grasp at first glance, but there are some gotchas in the details.


Regarding uploading to S3, I would recommend using the boto3 library and doing it in your Python script. You could also use something like s3cmd if you find that simpler.
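A minimal boto3 sketch (the file path, bucket name, and object key are placeholders; credentials are picked up from the environment or an attached IAM role):

    import boto3

    s3 = boto3.client("s3")

    # upload the CSV the script wrote; all three arguments here are
    # hypothetical examples, substitute your own values
    s3.upload_file(
        "/home/ec2-user/docker_test/data/output.csv",  # local file
        "my-bucket",                                   # S3 bucket name
        "reports/output.csv",                          # object key in S3
    )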


You could use S3FS Fuse to mount the S3 bucket as a drive inside your Docker container. This basically creates a folder on the filesystem that is actually the S3 bucket; anything you save or modify in that folder is reflected in the bucket.
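Roughly like this (the bucket name and mount point are placeholders, and the credentials file follows the standard s3fs format; note that a container needs FUSE access to do this, e.g. running with --cap-add SYS_ADMIN --device /dev/fuse):

    # s3fs credentials file in the format it expects (ACCESS_KEY:SECRET_KEY);
    # skip this if the container gets credentials from an IAM role instead
    echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > ${HOME}/.passwd-s3fs
    chmod 600 ${HOME}/.passwd-s3fs

    # mount the bucket; bucket name and mount point are placeholders
    mkdir -p /mnt/s3-bucket
    s3fs my-bucket /mnt/s3-bucket -o passwd_file=${HOME}/.passwd-s3fs

    # anything copied or written here now lands in the S3 bucket
    cp /home/ec2-user/docker_test/data/output.csv /mnt/s3-bucket/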

If you delete the Docker container or unmount the drive, the S3 bucket stays intact, so you don't need to worry too much about erasing its files through normal Docker use.