Dockerfile: COPY a list of files, when the list is taken from a local file
Not possible in the sense that the `COPY` directive allows it out of the box. However, if you know the extensions, you can use a wildcard for the path, such as `COPY folder*something*name somewhere/`.
For simple `requirements.txt` fetching, that could be:
```dockerfile
# but you need to distinguish the files somehow,
# otherwise they'll overwrite each other and only the last one is kept
# e.g. rename package/requirements.txt to package-requirements.txt
# and it won't be an issue
COPY */requirements.txt ./
RUN for item in $(ls requirement*); do pip install -r "$item"; done
```
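The rename mentioned in the comments can be done on the host before the build. A minimal sketch, assuming a layout where each package directory ships its own `requirements.txt` (the `flatten_requirements` helper is my name, not part of the original answer):

```python
import shutil
from pathlib import Path

def flatten_requirements(root: str = ".") -> list[str]:
    """Copy each <pkg>/requirements.txt to <root>/<pkg>-requirements.txt
    so they no longer collide under a single COPY directive."""
    copied = []
    for req in sorted(Path(root).glob("*/requirements.txt")):
        target = Path(root) / f"{req.parent.name}-requirements.txt"
        shutil.copy(req, target)
        copied.append(str(target))
    return copied
```

Run it in the build context before `docker build`; the Dockerfile can then use `COPY *-requirements.txt ./` without collisions.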
But if it gets a bit more complex (as in collecting only specific files by some custom pattern, etc.), then no. For that case, use templating instead: either a simple f-string, the `format()` function, or Jinja. Create a `Dockerfile.tmpl` (or whatever you'd like to name the temporary file), collect the paths, insert them into the templated Dockerfile, and once it's ready, dump it to a file and run `docker build` on it.
Example:
```dockerfile
# Dockerfile.tmpl
FROM alpine
{{replace}}
```
```python
# organize files into coherent structures
# so you don't have too many COPY directives
files = {
    "pattern1": [...],
    "pattern2": [...],
}

with open("Dockerfile.tmpl", "r") as file:
    text = file.read()

insert = "\n".join(
    f"COPY {' '.join(values)} destination/{key}/"
    for key, values in files.items()
)

with open("Dockerfile", "w") as file:
    file.write(text.replace("{{replace}}", insert))
```
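To build the `files` mapping for a custom pattern, a glob-based collector could look like this. This is a sketch under my own assumptions: the `collect` helper name and grouping by top-level directory are illustrative, and the paths are made relative to the build context so they are valid `COPY` sources:

```python
from pathlib import Path

def collect(root: str, pattern: str = "*/requirements.txt") -> dict[str, list[str]]:
    """Group paths under `root` matching `pattern`, keyed by their
    top-level directory, with values relative to the build context."""
    files: dict[str, list[str]] = {}
    for path in sorted(Path(root).glob(pattern)):
        rel = path.relative_to(root)
        files.setdefault(rel.parts[0], []).append(rel.as_posix())
    return files
```

The returned dict can be fed directly into the `COPY`-generating loop above.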
Alternatively, you can try passing the files as a build argument:

```dockerfile
FROM ...
ARG files
COPY ${files} ./
```
and run with:

```shell
docker build --build-arg files="$(cat list_of_files_to_copy.txt)" .
```