How to repeatedly pass arguments to a Python file
I would suggest passing a file location (e.g. 'names.txt') as the parameter to your Python script when calling it from the shell. Then, within the Python script, read the names from that file and work with them one by one.
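A minimal sketch of that idea, assuming the script receives the file path as its first command-line argument (the function name "load_names" is just illustrative):

```python
import sys

def load_names(path):
    """Read one name per line from the given file, skipping blank lines."""
    with open(path) as f:
        return [line.strip() for line in f if line.strip()]

if __name__ == "__main__" and len(sys.argv) > 1:
    # The file location (e.g. 'names.txt') arrives as the first shell argument:
    #   python script.py names.txt
    for name in load_names(sys.argv[1]):
        print(name)  # replace with the real per-name work
```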
To your second question: if you wrap the logic of script.py in a function (e.g. called script_function) which takes a parameter name, and call script_function(name) for each name in 'names.txt', you should keep your memory use down. The reason is that all the variables created in script_function are local to that function call and are deleted/replaced on the next call, when the next name is processed.
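A small sketch of that structure; the body of script_function here is a placeholder for whatever script.py actually does per name:

```python
def script_function(name):
    # Everything created here is local to this call; it is freed when
    # the function returns, so memory stays flat over many names.
    result = name.upper()  # stand-in for the real per-name logic
    return result

def process_all(names):
    # One call per name, as described above.
    return [script_function(n) for n in names]
```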
Try:

```shell
cat <file_with_list_of_names_1_per_line> | xargs -n 1 python script.py -n
```

The "-n 1" option tells xargs to launch one invocation of the script per name; without it, xargs packs as many names as possible into a single call.
That should do it. If not, try wrapping "python script.py -n" in a bash script. Something simple, say a script named "call_my_script.sh", which would contain:

```shell
#!/bin/bash
python script.py -n $1
```

Make it executable with "chmod +x call_my_script.sh", then call it with:

```shell
cat <file_with_list_of_names_1_per_line> | xargs -n 1 ./call_my_script.sh
# this calls "call_my_script.sh name" for each name in the file, one at a time
```
You could also use the "-P" option of "xargs" to run several instances in parallel (watch out for concurrent access to resources, like writing to the same output file or similar, which could produce strange results):

```shell
... | xargs -P <n> -n 1 ...
```

This runs up to "n" instances of the script in parallel; "-n 1" is needed alongside "-P", otherwise xargs makes a single large invocation and there is nothing to parallelize.
Side note, for anyone not familiar with "xargs": by default it splits its input on any whitespace, treating each word individually. If a line contained two (or more) words, "word1 word2", the script would be called two (or more) times, once per word. That might not be the expected behavior, so it is worth mentioning; with GNU xargs, "xargs -d '\n'" treats each whole line as a single argument instead.