
Bash command line and input limit


The limit on the length of a command line is not imposed by the shell, but by the operating system. It is usually on the order of a few hundred kilobytes. POSIX calls this limit ARG_MAX, and on POSIX-conformant systems you can query it with

$ getconf ARG_MAX    # Get argument limit in bytes

For example, on Cygwin this is 32000, and on the various BSD and Linux systems I use it is anywhere from 131072 to 2621440.

If you need to process a list of files exceeding this limit, you might want to look at the xargs utility, which calls a program repeatedly with a subset of arguments not exceeding ARG_MAX.
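For instance, to delete a very large number of files without hitting the limit, something like the following should work (a sketch; the name pattern is just a placeholder, and the -print0/-0 pair NUL-delimits the names so whitespace in filenames is safe):

$ find . -maxdepth 1 -name 'oldlog*' -print0 | xargs -0 rm    # xargs batches arguments below ARG_MAX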

To answer your specific question: yes, it is possible to attempt to run a command with too long an argument list. The shell will fail with a message along the lines of "Argument list too long".

Note that the input to a program (as read on stdin or any other file descriptor) is not limited (only by available program resources). So if your shell script reads a string into a variable, you are not restricted by ARG_MAX. The restriction also does not apply to shell builtins.
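To see the difference, you can build a string larger than the limit and hand it to a builtin and to an external command (a sketch; the 3 MB size is an arbitrary choice, picked to exceed the 2 MB ARG_MAX typical of Linux):

$ big=$(head -c 3000000 /dev/zero | tr '\0' 'x')   # a ~3 MB string held in a shell variable
$ echo "${#big}"                                   # builtin echo: works, prints 3000000
$ /bin/echo "$big" > /dev/null                     # external command: "Argument list too long"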


Ok, Denizens. I have accepted the command-line length limits as gospel for quite some time. So, what to do with one's assumptions? Naturally: check them.

I have a Fedora 22 machine at my disposal (meaning: Linux with bash 4). I created a directory containing 500,000 files, each with an 18-character name, for a total command-line length of 9,500,000 characters (18 characters plus a separating space per file). Created thus:

seq 1 500000 | while read -r digit; do touch "$(printf 'abigfilename%06d' "$digit")"; done

And we note:

$ getconf ARG_MAX
2097152

Note however I can do this:

$ echo * > /dev/null

But this fails:

$ /bin/echo * > /dev/null
bash: /bin/echo: Argument list too long
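Since printf is a builtin, the glob can still expand safely inside the shell, and xargs can re-batch the names below ARG_MAX (a sketch of the usual workaround):

$ printf '%s\0' * | xargs -0 /bin/echo > /dev/null    # the glob never crosses an exec boundary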

I can run a for loop:

$ for f in *; do :; done

which again never leaves the shell: for is a shell keyword and : is a builtin, so no exec takes place.

Careful reading of the documentation for ARG_MAX states: "Maximum length of argument to the exec functions." This means: without calling exec, there is no ARG_MAX limitation, which explains why shell builtins are not restricted by it.

And indeed, I can ls my directory as long as my argument list is 109,948 files long, or about 2,089,000 characters (give or take). Once I add one more file with an 18-character name, though, I get an Argument list too long error. So ARG_MAX is working as advertised: the exec fails with more than ARG_MAX characters on the argument list, including, it should be noted, the environment data.
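A rough way to estimate the headroom left for arguments is to subtract the size of the current environment from ARG_MAX (a sketch; env | wc -c only approximates the true cost, since each string also carries a pointer and a terminating NUL):

$ echo $(( $(getconf ARG_MAX) - $(env | wc -c) ))    # approximate bytes available for arguments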


There is an input buffer limit of something like 1024 bytes; paste more than that at a plain read and it will simply hang mid-paste. To solve this, use the -e option.

http://linuxcommand.org/lc3_man_pages/readh.html

-e use Readline to obtain the line in an interactive shell

Change your read to read -e and the annoying hang on long input lines goes away.
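A minimal sketch of the fix (the prompt text and variable name are just placeholders):

$ read -e -r -p 'Paste here: ' line    # -e uses Readline, so long pastes are handled
$ echo "${#line} characters read"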