I have the following command running on cron:
sudo find "$VOLUME1" "$VOLUME2" "$VOLUME3" "$VOLUME4" -type f -exec ls -lT {} + > $FILE
This command takes a couple of hours to complete, as it runs over about a million files.
When I run the top command, it shows 14(!!) different instances of ls running. Is this a bug in the script, or what is causing so many ls commands to run?
Because the length of the command line is limited, find cannot start a single ls instance with over a million arguments. It will instead spawn multiple ls processes, each with as many thousands of arguments as fit.
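You can see the same batching behaviour with xargs, which splits a long argument list across command invocations just as find -exec ... + does. This sketch (the file names are made up, nothing is touched on disk) feeds it roughly 3.6 MB of dummy names, more than fits on one command line, so /bin/echo has to run more than once:

```shell
# Generate 200,000 dummy "filenames" of ~18 bytes each (~3.6 MB total)
# and pass them to /bin/echo via xargs. Each echo invocation prints one
# line, so the line count equals the number of invocations xargs needed.
seq -f "file-%012.0f" 1 200000 | xargs /bin/echo | wc -l
```

On any system whose command-line limit is below 3.6 MB this prints a number greater than 1, confirming that the argument list was split into several invocations.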
The maximum command-line length on my machine is 2097152 bytes, which may well be about the combined length of the 1000000/14 ≈ 71,000 filenames passed to each ls:
$ getconf ARG_MAX
2097152
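As a rough sanity check (assuming about 1,000,000 files split evenly across the 14 observed ls invocations, both figures from the question), shell arithmetic gives the average filename length that would exactly fill a 2097152-byte command line:

```shell
# Bytes available per filename if ~71,000 names (1,000,000 / 14) must
# fit into a single 2097152-byte argument list.
echo $((2097152 / (1000000 / 14)))   # prints 29
```

About 29 bytes per filename (including its full path and a separator) is entirely plausible, so the numbers are consistent with find packing each ls invocation close to the ARG_MAX limit.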