I have a set of subdirectories, with files inside them:
├── dir1
│ ├── file_a.type
│ ├── file_b.type
│ ├── file_c.type
│ └── file_d.type
├── dir2
│ ├── file_e.type
│ ├── file_f.type
│ ├── file_g.type
│ └── file_h.type
└── README.md
I can guarantee the uniqueness of each file name. The directory naming convention is n[some unique random number].
I have a text file containing a subset of these file names:
file_g.type
file_a.type
file_e.type
I would like to copy all of the files matching the names in that text file to a new directory.
I have tried using xargs to copy, but this does not work because the files live in subdirectories:
xargs -a files.txt cp -t new_dir each
I could recursively copy all the files in the subdirectories to a new directory and work from there, but that is not possible due to disk space and bandwidth constraints.
What is an efficient way to do this using standard bash utilities?
If you have the file names in a file named files.txt:

while IFS= read -r file
do
    cp dir[12]/"$file" -t new_dir
done < files.txt

IFS= and -r keep read from stripping leading whitespace or interpreting backslashes in the names.
With xargs it can be done like this:

xargs -a files.txt -I{} bash -c 'cp dir[12]/"$1" -t new_dir' _ {}

Passing each name to bash as a positional parameter ($1), rather than splicing it into the command string with -IFILE, prevents file names containing quotes or $ from being interpreted by the shell.
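Since the real directories follow the n[number] convention rather than the dir1/dir2 of the example tree, a find-based variant avoids hard-coding a glob entirely. This is a sketch: the n101/n202 directory names and the sample files are assumptions made only so the snippet is self-contained and runnable.

```shell
#!/bin/sh
# Sketch: copy every file listed in files.txt out of the n<number>
# subdirectories into new_dir, without copying the whole tree first.
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Demo tree mimicking the question's layout (names are assumptions).
mkdir -p n101 n202 new_dir
touch n101/file_a.type n101/file_b.type n202/file_g.type
printf '%s\n' file_a.type file_g.type > files.txt

# The actual technique: look each name up with find and copy it.
# new_dir itself is pruned so already-copied files are not rematched.
while IFS= read -r name; do
    find . -path ./new_dir -prune -o -type f -name "$name" \
        -exec cp -t new_dir {} +
done < files.txt

ls new_dir   # file_a.type file_g.type
```

Because the question guarantees unique file names, each find invocation copies at most one file; if that guarantee ever breaks, the last match found wins silently, so it is worth keeping in mind.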