I'm trying to fetch all files within all directories on our SAN. I'm starting with my local to test out how I want to do it. So, at my Documents directory:
ls -sR > documents_tree.txt
With just my local, that's fine. It gives the exact output I want. But since I'm doing it on our SAN, I'm going to have to compress on-the-fly, and I'm not sure the best way of doing this. So far I have:
ls -sR > documents_tree.txt | tar -cvzf documents_tree.tgz documents_tree.txt
When I try to check the output, I can't un-tar the file using tar -xvf documents_tree.tar after gunzipping it.
So, what is the correct way to compress on-the-fly? How can I accurately check my work? Will this work when performing the same process on a SAN?
You don't need to use tar to compress a single file; just use gzip:
ls -sR | gzip > documents_tree.txt.gz
You can then use gunzip documents_tree.txt.gz to uncompress it, or tools like gzcat and zless to view it without having to uncompress it first.
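Putting it together, here is a quick way to check your work before running the same thing against the SAN (documents_tree.txt.gz is just the example name from above; on Linux the viewer is usually zcat rather than gzcat):

```shell
# Build the listing and compress it in one step -- the pipe means the
# uncompressed listing never touches the disk
ls -sR | gzip > documents_tree.txt.gz

# Verify the compressed file's integrity without extracting it
gzip -t documents_tree.txt.gz && echo "archive OK"

# Peek at the first few lines without uncompressing to a file
# (zcat on Linux; gzcat on BSD/macOS)
gzip -dc documents_tree.txt.gz | head
```

Since gzip reads from stdin here, this same pipeline works unchanged on the SAN mount; only the directory you run it from differs.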