Tags: bash, shell, relative-path, symlink, absolute-path

Shell script for editing symlinks - absolute to relative


I need to fix some symlinks that are broken because they are absolute. I want to turn them into relative ones. To do that I need to count the directory depth of the path each link points at, and then replace those counted directories with ../, so that the link just goes up until it reaches the matching directory.
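As a rough sketch of that idea (the path value here is just an example, not one of my real links), counting the slashes and building a ../ prefix could look like this:

# rough sketch: count the directory depth of a path and build a matching ../ prefix
path="some/dir/tree/file"      # example path, just for illustration
slashes="${path//[^\/]}"       # keep only the slashes
depth=${#slashes}              # number of slashes = directory depth
prefix=""
while [ $depth -gt 0 ]
do
  prefix="$prefix../"          # one ../ per directory level to climb
  depth=`expr $depth - 1`
done
echo "$prefix"                 # -> ../../../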

I have already a line to find the broken links:

find -type l | while read f; do if [ ! -e "$f" ]; then ls -l "$f"; fi; done

This works, so I can find the broken ones, but I have no idea how to continue.
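(As a side note: with GNU find, broken links can also be found directly, because -xtype checks the type after following the link. This is a GNU extension, so it may not be available everywhere.)

# GNU find only: a symlink whose target is missing still matches -xtype l
find . -xtype l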

Edit: I found the following solution which works:

depthRoot=2 # Depth of root-directory
find -type l |
while read f
do
  if [ ! -e "$f" ] # broken link found (put in variable f)
  then
    echo "Link found: $f" # output of broken link
    previous="$(readlink "$f")" # target of broken link in variable previous
    echo "points to: $previous" # output of target

    y=$"${f//[^\/]}" # generating string of f with all the slashs
    depthFrom=${#y} # count length of y an put value in depthFrom
    depthFrom=`expr $depthFrom - 1`
    echo "Depth place = $depthFrom" # output of depth of place of broken link

    y=$"${previous//[^\/]}" # generating string of previous with all the slashs
    depthTarget=${#y} # count length of y an put value in depthTarget
    depthTarget=`expr $depthTarget - $depthRoot - 1` # subtraction of depth of root-directory
    echo "Depth target = $depthTarget" # output of depth of target of link
    depthRoot=`expr $depthRoot + 1` # root-directory depth + 1 (1 directory - 2 slashes)

    depth=`expr $depthFrom - $depthTarget` # calculate difference in depth
    echo "difference of depth = $depth" # output of differrence of depth

    while [ $depthRoot -gt 0 ] # while depthRoot > 0
    do
        depthRoot=`expr $depthRoot - 1` # decrement depthRoot by 1
        previous=${previous#*/} # strip everything in previous up to and including the first slash (from the left)
    done

    f=${f:2} # strip the leading ./ from f
    tofind="$(echo "$previous"| sed -e 's/[\/&]/\\&/g')" # escape subdirectories of previous
    tofind=${tofind%\\*} # erease everything on the right side until first backslash
    newlink="$(echo "$f"| sed "s/$tofind//g")" # erease existing side on the left
    newlink=${newlink:1}                
    while [ $depth -gt 0 ] # while depth > 0
    do
        newlinkaddition=$newlinkaddition"../" # generate ../ for the relative link
        newlink=${newlink#*/} # remove old directories which will be replaced by ../
        depth=`expr $depth - 1` # decrement depth
    done


    newlink=$newlinkaddition$newlink # generating target of new link
    rm "$f" # remove old link
    ln -s "$newlink" "$f" # generate new link
    echo "new link generated: $f -> $newlink" # output of successful generation of new link
  fi
done
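As an aside, if GNU coreutils is available, the ../ prefix does not have to be assembled by hand: once the corrected absolute target is known, realpath can compute the relative path. A sketch under that assumption (target stands for whatever corrected path the link should point to):

# sketch, assumes GNU coreutils; $f is the link, $target the corrected absolute path
# -m lets realpath work even if the path does not exist yet
rel="$(realpath -m --relative-to "$(dirname "$f")" "$target")"
rm "$f"
ln -s "$rel" "$f"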

Solution

  • The problem is that you need to know there is a common root shared by all the broken links.

    e.g. in my trial your code gave me this output

     lrwxrwxrwx 1 user group 10 Aug 10 17:31 ./a/c -> /tmp/old/c
    

    and in this case the common root is /tmp/old

    so that prefix has to be removed and replaced by a relative path to the new location, which is assumed to be the current directory

    oldroot=/tmp/old/    # needs / at the end!
    find -type l |
    while read f
    do
      if [ ! -e "$f" ]      # dead link found
      then
        f="${f#./}"                    # remove ^./ from find result
        echo "found link $f"
        previous="$(readlink "$f")"    # get old destination
        echo "pointing to $previous"
        previous="${previous#$oldroot}"   # remove oldroot
        newlink="$(echo "$f"|awk 'BEGIN{FS=OFS="/"}{for(i=1;i<NF;i++){printf "../"}}END{print "."}')"
        newlink="$(echo "$newlink/$previous"|sed 's,/\./,/,g')"
        echo "creating link $f -> $newlink"
        rm "$f"
        ln -s "$newlink" "$f"
      fi
    done
    

    This solution uses awk and sed. It may be possible to do this in pure bash as well, but I'm more comfortable using awk & sed in this case.
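
    For completeness, the ../-prefix generation can also be done with bash parameter expansion alone. A sketch, assuming the same f, previous and oldroot handling as in the loop above:

    # pure-bash variant of the awk/sed step; assumes $oldroot was already stripped from $previous
    dirs="${f%/*}"               # directory part of the link path, e.g. a/b
    if [ "$dirs" = "$f" ]        # no slash left: link sits in the current directory
    then
      newlink="$previous"
    else
      slashes="${dirs//[^\/]}"   # one slash per extra directory level
      up="../"
      while [ ${#slashes} -gt 0 ]
      do
        up="$up../"
        slashes="${slashes%?}"   # drop one slash per iteration
      done
      newlink="$up$previous"
    fi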