Tags: powershell, directory, data-migration, file-copying

Copying files from multiple (specified) folder paths to another directory while maintaining file structure


I am trying to copy multiple files from one directory to another with PowerShell. I would like to:

  • Maintain the folder/file structure.
  • Copy all files within specific folders.

Hypothetical structure:

Source Folder
    \User 1
        \Folder 1
            \Files
        \Folder 2
            \Files
        \Folder 3
            \Files
    \User 2
        \Folder 3
            \Files
    \User 3
        \Folder 2
            \Files
    \User 4
        \Folder 3
            \Files
        \Folder 4
            \Files

Possible Scenario:

  • I want to copy the contents of Folder 1 and Folder 2 for every user that has them.

Expected Result:

Destination Folder
    \User 1
        \Folder 1
            \Files
        \Folder 2
            \Files
    \User 3
        \Folder 2
            \Files

This is the code I have so far:

$FolderName = '\\Folder 1\\'   # regex pattern: matches '\Folder 1\' in a full path
$source = 'C:\CDPTest\Live'
$target = 'C:\CDPTest\DevTest'
$source_regex = [regex]::Escape($source)

# Find every file whose full path contains the folder name, recreate the
# directory structure under the target, then copy the file across
(gci $source -Recurse | where {-not ($_.PSIsContainer)} | select -Expand FullName) -match $FolderName |
    foreach {
        $file_dest = ($_ | Split-Path -Parent) -replace $source_regex, $target
        if (-not (Test-Path $file_dest)) {mkdir $file_dest | Out-Null}
        Copy-Item $_ -Destination $file_dest
    }

As you can see, with the current code the match will only return file paths for a single folder name; what I am trying to do is extend this to match several folder names.

What I have tried:

  • Running this code with a different FolderName in a separate PowerShell file with no success.
  • Using an array of folder names for matching.
  • Using the -and/-or operators to extend the match function.
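One way to extend the question's `-match` to several folder names is to join the escaped names into a single regex alternation. This is a minimal sketch, not the accepted solution: the function name and parameters are hypothetical, and the separator is taken from `[IO.Path]` so the same pattern logic works with the `C:\CDPTest` paths from the question.

```powershell
function Copy-MatchedFolders {
    param(
        [string[]] $FolderNames,   # e.g. 'Folder 1','Folder 2'
        [string]   $Source,
        [string]   $Target
    )
    $sourceRegex = [regex]::Escape($Source)
    $sep = [regex]::Escape([IO.Path]::DirectorySeparatorChar)

    # One alternation, e.g. '\\Folder\ 1\\|\\Folder\ 2\\' on Windows
    $namePattern = ($FolderNames | ForEach-Object { $sep + [regex]::Escape($_) + $sep }) -join '|'

    Get-ChildItem $Source -Recurse -File |
        Where-Object { $_.FullName -match $namePattern } |
        ForEach-Object {
            # Mirror the source subtree under the target, then copy the file
            $dest = (Split-Path $_.FullName -Parent) -replace $sourceRegex, $Target
            if (-not (Test-Path $dest)) { New-Item -ItemType Directory -Path $dest -Force | Out-Null }
            Copy-Item $_.FullName -Destination $dest
        }
}

# Usage with the question's paths (assumption):
# Copy-MatchedFolders -FolderNames 'Folder 1','Folder 2' `
#     -Source 'C:\CDPTest\Live' -Target 'C:\CDPTest\DevTest'
```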

Solution

  • Thanks to all of the replies to this question, you have all helped to point me in the right direction. This is the solution I went with:

    #Used for time-stamped logs (requires C:\Root\RobocopyLogs\ to exist)
    #$log can be appended after '$dest'
    #$currentDateTime = Get-Date -Format "MM.dd.yyyy-HH.mm.ss.fff"
    #$log = "/log:C:\Root\RobocopyLogs\$currentDateTime.txt"
    
    # Set up variables
    $sourceRootDirectory = "C:\Root\Source"
    $userDirectories = $sourceRootDirectory + "\*\"
    $dest = "C:\Root\Destination"
    # Folders to keep; everything else under each user directory is excluded
    $excludeExceptions = @("Folder 1", "Folder 2", "Folder 3", "Folder 4", "Folder 5")
    
    # Build the exclusion lists from the source: every item under a user
    # directory that is not in the keep list, with files split out separately
    $excludedFolderArray = (gci $userDirectories -Exclude $excludeExceptions)
    $excludedFileArray = $excludedFolderArray |
        Where-Object {$_.PSIsContainer -eq $False}
    
    # /FFT assume FAT file times, /MIR mirror the tree (includes deletions),
    # /XA:H skip hidden files, /R:1 /W:5 one retry with a 5 s wait,
    # /XD and /XF apply the folder and file exclusion lists built above
    Robocopy $sourceRootDirectory $dest /FFT /MIR /XA:H /R:1 /W:5 /XD $excludedFolderArray /XF $excludedFileArray
    

    I was hitting an issue when syncing with robocopy: if a file was placed directly in the root folder it would still be copied over, so I had to create a separate list of root-level files to be excluded.
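The root-file workaround in that last note can be sketched as a small helper that lists only the files sitting directly in the source root, to feed robocopy's `/XF` switch. This is a hypothetical fragment, not part of the accepted script; the paths mirror the solution above.

```powershell
function Get-RootLevelFiles {
    param([string] $Root)
    # Files directly under $Root (no recursion) - candidates for robocopy /XF
    Get-ChildItem $Root -File | Select-Object -ExpandProperty FullName
}

# Usage against the solution's paths (assumption):
# $rootFiles = Get-RootLevelFiles 'C:\Root\Source'
# Robocopy $sourceRootDirectory $dest /FFT /MIR /XF $rootFiles
```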