On a server running Windows Server 2012 R2, PowerShell version:
PS C:\Users\admin> $PSVersionTable.PSVersion
Major  Minor  Build  Revision
-----  -----  -----  --------
4      0      -1     -1
I need to delete files older than 180 days in a folder with a lot of subfolders. That is quite simple, BUT not when there are hundreds of thousands of files and the folder is about 800 GB. Using Get-ChildItem, which first reads all files recursively while checking the date and only then starts deleting them... well, it takes forever, and in the end the server ran out of memory.
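If I understand it right, the foreach statement evaluates the whole Get-ChildItem pipeline into an in-memory collection before the loop body ever runs, whereas piping into ForEach-Object handles each file as it is enumerated. A minimal sketch of the difference (C:\BigFolder is just a placeholder path):

# Buffers every FileInfo object in memory first, then iterates:
foreach ($i in Get-ChildItem C:\BigFolder -Recurse) { <# process $i #> }

# Streams: each item flows through the pipeline as it is found:
Get-ChildItem C:\BigFolder -Recurse | ForEach-Object { <# process $_ #> }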
So - anyone in the mood to help me speed up my code, which currently looks like this (the delete part)?
...
...
foreach ($i in Get-ChildItem $TargetFolder -Recurse -Exclude $skipFilePatterns |
         Where-Object { ! $_.PSIsContainer }) {
    if ($i.LastWriteTime -lt (Get-Date).AddDays(-$keepForDays)) {
        # Add -WhatIf to test the script, remove it when confirmed
        $timeStamp = $i.LastWriteTime
        $fullName = $i.FullName
        $log.Info("Deleting: $fullName with timestamp (LastWriteTime): $timeStamp")
        Remove-Item $i.FullName -Force -ErrorVariable errVar -ErrorAction SilentlyContinue
...
...
First, you can use Select -First $Limit so that each pass only handles a limited number of files. Second, there is no need to exclude folders via $i.PSIsContainer; you can just tell Get-ChildItem (alias gci) to only get files using the -File parameter. Something like this:
function Remove-ChildItemsInChunks($keepForDays, $Limit, $Path){
    $count = 0
    gci $Path -Recurse -File |
        ? { $_.LastWriteTime -lt (Get-Date).AddDays(-$keepForDays) } |
        select -First $Limit | % {
            $count += 1
            Remove-Item $_.FullName -Force
        }
    return $count
}
$GCICount = Remove-ChildItemsInChunks -Path C:\test -keepForDays 30 -Limit 500
while ($GCICount -gt 0) {
    $GCICount = Remove-ChildItemsInChunks -Path C:\test -keepForDays 30 -Limit 500
}