I am creating a file copy program that will copy a large number of files (~100,000), each around 50 KB in size, using the ROBOCOPY command.
For each file, I am creating a new process and passing the ROBOCOPY command and its arguments as follows:
using (Process p = new Process())
{
    p.StartInfo.FileName = "CMD.EXE";
    p.StartInfo.Arguments = string.Format("/C ROBOCOPY {0} {1} {2}",
        sourceDir, destinationDir, fileName);
    p.StartInfo.CreateNoWindow = true;
    p.StartInfo.UseShellExecute = false;
    p.Start();
    p.WaitForExit();
}
Instead of creating a process for each file, I am looking for a better approach that is good in terms of both performance and design. Can someone suggest a better method?
I would just use System.IO. Should be plenty fast enough, and your filename could be a wildcard.
using System.IO;

// snip your code... providing fileName, sourceDir, destinationDir

DirectoryInfo dirInfo = new DirectoryInfo(sourceDir);
FileInfo[] fileInfos = dirInfo.GetFiles(fileName); // fileName can include wildcards
foreach (FileInfo file in fileInfos)
{
    File.Copy(file.FullName, Path.Combine(destinationDir, file.Name), true); // overwrites existing
}