I'm trying to take a CSV (a tab-delimited .txt file) and change one of the columns from 'Last, First' to 'First Last'. I think I am nearly there with getting it into the right format for another program, but my exported file only has one column: the one with the name that I edited.
Import-CSV -Path import.txt -Delimiter "`t" -Header @("col0", "col1", "col2", "col3", "col4", "col5", "col6", "col7", "col8", "col9", "cal1", "cal2", "cal3", "cal4", "cal5") |
foreach{
$last,$first = $_.col3 -split ", "
new-object psobject -Property @{name = "$first $last"}
$_
} |
Export-CSV done.txt -Delimiter "`t" -NoTypeInformation
(Get-Content -Path done.txt ).Replace('"','') | Select-Object -Skip 1 |
Set-Content -Path done.txt
(gc done.txt) | ? {$_.trim() -ne "" } | set-content done.txt
I tried exporting to both txt and csv. The last two lines are more for formatting, but I have tried excluding them. There are also no headers, so I added them and took them off at the end. Without adding New-Object, it doesn't work at all. Thanks!
Modify the .col3 property of each input object directly - no need for New-Object.

In fact, it was the

new-object psobject -Property @{name = "$first $last"}

call that caused your problem: as the first output object, it locked in the columns for the Export-CSV call, which meant that only a name column was exported.

Use ConvertTo-Csv in lieu of Export-CSV to produce the CSV (TSV) data in memory, as a stream of lines, which allows you to skip the header line via Select-Object -Skip 1 and then write the remaining lines to your output file with Set-Content.
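To see the column-locking behavior in isolation, here is a minimal sketch (with made-up property names): when objects with differing properties are sent through the CSV cmdlets, the first object's properties define the columns, and properties that only later objects have are silently dropped.

```powershell
# The first object's properties define the CSV columns;
# the second object's col3/col4 properties never make it
# into the output - only a "name" column is emitted.
[pscustomobject] @{ name = 'First Last' },
[pscustomobject] @{ col3 = 'Last, First'; col4 = 'x' } |
  ConvertTo-Csv -NoTypeInformation
```

This is exactly what happened in your pipeline: the New-Object output came first, so Export-CSV only ever saw a name column.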
Import-CSV -Path import.txt -Delimiter "`t" -Header col0, col1, col2, col3, col4, col5, col6, col7, col8, col9, cal1, cal2, cal3, cal4, cal5 |
ForEach-Object{
$last, $first = $_.col3 -split ", "
$_.col3 = "$first $last"
$_
} |
ConvertTo-Csv -Delimiter "`t" -NoTypeInformation |
Select-Object -Skip 1 |
Set-Content -Encoding utf8 done.txt
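As an aside, the name swap relies on PowerShell's multiple-variable assignment combined with -split, which returns an array. A standalone sanity check with a sample value:

```powershell
# -split ", " returns a 2-element array for "Last, First";
# multiple assignment unpacks it into $last and $first.
$last, $first = 'Doe, Jane' -split ', '
"$first $last"   # -> Jane Doe
```

Note that if a field ever contains more than one ", " (or none), the destructuring behaves differently ($first would capture the remaining elements as an array, or be empty), so this assumes consistently formatted 'Last, First' input.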
Note:

I've added -Encoding utf8 to the Set-Content call as a reminder that you may want to control the character encoding explicitly, given that Set-Content uses the system's active legacy ANSI code page by default. Note that -Encoding utf8 invariably produces UTF-8 files with a BOM in Windows PowerShell.
In Windows PowerShell, you get no control over whether the fields in the output CSV (TSV) data are double-quoted or not - they all are, but for well-behaved CSV / TSV readers that shouldn't make a difference.
In PowerShell (Core) 7+, by contrast, you do get control over quoting in ConvertTo-Csv and Export-Csv, via the -UseQuotes and -QuoteFields parameters. These cmdlets now also support a -NoHeader switch to avoid emitting / writing a header row, and the use of -NoTypeInformation is no longer necessary. Also note that in PowerShell 7+ -Encoding utf8 produces BOM-less UTF-8 files; to create a file with a BOM there, you'd use -Encoding utf8BOM.
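For illustration, a quick sketch of the PowerShell 7+ quoting control (using a throwaway object): -UseQuotes AsNeeded quotes only fields that actually require it, i.e. those containing the delimiter, a double quote, or a newline.

```powershell
# PS 7+ only: quote only the fields that need it.
# 'has, comma' contains the (default) comma delimiter,
# so it is quoted; 'plain' and the headers are not.
[pscustomobject] @{ a = 'plain'; b = 'has, comma' } |
  ConvertTo-Csv -UseQuotes AsNeeded
```

-UseQuotes Never, as used below, suppresses quoting entirely, which is appropriate here since tab-separated name data is unlikely to contain embedded tabs.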
Therefore, in PowerShell (Core) 7+ the code can be simplified to the following:
# PS 7+ only
Import-CSV -Path import.txt -Delimiter "`t" -Header col0, col1, col2, col3, col4, col5, col6, col7, col8, col9, cal1, cal2, cal3, cal4, cal5 |
ForEach-Object{
$last, $first = $_.col3 -split ", "
$_.col3 = "$first $last"
$_
} |
Export-Csv done.txt -Delimiter "`t" -UseQuotes Never -NoHeader
The above omits double-quoting from the fields and creates a BOM-less UTF-8 output file.