I recorded a text file that includes some Unicode characters, e.g. "degree sign" (\u00b0) and "superscript two" (\u00b2).
I then want to read this text file with a C# StreamReader, but those Unicode characters are not read properly.
The text file contains lines like the following:
26,VehicleData Acceleration Z,m/s²,System.Single
27,VehicleData Angular Velocity about X,°/s,System.Single
Data Read section:
// Attempt 1:
StreamReader indexReader = File.OpenText( filename + ".txt" );
// Attempt 2:
StreamReader indexReader = new StreamReader( filename + ".txt", System.Text.Encoding.Unicode );
...
Data Assign section:
for ( int i = 0; i < headerCount; i++ )
{
    string line = indexReader.ReadLine();
    string[] parameterHeader = line.Split( ',' );
    var next = new ReportParameters.ParameterInfoElement();
    next.parameterID = Int32.Parse( parameterHeader[ 0 ] );
    next.name = parameterHeader[ 1 ];
    next.units = parameterHeader[ 2 ];
    next.type = Type.GetType( parameterHeader[ 3 ] );
    _header.Add( next );
}
m/s² and °/s are read back as m/s� and �/s.
How can I read these characters properly?
The key thing here is to pass the correct Encoding to the reader; note that System.Text.Encoding.Unicode means UTF-16, not UTF-8, so it is the wrong choice for a UTF-8 file. Since you say the file is UTF-8:
/* write a dummy file as raw UTF-8; this is just test data that looks like:
1°
2²
3
*/
File.WriteAllBytes("test.txt", new byte[] {
    0x31, 0xC2, 0xB0, 0x0D, 0x0A,           // "1°\r\n" (0xC2 0xB0 is ° in UTF-8)
    0x32, 0xC2, 0xB2, 0x0D, 0x0A, 0x33 });  // "2²\r\n3" (0xC2 0xB2 is ² in UTF-8)
// use the TextReader API to consume the file
using (var reader = new StreamReader("test.txt", Encoding.UTF8))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        Console.WriteLine(line);
    }
}
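With Encoding.UTF8 passed in, this reads back the three lines as 1°, 2² and 3, with the non-ASCII characters intact.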
Note, however, that it is easier to use foreach with File.ReadLines("test.txt", Encoding.UTF8):
foreach (var line in File.ReadLines("test.txt", Encoding.UTF8))
{
    Console.WriteLine(line);
}
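Applied to your own code, the only change needed is the encoding you pass when constructing the reader; the rest of your loop can stay as it is (a minimal sketch, assuming the file really is UTF-8):

// UTF-8, not System.Text.Encoding.Unicode (which is UTF-16)
StreamReader indexReader = new StreamReader( filename + ".txt", System.Text.Encoding.UTF8 );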