var value = "0".PadRight('4', '0');
Console.WriteLine(value);
The result, in both my project and .NET Fiddle, is:
0000000000000000000000000000000000000000000000000000
Can someone explain why?
The first parameter of String.PadRight is an int, not a char:

public string PadRight(int totalWidth, char paddingChar);
In this line:

var value = "0".PadRight('4', '0');

the char '4' is implicitly converted to an int using its character code (its ASCII/UTF-16 value), which is 52. The call therefore pads the string to a total width of 52 characters, producing 52 zeroes.
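A small sketch illustrating both behaviors, the accidental char-to-int widening and the intended call with an int literal:

```csharp
using System;

class Program
{
    static void Main()
    {
        // The char '4' is implicitly widened to its UTF-16 code unit value, 52.
        Console.WriteLine((int)'4');

        // So this call pads "0" out to a total width of 52 characters:
        var wrong = "0".PadRight('4', '0');
        Console.WriteLine(wrong.Length);   // 52

        // Passing the int 4 directly gives the likely intent:
        var right = "0".PadRight(4, '0');
        Console.WriteLine(right);          // 0000
    }
}
```

Because C# defines an implicit conversion from char to int, the compiler accepts '4' for the totalWidth parameter without any warning, which is why the mistake is easy to miss.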