var value = "0".PadRight('4', '0');
Console.WriteLine(value);
The result of this is 0000000000000000000000000000000000000000000000000000
in both my project and .NET Fiddle.
Can someone explain why?
The first argument of String.PadRight is an int, not a char.
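For reference, the two overloads on System.String are (signatures as documented for .NET):
public string PadRight(int totalWidth);
public string PadRight(int totalWidth, char paddingChar);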
In this line:
var value = "0".PadRight('4', '0');
the character '4' is implicitly converted to an int using its character code (its ASCII value), which is 52, so the call is equivalent to "0".PadRight(52, '0'). The result is a string of 52 zeroes: the original "0" plus 51 padding zeroes.
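A short sketch of what happens and of the call that was presumably intended (this example is mine, not from the original post):

Console.WriteLine((int)'4');                      // 52: the character code of '4'
Console.WriteLine("0".PadRight('4', '0').Length); // 52: same as PadRight(52, '0')
Console.WriteLine("0".PadRight(4, '0'));          // prints 0000, the likely intent

Because there is no PadRight(char, char) overload, the compiler silently applies the implicit char-to-int conversion instead of reporting an error.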