Implicit conversion from int to char
A char is 1 byte in C++, with range -128 to 127 (signed) or 0 to 255 (unsigned), so 123456 can't be stored in it. That much I understand, but how does the compiler convert 123456 to the char 64, whose ASCII value is "@"?
I expected it to give an error, or to convert it some other way, but I can't understand this conversion.