Computers work with binary, but humans use decimal. So to display the value of a number stored in memory, a conversion is needed. That made me wonder how a computer displays a decimal number; maybe it has a table for the conversion.
For instance, on an x86 system: 1 => 1, 10 => 2, ..., 1 0110 => 22, ..., 1111 1111 1111 1111 1111 1111 1111 1111 => 2^32 - 1; all of it wired into the hardware.
Or it may build the number as a string, like this: string_sum("123", "4567") returns 4690, where each string is itself produced from the binary value with code like the one below.
A sample code:
...
int i = 12, count = 0;
char num[16] = {0};   /* accumulated decimal string */
char tmp[16];
while (i) {
    /* value contributed by the current bit: (i % 2) * 2^count */
    sprintf(tmp, "%d", (i % 2) << count);
    string_sum(num, tmp);   /* decimal string addition */
    i >>= 1;
    ++count;
}
...
How is that rendered on a CLI console, a terminal, or a graphical interface? I couldn't find an algorithm describing how the computer displays a decimal number.