I have this piece of code:

#include <stdio.h>
#include <stdlib.h>  /* needed for system() */

int main(void)
{
    int i = 140;
    char c = i;    /* 140 is out of range if plain char is signed, as it is here */
    int j = c;
    printf("%d\n", j);
    system("pause");
    return 0;
}

Output:

-116

As I understand it, char c = i; assigns the character with code 140 to c, but I wonder what happens in the assignment int j = c; after that, and where the value -116 comes from.

If I change char c = i; to unsigned char c = i;, then the output is 140. Is there any way to get j equal to 140 without using unsigned here? (Assume that 0 <= i <= 255.)

Pete

1 Answer

The range of values for an unsigned char is [0, 255] inclusive, but the range for a signed char is [-128, 127] inclusive (assuming the usual 8-bit char).
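
You can check the exact ranges on your own platform with <limits.h> (a quick sketch; note that whether plain char is signed or unsigned is itself implementation-defined):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    printf("signed char:   [%d, %d]\n", SCHAR_MIN, SCHAR_MAX);
    printf("unsigned char: [0, %d]\n", UCHAR_MAX);
    printf("plain char:    [%d, %d]\n", CHAR_MIN, CHAR_MAX);
    return 0;
}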

Your value of 140 is outside the range of signed char, so it wraps around to -116 (140 - 256 = -116). Strictly speaking, converting an out-of-range value to a signed integer type is implementation-defined behaviour rather than the undefined behaviour of signed arithmetic overflow, but on ordinary two's-complement platforms you get exactly this wrap-around, and assigning the char back to an int then sign-extends it, preserving -116.
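
To see where -116 comes from, here's a minimal sketch that reads the same bit pattern both ways (assuming the usual 8-bit, two's-complement char):

#include <stdio.h>

int main(void)
{
    unsigned char bits = 140;                /* the bit pattern 0x8C */
    signed char reread = (signed char)bits;  /* out-of-range conversion: implementation-defined,
                                                but -116 on two's-complement platforms */
    printf("%d as signed char: %d (140 - 256 = %d)\n", bits, reread, 140 - 256);
    return 0;
}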

There's not really any way to avoid this without using unsigned char or a type wider than char: you're simply trying to represent a value that can't be represented by that type.
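
For example, with unsigned char the value round-trips intact (a small sketch of the fix, matching what you already observed):

#include <stdio.h>

int main(void)
{
    int i = 140;
    unsigned char c = i;  /* 140 fits in [0, 255], so no wrap */
    int j = c;            /* no sign extension, so j == 140 */
    printf("%d\n", j);    /* prints 140 */
    return 0;
}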

hnefatl