For a byte, positive 2 should be equal to 0000 0010.
I read somewhere that for a signed number, the most significant bit is used to determine whether the number is positive or negative.
So why doesn't -2 equal 1000 0010?
I already checked in a calculator, and -2 equals 1111 1110.
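For reference, here is the same check in Python (this is just how I reproduced what the calculator shows; masking with 0xFF views the low byte of a negative number):

```python
# Format +2 and -2 as 8-bit binary strings.
# Masking with 0xFF keeps only the lowest byte, which is how
# the calculator displays a signed byte.
print(format(2 & 0xFF, "08b"))   # 00000010
print(format(-2 & 0xFF, "08b"))  # 11111110, not 10000010
```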
Thanks in advance.