Modern computers store and process information represented as two-valued signals.
The three most important representations of numbers:
- Unsigned: encodings based on traditional binary notation, representing numbers greater than or equal to 0.
- Two's-complement: encodings are the most common way to represent signed integers, that is, numbers that may be either positive or negative.
- Floating-point: encodings are a base-two version of scientific notation for representing real numbers.
sections: