## #1,078 – The Hexadecimal Numeral System

Humans normally represent numeric values using the decimal (base 10) numeral system.  We can also represent numbers as hexadecimal, or base 16.

The hexadecimal numeral system uses 16 different digits (numeric symbols), rather than 10.  You represent a numerical value using a string of these hexadecimal (or hex) digits.  The 16 digits used are 0-9 and the letters A-F.  In decimal terms, A represents 10, B represents 11, and so on up to F, which represents 15.  Hexadecimal constants are often written with the prefix “0x”, followed by the digits.

Moving from right to left, the digits in a base 16 number represent increasing powers of 16 (1, 16, 256, 4096, etc., or 16^0, 16^1, 16^2, 16^3, etc.).  The value of a number represented as hexadecimal is the sum of the value of each digit multiplied by the appropriate power of 16.  For example, the hexadecimal number 0x3A8E is equivalent to the decimal number 14,990: (3 x 16^3) + (10 x 16^2) + (8 x 16^1) + (14 x 16^0) = 12,288 + 2,560 + 128 + 14 = 14,990.

## #990 – Converting Hexadecimal Strings to Numeric Data

You can convert a numeric value to its equivalent hex string using the “X” (hexadecimal) format specifier.  To convert in the other direction, from a hex string to its corresponding integer value, you use the int.Parse method, passing NumberStyles.AllowHexSpecifier as the second parameter.

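The first direction, from a number to a hex string, can be sketched with the standard “X” format specifier (the variable names below are just for illustration):

```
using System;

class HexToStringExample
{
    static void Main()
    {
        int value = 14990;

        // "X" produces uppercase hex digits; "x" produces lowercase.
        string hex = value.ToString("X");       // "3A8E"

        // A digit count after the specifier pads with leading zeros.
        string padded = value.ToString("X8");   // "00003A8E"

        Console.WriteLine(hex);
        Console.WriteLine(padded);
    }
}
```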
```
// using System.Globalization
int n = int.Parse("A", NumberStyles.AllowHexSpecifier);
int n2 = int.Parse("9D", NumberStyles.AllowHexSpecifier);
int n3 = int.Parse("400", NumberStyles.AllowHexSpecifier);
int n4 = int.Parse("1b3f", NumberStyles.AllowHexSpecifier);
int n5 = int.Parse("000b", NumberStyles.AllowHexSpecifier);
```

In C#, integer literals are normally specified using base-10 notation (e.g. 123), but they can also be specified as a base-16 (hexadecimal or hex) number.

Each hex digit represents a 4-bit value and can therefore represent a value in the range [0,15].  Values from 0-9 are represented by their decimal digits.  Values from 10-15 are represented by the hex digits A-F.

In C#, hex literals begin with the characters “0x”.

Each hex digit represents a value to be multiplied by a power of 16.

Example: 0x1A2F = (1 x 16^3) + (10 x 16^2) + (2 x 16^1) + (15 x 16^0) = 4096 + 2560 + 32 + 15 = 6703
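The expansion above can be checked directly in code (a small sketch):

```
using System;

class HexValueExample
{
    static void Main()
    {
        // Sum each digit of 0x1A2F times its power of 16.
        int expanded = (1 * 4096) + (10 * 256) + (2 * 16) + (15 * 1);

        Console.WriteLine(expanded);            // 6703
        Console.WriteLine(expanded == 0x1A2F);  // True
    }
}
```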

You can also think of each hex digit as representing four bits:

0 = 0000
1 = 0001
2 = 0010
...
E = 1110
F = 1111

So 0x1A2F would be:  0001 1010 0010 1111
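The four-bit grouping can be demonstrated by masking and shifting out each nibble; this small sketch (not from the original post) prints the binary groups shown above:

```
using System;

class NibbleExample
{
    static void Main()
    {
        int n = 0x1A2F;

        // Extract each 4-bit group (nibble), from most to least significant.
        for (int shift = 12; shift >= 0; shift -= 4)
        {
            int nibble = (n >> shift) & 0xF;
            Console.Write(Convert.ToString(nibble, 2).PadLeft(4, '0') + " ");
        }
        Console.WriteLine();  // prints: 0001 1010 0010 1111
    }
}
```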

In C#, you can use hex numbers for integer literals.

```
int n = 0x1A2F;
ushort u1 = 0xFFFF;         // 16 bits
uint u2 = 0x12341234;       // 32 bits
```

Hex numbers are a convenient way of expressing integral values, denoting exactly the bits stored in memory for that integer.
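For example, hex literals are a natural fit for bit flags, where each digit maps directly onto the bits being set or tested.  The flag names and values below are made up purely for illustration:

```
using System;

class BitFlagExample
{
    // Each flag occupies one bit; the hex literal shows which one.
    const uint ReadFlag  = 0x1;   // binary 0001
    const uint WriteFlag = 0x2;   // binary 0010
    const uint ExecFlag  = 0x4;   // binary 0100

    static void Main()
    {
        uint permissions = ReadFlag | WriteFlag;   // 0x3

        Console.WriteLine((permissions & ReadFlag) != 0);    // True
        Console.WriteLine((permissions & ExecFlag) != 0);    // False
        Console.WriteLine("0x" + permissions.ToString("X")); // 0x3
    }
}
```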