#72 – Hexadecimal Numbers

In C#, integer literals are normally written in base-10 notation (e.g. 123), but they can also be written as base-16 (hexadecimal, or hex) numbers.

Each hex digit represents a 4-bit value and can therefore take on a value in the range [0,15]. Values from 0-9 are represented by the decimal digits 0-9; values from 10-15 are represented by the letters A-F.

In C#, hex literals begin with the characters “0x” (or “0X”), and the digits A-F may be written in either upper- or lowercase.

Each hex digit represents a value to be multiplied by a power of 16.

Example: 0x1A2F = (1 × 16³) + (10 × 16²) + (2 × 16¹) + (15 × 16⁰) = 4096 + 2560 + 32 + 15 = 6703
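As a quick check, you can compute the same value in code; Convert.ToInt32 accepts a base argument, so it can also parse the hex string directly (a small sketch, assuming Console and Convert from the System namespace):

 int fromLiteral = 0x1A2F;                                        // 6703
 int expanded = (1 * 4096) + (10 * 256) + (2 * 16) + (15 * 1);    // 6703
 int parsed = Convert.ToInt32("1A2F", 16);                        // 6703
 Console.WriteLine(fromLiteral == expanded);    // True
 Console.WriteLine(fromLiteral == parsed);      // True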

You can also think of each hex digit as representing four bits:

0 = 0000
1 = 0001
2 = 0010
...
E = 1110
F = 1111

So 0x1A2F would be:  0001 1010 0010 1111
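You can verify this from code; Convert.ToString accepts a base of 2 and renders an integer as a binary string (a short sketch, padded to 16 bits for readability):

 int n = 0x1A2F;
 string bits = Convert.ToString(n, 2).PadLeft(16, '0');
 Console.WriteLine(bits);    // 0001101000101111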

In C#, you can use hex numbers anywhere an integer literal is allowed:

 int n = 0x1A2F;             // 32 bits (value 6703)
 ushort u1 = 0xFFFF;         // 16 bits
 uint u2 = 0x12341234;       // 32 bits

Hex numbers are a convenient way of expressing integral values, denoting exactly the bits stored in memory for that integer.
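Going the other direction, the standard “X” numeric format specifier renders an integer as hex, e.g.:

 int n = 6703;
 Console.WriteLine(n.ToString("X"));               // 1A2F
 Console.WriteLine(string.Format("0x{0:X4}", n));  // 0x1A2F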
