programming: ASCII

The American Standard Code for Information Interchange (ASCII) is a “character encoding standard for electronic communication” (Wikipedia). The basic premise behind ASCII is that every character has its own unique numeric code. The character 'A', for example, has an ASCII value of 65, while the character 'a' has an ASCII value of 97. ASCII itself defines only 128 codes (7 bits), so every code fits comfortably in a single byte. An ASCII table lists the value mapping for each character.

Since ASCII codes fit in a single byte, it is typical to store ASCII characters in either the char or the int8_t data type. When using the debugger, you will see that a char or int8_t does not store the character itself; instead, it stores the ASCII code of the character -- a binary number. As such, you can do arithmetic on the variable and change the character it represents.

char myChar = 'A'; // myChar would show up as 0x41 in the debugger

myChar += 5; // myChar would show up as 0x46 in the debugger

// myChar now represents the character 'F'

Digits are also characters with their own ASCII codes. However, it is important to note that 0 is NOT the same as '0'.

char firstChar = 0; // this represents the NULL character -- 0x00 in the debugger

char secondChar = '0'; // this represents the 0 character -- 0x30 in the debugger

Knowing this, we can convert a single digit of a number into its character form with some arithmetic.

int myNumber = 4;

char myChar = myNumber + '0';

// the following would also have the same effect:

// char myChar = myNumber + 48;

This works because '0' and 48 are the same value: 48 is the ASCII code for the character '0'. In the end, characters are just numbers.