Up to this point, we’ve mostly used variables of type int, float, and double. In this chapter, we’ll focus on the char type. To show you how to work with characters, we assume that the underlying character set is the most popular one, the 7-bit ASCII (American Standard Code for Information Interchange) code. As you can see in Appendix B, ordinary characters, such as letters and digits, are represented by integers from 0 to 127.
Since a character in the ASCII set is represented by an integer between 0 and 127, we can use the char type to store its value. When a character is stored in a variable, it is the character’s ASCII value that is actually stored. For example, in the statement
ch = 'c';
the value stored in ch is the ASCII code of the character 'c', that is, 99.