The char type (see Table 6.2 of Chapter 6) is used to represent Unicode characters, each taking up 16 bits of memory. Unicode is an international, standardized character set enabling computers to represent the characters found in most human languages. For more information about Unicode, see Appendix E, “Unicode Character Set.”
A char literal can be
A standard letter enclosed in single quotes, such as the lowercase 'a' or the uppercase 'E'.
You could accordingly assign the character 'T' to a variable, as shown in the following:
char myChar;
myChar = 'T';
A single digit enclosed in single quotes, such as '4'. Notice that to the C# compiler, a single-digit character is not a number but just another character, and so it cannot participate in any arithmetic ...
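To make the digit-versus-number distinction concrete, the following short C# sketch (the class name CharDemo and the use of char.GetNumericValue are illustrative choices, not part of the text above) shows that the character '4' carries a Unicode code value rather than the numeric value 4:

```csharp
using System;

class CharDemo
{
    static void Main()
    {
        char letter = 'T';  // a standard letter in single quotes
        char digit  = '4';  // a digit character, not the number 4

        Console.WriteLine(letter);

        // Casting reveals the character's Unicode code value:
        // '4' is code point 52, not the number 4.
        Console.WriteLine((int)digit);

        // To obtain the numeric value a digit character represents,
        // convert it explicitly, e.g. with char.GetNumericValue.
        Console.WriteLine(char.GetNumericValue(digit));
    }
}
```

Here the cast prints 52 (the Unicode value of '4'), while char.GetNumericValue returns 4, the number the character depicts.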