In Java, 'char' is a primitive data type that represents a single 16-bit UTF-16 code unit. Each value is an unsigned integer in the range 0 to 65,535 (written '\u0000' to '\uffff'), so a char always occupies exactly 16 bits. The Unicode standard itself extends beyond this range, up to U+10FFFF; characters outside the Basic Multilingual Plane cannot fit in a single char and are instead represented in Java as a surrogate pair of two char values.
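A minimal sketch illustrating these points (the class name CharDemo is arbitrary; the calls used are standard java.lang APIs):

    public class CharDemo {
        public static void main(String[] args) {
            char letter = 'A';             // a single UTF-16 code unit
            int code = letter;             // implicit widening: char -> int
            System.out.println(code);      // prints 65

            char max = '\uffff';           // upper bound of the char range
            System.out.println((int) max); // prints 65535

            // Characters beyond U+FFFF need a surrogate pair of two chars:
            String smiley = "\uD83D\uDE00";           // U+1F600 GRINNING FACE
            System.out.println(smiley.length());      // 2 (char code units)
            System.out.println(
                smiley.codePointCount(0, smiley.length())); // 1 (code point)
        }
    }

Note how the string holding one supplementary character reports a length of 2: String.length() counts char code units, not Unicode code points, which is a direct consequence of char being fixed at 16 bits.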