Basic and Advanced Java Question:

How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters in Java Programming?

Answer:

In Java, a `char` is a 16-bit UTF-16 code unit, so Unicode characters in the Basic Multilingual Plane require 16 bits. ASCII requires only 7 bits, although ASCII characters are usually stored in 8 bits (one byte). UTF-8 is a variable-width encoding that represents characters using 8-, 16-, 24-, or 32-bit patterns (1 to 4 bytes). UTF-16 uses a single 16-bit unit for most characters and a pair of 16-bit units (a surrogate pair, 32 bits) for supplementary characters outside the BMP.
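A minimal sketch illustrating these sizes, using the standard `String.getBytes` and `StandardCharsets` APIs (the sample characters chosen here are for illustration only):

```java
import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        String ascii = "A";            // U+0041: fits in 7 bits
        String euro = "\u20AC";        // U+20AC EURO SIGN: needs 3 bytes in UTF-8
        String emoji = "\uD83D\uDE00"; // U+1F600: outside the BMP, a surrogate pair

        // UTF-8 uses 1 to 4 bytes per character
        System.out.println(ascii.getBytes(StandardCharsets.UTF_8).length);    // 1
        System.out.println(euro.getBytes(StandardCharsets.UTF_8).length);     // 3
        System.out.println(emoji.getBytes(StandardCharsets.UTF_8).length);    // 4

        // UTF-16 uses one or two 16-bit units per character
        System.out.println(emoji.getBytes(StandardCharsets.UTF_16BE).length); // 4 (two 16-bit units)

        // String.length() counts 16-bit char units, not characters
        System.out.println(ascii.length()); // 1
        System.out.println(emoji.length()); // 2 (surrogate pair)
    }
}
```

Note that `String.length()` returns the number of 16-bit `char` units, which is why a single supplementary character such as U+1F600 reports a length of 2.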

