Java Developer Question:

How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters in Java Programming?

Answer:

Unicode requires 16 bits in Java (the char type is a 16-bit UTF-16 code unit), while ASCII requires only 7 bits. Although the ASCII character set uses only 7 bits, it is usually stored in 8 bits. UTF-8 represents characters using 8-, 16-, 24-, or 32-bit patterns, i.e. 1 to 4 bytes per character. UTF-16 uses 16-bit patterns, and larger 32-bit patterns (surrogate pairs) for supplementary characters.
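The short sketch below (class name CharSizes is just for illustration) prints these sizes using standard Java APIs, assuming you compile and run it as an ordinary main class:

import java.nio.charset.StandardCharsets;

public class CharSizes {
    public static void main(String[] args) {
        // Java's char type is a 16-bit UTF-16 code unit.
        System.out.println("Bits in a Java char: " + Character.SIZE); // 16

        // UTF-8 uses 1 byte (8 bits) for ASCII characters...
        System.out.println("'A' in UTF-8: "
                + "A".getBytes(StandardCharsets.UTF_8).length + " byte(s)"); // 1

        // ...2 bytes (16 bits) for many accented characters...
        System.out.println("'é' in UTF-8: "
                + "é".getBytes(StandardCharsets.UTF_8).length + " byte(s)"); // 2

        // ...and up to 4 bytes (32 bits) for supplementary characters,
        // which also need two 16-bit chars (a surrogate pair) in UTF-16.
        String emoji = "\uD83D\uDE00"; // U+1F600
        System.out.println("Emoji in UTF-8: "
                + emoji.getBytes(StandardCharsets.UTF_8).length + " byte(s)"); // 4
        System.out.println("Emoji in UTF-16 chars: " + emoji.length()); // 2
    }
}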

