Jasper Reports Developer Question:

How many bits are used to represent Unicode, ASCII, UTF-16, and UTF-8 characters in Java Programming?

Answer:

In Java, a char is a 16-bit UTF-16 code unit, so each char holds 16 bits. ASCII requires only 7 bits, although it is usually stored in 8 bits. UTF-8 is a variable-width encoding that represents characters using 8-, 16-, 24-, or 32-bit patterns (1 to 4 bytes). UTF-16 uses 16-bit code units, with characters outside the Basic Multilingual Plane encoded as 32-bit surrogate pairs.
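These sizes can be verified with `String.getBytes(Charset)`. A minimal sketch (the class name `EncodingSizes` and the sample characters are illustrative, not from the original answer):

```java
import java.nio.charset.StandardCharsets;

public class EncodingSizes {
    public static void main(String[] args) {
        // 'A' (U+0041) fits in ASCII, 'é' (U+00E9) needs 2 bytes in
        // UTF-8, and '€' (U+20AC) needs 3 bytes in UTF-8; all three
        // fit in a single 16-bit UTF-16 code unit.
        String[] samples = {"A", "é", "€"};
        for (String s : samples) {
            int utf8Bits = s.getBytes(StandardCharsets.UTF_8).length * 8;
            int utf16Bits = s.getBytes(StandardCharsets.UTF_16BE).length * 8;
            System.out.println(s + ": UTF-8 uses " + utf8Bits
                    + " bits, UTF-16 uses " + utf16Bits + " bits");
        }
    }
}
```

Running this shows the variable width of UTF-8 (8, 16, and 24 bits for the three samples) against the fixed 16-bit code units of UTF-16 for characters in the Basic Multilingual Plane.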

