How does it matter if a character is 8-bit, 16-bit or 32-bit?
Posted by vin on Programmers
Published on 2012-07-23T11:40:54Z
character-encoding | unicode
Well, I am reading Programming Windows with MFC, and I came across Unicode and ASCII character encodings. I understand the point of using Unicode over ASCII, but what I do not get is how and why it matters whether a character is 8-bit, 16-bit, or 32-bit. What good does the width do for the system? How does the operating system's processing differ for characters of different widths?
My question here is: what does it mean for a character to be an x-bit character?
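To make the question concrete, here is a minimal C++ sketch (not from the original post) illustrating what the width of a character type means in practice: the same five-character text stored as 8-bit char (UTF-8), wchar_t (16-bit on Windows, typically 32-bit on Unix-like systems), and 32-bit char32_t (UTF-32) takes up different numbers of code units and bytes. The string contents and type choices are assumptions for illustration only.

```cpp
#include <iostream>
#include <string>

int main() {
    // The same text "héllo" stored with three different character widths.
    std::string    s8  = "h\xC3\xA9llo";  // explicit UTF-8 bytes, 8-bit code units
    std::wstring   s16 = L"h\u00E9llo";   // wchar_t code units (16-bit UTF-16 on Windows)
    std::u32string s32 = U"h\u00E9llo";   // 32-bit code units (UTF-32)

    // Size of one code unit for each character type.
    std::cout << "sizeof(char)     = " << sizeof(char)     << " byte(s)\n";
    std::cout << "sizeof(wchar_t)  = " << sizeof(wchar_t)  << " byte(s)\n";
    std::cout << "sizeof(char32_t) = " << sizeof(char32_t) << " byte(s)\n";

    // Five user-visible characters, but different code-unit counts:
    // 'é' needs two bytes in UTF-8 but only one code unit in UTF-16 or UTF-32.
    std::cout << "UTF-8 length  : " << s8.size()  << " code units\n";
    std::cout << "UTF-16 length : " << s16.size() << " code units\n";
    std::cout << "UTF-32 length : " << s32.size() << " code units\n";
    return 0;
}
```

On a Windows build this would report 1-, 2- and 4-byte code units and lengths of 6, 5 and 5 code units respectively, which is one way of seeing why "8-bit vs. 16-bit vs. 32-bit character" matters to how text is stored and processed.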