01 Dec, 2017

1 commit


10 Oct, 2017

1 commit

  • The constant MAX_UTF8_PER_UTF16 is used to calculate
    required memory when converting from UTF-16 to UTF-8.
    If this constant is too big, we waste memory.

    A code point encoded by one UTF-16 symbol is converted to a
    maximum of three UTF-8 symbols, e.g.

    0xffff could be encoded as 0xef 0xbf 0xbf.
    The first byte carries four bits, the second and third bytes
    carry six bits each.

    A code point encoded by two UTF-16 symbols is converted to four
    UTF-8 symbols.

    So in this case we need a maximum of two UTF-8 symbols per
    UTF-16 symbol.

    As the overall maximum is three UTF-8 symbols per UTF-16 symbol
    we need MAX_UTF8_PER_UTF16 = 3 (see the sketch after this entry).

    Fixes: 78178bb0c9d ("lib: add some utf16 handling helpers")
    Signed-off-by: Heinrich Schuchardt
    Signed-off-by: Alexander Graf

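The reasoning above can be checked with a small standalone sketch. This is not the U-Boot code itself; only the constant name MAX_UTF8_PER_UTF16 comes from the commit, while the helpers utf8_bound() and utf8_len() are made up for illustration:

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/*
 * Worst-case UTF-8 output per UTF-16 code unit:
 * - BMP code points (one UTF-16 code unit) need at most 3 UTF-8 bytes,
 *   e.g. U+FFFF -> 0xef 0xbf 0xbf.
 * - Code points above the BMP use a surrogate pair (two UTF-16 code
 *   units) and 4 UTF-8 bytes, i.e. only 2 UTF-8 bytes per UTF-16 unit.
 */
#define MAX_UTF8_PER_UTF16 3

/* Upper bound on UTF-8 bytes needed for u16_len UTF-16 code units. */
static size_t utf8_bound(size_t u16_len)
{
        return u16_len * MAX_UTF8_PER_UTF16;
}

/* Actual UTF-8 length of a single code point cp (cp <= 0x10ffff). */
static size_t utf8_len(uint32_t cp)
{
        if (cp < 0x80)
                return 1;
        if (cp < 0x800)
                return 2;
        if (cp < 0x10000)       /* one UTF-16 code unit, at most 3 bytes */
                return 3;
        return 4;               /* surrogate pair: 2 UTF-16 units, 4 bytes */
}

int main(void)
{
        /* U+FFFF: 1 UTF-16 code unit -> 3 UTF-8 bytes */
        printf("U+FFFF:  %zu byte(s), bound %zu\n",
               utf8_len(0xffff), utf8_bound(1));
        /* U+10000: 2 UTF-16 code units -> 4 UTF-8 bytes */
        printf("U+10000: %zu byte(s), bound %zu\n",
               utf8_len(0x10000), utf8_bound(2));
        return 0;
}

Both cases stay within the bound (3 of 3 bytes for U+FFFF, 4 of 6 bytes for U+10000), illustrating why three UTF-8 bytes per UTF-16 code unit is sufficient for the buffer size calculation.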

13 Sep, 2017

1 commit